Apr 06 11:56:59 crc systemd[1]: Starting Kubernetes Kubelet...
Apr 06 11:56:59 crc restorecon[4687]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 06 11:56:59 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Apr 06 11:57:00 crc restorecon[4687]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc 
restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc 
restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 06 
11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 
11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Apr 06 11:57:00 crc 
restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc 
restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc 
restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Apr 06 11:57:00 crc restorecon[4687]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 
06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 
crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc 
restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc 
restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Apr 06 11:57:00 crc restorecon[4687]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Apr 06 11:57:00 crc restorecon[4687]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Apr 06 11:57:01 crc kubenswrapper[4790]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 06 11:57:01 crc kubenswrapper[4790]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 06 11:57:01 crc kubenswrapper[4790]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 06 11:57:01 crc kubenswrapper[4790]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Apr 06 11:57:01 crc kubenswrapper[4790]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Apr 06 11:57:01 crc kubenswrapper[4790]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.397241 4790 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399753 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399770 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399774 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399778 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399783 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399787 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399791 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399796 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399799 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Apr 06 11:57:01 
crc kubenswrapper[4790]: W0406 11:57:01.399808 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399812 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399815 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399836 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399841 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399846 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399850 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399854 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399858 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399861 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399865 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399869 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399873 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399877 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 
11:57:01.399880 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399884 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399888 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399891 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399895 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399898 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399902 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399905 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399908 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399912 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399915 4790 feature_gate.go:330] unrecognized feature gate: Example Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399919 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399922 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399933 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399938 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399943 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399949 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399953 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399958 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399963 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399968 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399972 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399976 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399979 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399983 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399986 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399990 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399993 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.399997 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400000 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400004 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400007 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400010 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400014 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400017 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400021 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400024 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400028 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400031 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400035 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400038 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400042 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400045 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400048 4790 feature_gate.go:330] unrecognized feature gate: NewOLM
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400052 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400055 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400060 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400064 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400138 4790 flags.go:64] FLAG: --address="0.0.0.0"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400146 4790 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400153 4790 flags.go:64] FLAG: --anonymous-auth="true"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400158 4790 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400164 4790 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400168 4790 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400173 4790 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400178 4790 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400183 4790 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400187 4790 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400191 4790 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400196 4790 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400200 4790 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400204 4790 flags.go:64] FLAG: --cgroup-root=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400208 4790 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400212 4790 flags.go:64] FLAG: --client-ca-file=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400217 4790 flags.go:64] FLAG: --cloud-config=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400221 4790 flags.go:64] FLAG: --cloud-provider=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400224 4790 flags.go:64] FLAG: --cluster-dns="[]"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400229 4790 flags.go:64] FLAG: --cluster-domain=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400233 4790 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400237 4790 flags.go:64] FLAG: --config-dir=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400242 4790 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400247 4790 flags.go:64] FLAG: --container-log-max-files="5"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400252 4790 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400256 4790 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400260 4790 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400264 4790 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400268 4790 flags.go:64] FLAG: --contention-profiling="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400273 4790 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400277 4790 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400281 4790 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400285 4790 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400290 4790 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400294 4790 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400298 4790 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400302 4790 flags.go:64] FLAG: --enable-load-reader="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400306 4790 flags.go:64] FLAG: --enable-server="true"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400310 4790 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400315 4790 flags.go:64] FLAG: --event-burst="100"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400319 4790 flags.go:64] FLAG: --event-qps="50"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400323 4790 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400327 4790 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400332 4790 flags.go:64] FLAG: --eviction-hard=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400336 4790 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400341 4790 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400345 4790 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400349 4790 flags.go:64] FLAG: --eviction-soft=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400353 4790 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400357 4790 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400361 4790 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400365 4790 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400369 4790 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400373 4790 flags.go:64] FLAG: --fail-swap-on="true"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400381 4790 flags.go:64] FLAG: --feature-gates=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400387 4790 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400393 4790 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400397 4790 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400401 4790 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400406 4790 flags.go:64] FLAG: --healthz-port="10248"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400410 4790 flags.go:64] FLAG: --help="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400414 4790 flags.go:64] FLAG: --hostname-override=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400418 4790 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400422 4790 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400426 4790 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400430 4790 flags.go:64] FLAG: --image-credential-provider-config=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400434 4790 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400438 4790 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400442 4790 flags.go:64] FLAG: --image-service-endpoint=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400446 4790 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400450 4790 flags.go:64] FLAG: --kube-api-burst="100"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400454 4790 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400458 4790 flags.go:64] FLAG: --kube-api-qps="50"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400462 4790 flags.go:64] FLAG: --kube-reserved=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400466 4790 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400470 4790 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400474 4790 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400478 4790 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400482 4790 flags.go:64] FLAG: --lock-file=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400486 4790 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400490 4790 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400494 4790 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400500 4790 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400504 4790 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400508 4790 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400512 4790 flags.go:64] FLAG: --logging-format="text"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400517 4790 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400521 4790 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400525 4790 flags.go:64] FLAG: --manifest-url=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400529 4790 flags.go:64] FLAG: --manifest-url-header=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400535 4790 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400539 4790 flags.go:64] FLAG: --max-open-files="1000000"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400544 4790 flags.go:64] FLAG: --max-pods="110"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400548 4790 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400553 4790 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400557 4790 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400560 4790 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400565 4790 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400569 4790 flags.go:64] FLAG: --node-ip="192.168.126.11"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400573 4790 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400582 4790 flags.go:64] FLAG: --node-status-max-images="50"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400586 4790 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400590 4790 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400594 4790 flags.go:64] FLAG: --pod-cidr=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400598 4790 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400604 4790 flags.go:64] FLAG: --pod-manifest-path=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400608 4790 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400612 4790 flags.go:64] FLAG: --pods-per-core="0"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400616 4790 flags.go:64] FLAG: --port="10250"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400620 4790 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400624 4790 flags.go:64] FLAG: --provider-id=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400628 4790 flags.go:64] FLAG: --qos-reserved=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400632 4790 flags.go:64] FLAG: --read-only-port="10255"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400636 4790 flags.go:64] FLAG: --register-node="true"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400640 4790 flags.go:64] FLAG: --register-schedulable="true"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400645 4790 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400651 4790 flags.go:64] FLAG: --registry-burst="10"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400655 4790 flags.go:64] FLAG: --registry-qps="5"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400659 4790 flags.go:64] FLAG: --reserved-cpus=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400664 4790 flags.go:64] FLAG: --reserved-memory=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400675 4790 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400679 4790 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400684 4790 flags.go:64] FLAG: --rotate-certificates="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400688 4790 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400691 4790 flags.go:64] FLAG: --runonce="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400695 4790 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400699 4790 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400704 4790 flags.go:64] FLAG: --seccomp-default="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400708 4790 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400712 4790 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400716 4790 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400720 4790 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400724 4790 flags.go:64] FLAG: --storage-driver-password="root"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400729 4790 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400733 4790 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400737 4790 flags.go:64] FLAG: --storage-driver-user="root"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400741 4790 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400745 4790 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400749 4790 flags.go:64] FLAG: --system-cgroups=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400753 4790 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400760 4790 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400764 4790 flags.go:64] FLAG: --tls-cert-file=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400767 4790 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400772 4790 flags.go:64] FLAG: --tls-min-version=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400776 4790 flags.go:64] FLAG: --tls-private-key-file=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400780 4790 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400784 4790 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400788 4790 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400792 4790 flags.go:64] FLAG: --v="2"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400799 4790 flags.go:64] FLAG: --version="false"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400805 4790 flags.go:64] FLAG: --vmodule=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400811 4790 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.400816 4790 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400919 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400925 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400930 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400935 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400939 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400943 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400950 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400954 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400958 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400961 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400965 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400969 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400972 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400976 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400979 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400983 4790 feature_gate.go:330] unrecognized feature gate: NewOLM
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400987 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400990 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400993 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.400997 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401000 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401003 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401008 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401012 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401016 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401019 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401023 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401026 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401030 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401033 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401036 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401040 4790 feature_gate.go:330] unrecognized feature gate: Example
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401044 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401049 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401053 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401057 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401061 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401066 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401071 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401074 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401078 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401082 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401085 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401089 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401092 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401096 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401099 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401103 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401107 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401110 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401113 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401117 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401120 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401124 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401127 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401131 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401134 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401139 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401143 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401147 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401150 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401154 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401158 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401162 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401166 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401171 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401175 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401178 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401181 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401185 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.401190 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.401791 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.411423 4790 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.411450 4790 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411524 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411531 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411535 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411539 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411544 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411548 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411552 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411556 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411560 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411563 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411568 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411571 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411575 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411579 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411583 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411586 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411590 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411593 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411597 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411600 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411604 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411607 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411611 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411614 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406
11:57:01.411618 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411623 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411626 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411630 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411634 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411637 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411641 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411644 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411648 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411651 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411655 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411658 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411662 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411665 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411668 4790 
feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411672 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411675 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411679 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411682 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411686 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411689 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411693 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411696 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411699 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411703 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411708 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411712 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411716 4790 feature_gate.go:330] unrecognized feature gate: NewOLM Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411720 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411725 4790 feature_gate.go:330] unrecognized feature gate: Example Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411729 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411733 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411737 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411741 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411746 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411750 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411756 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411760 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411764 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411769 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411773 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411777 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411782 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411786 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411790 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411794 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411797 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.411803 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411915 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411921 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411925 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 06 11:57:01 crc 
kubenswrapper[4790]: W0406 11:57:01.411928 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411932 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411935 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411940 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411945 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411949 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411954 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411957 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411961 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411965 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411968 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411972 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411976 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411979 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Apr 06 11:57:01 crc 
kubenswrapper[4790]: W0406 11:57:01.411984 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411988 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411992 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.411997 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412001 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412005 4790 feature_gate.go:330] unrecognized feature gate: NewOLM Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412009 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412012 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412016 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412020 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412023 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412027 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412030 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412034 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Apr 06 11:57:01 crc 
kubenswrapper[4790]: W0406 11:57:01.412037 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412041 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412044 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412047 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412051 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412054 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412058 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412061 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412065 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412068 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412072 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412075 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412079 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412082 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412086 4790 feature_gate.go:330] unrecognized feature 
gate: MetricsCollectionProfiles Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412090 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412093 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412098 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412102 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412106 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412110 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412114 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412117 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412121 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412124 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412128 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412131 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412135 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412138 4790 feature_gate.go:330] unrecognized feature 
gate: OVNObservability Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412142 4790 feature_gate.go:330] unrecognized feature gate: Example Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412146 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412150 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412153 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412156 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412160 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412163 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412167 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412170 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412174 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.412177 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.412183 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false 
UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.413135 4790 server.go:940] "Client rotation is on, will bootstrap in background" Apr 06 11:57:01 crc kubenswrapper[4790]: E0406 11:57:01.416014 4790 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.420233 4790 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.420328 4790 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.421886 4790 server.go:997] "Starting client certificate rotation" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.421918 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.422173 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.452337 4790 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.454110 4790 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 06 11:57:01 crc kubenswrapper[4790]: E0406 11:57:01.454721 4790 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.473799 4790 log.go:25] "Validated CRI v1 runtime API" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.528605 4790 log.go:25] "Validated CRI v1 image API" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.530507 4790 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.543680 4790 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-04-06-11-52-10-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.543716 4790 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.572118 4790 manager.go:217] Machine: {Timestamp:2026-04-06 11:57:01.555025334 +0000 UTC m=+0.542768230 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:81d93d39-022f-479c-bd72-2a6f59eabad5 
BootID:981c1c09-3ef4-430f-bdd7-a6309afbd803 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ad:07:3c Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ad:07:3c Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:3d:79:85 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:21:91:82 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:94:fc:a5 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c2:99:45 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3a:25:30:c8:a9:0e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:56:77:9a:25:a6:e2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified 
Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] 
Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.572529 4790 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.572918 4790 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.591293 4790 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.591521 4790 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.591565 4790 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.591792 4790 topology_manager.go:138] "Creating topology manager with none policy"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.591805 4790 container_manager_linux.go:303] "Creating device plugin manager"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.592406 4790 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.592445 4790 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.593726 4790 state_mem.go:36] "Initialized new in-memory state store"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.593883 4790 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.604560 4790 kubelet.go:418] "Attempting to sync node with API server"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.604622 4790 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.604646 4790 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.604662 4790 kubelet.go:324] "Adding apiserver pod source"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.604678 4790 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.609272 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Apr 06 11:57:01 crc kubenswrapper[4790]: E0406 11:57:01.609342 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.609363 4790 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.609336 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Apr 06 11:57:01 crc kubenswrapper[4790]: E0406 11:57:01.609499 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.610373 4790 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.612569 4790 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.613729 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.613752 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.613759 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.613766 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.613778 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.613785 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.613793 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.613804 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.613812 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.613820 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.613860 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.613867 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.614937 4790 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.615369 4790 server.go:1280] "Started kubelet" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.615414 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 11:57:01 crc systemd[1]: Started Kubernetes Kubelet. Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.617212 4790 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.617346 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.618131 4790 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.617357 4790 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.618528 4790 volume_manager.go:287] "The desired_state_of_world populator starts" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.618548 4790 volume_manager.go:289] "Starting Kubelet Volume Manager" Apr 06 11:57:01 crc kubenswrapper[4790]: E0406 11:57:01.618520 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.618648 4790 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.619409 4790 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.620429 4790 factory.go:153] Registering CRI-O factory Apr 06 11:57:01 crc kubenswrapper[4790]: 
I0406 11:57:01.620458 4790 factory.go:221] Registration of the crio container factory successfully Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.620561 4790 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.620573 4790 factory.go:55] Registering systemd factory Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.620582 4790 factory.go:221] Registration of the systemd container factory successfully Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.620602 4790 factory.go:103] Registering Raw factory Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.620615 4790 manager.go:1196] Started watching for new ooms in manager Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.620768 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 11:57:01 crc kubenswrapper[4790]: E0406 11:57:01.620873 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Apr 06 11:57:01 crc kubenswrapper[4790]: E0406 11:57:01.620556 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="200ms" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 
11:57:01.621552 4790 manager.go:319] Starting recovery of all containers Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.622893 4790 server.go:460] "Adding debug handlers to kubelet server" Apr 06 11:57:01 crc kubenswrapper[4790]: E0406 11:57:01.635076 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18a3c2949a64a105 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.615341829 +0000 UTC m=+0.603084695,LastTimestamp:2026-04-06 11:57:01.615341829 +0000 UTC m=+0.603084695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637257 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637299 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637310 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" 
seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637320 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637330 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637342 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637352 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637362 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637373 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637383 4790 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637395 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637404 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637414 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637424 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637433 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637443 4790 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637452 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637462 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637471 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637480 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637490 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637498 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637507 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637516 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637525 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637534 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637603 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637616 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637626 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637636 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637645 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637655 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637665 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637675 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" 
seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637684 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637693 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637702 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637767 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637783 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637797 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Apr 06 
11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637812 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637843 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637860 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637872 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637905 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637918 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637930 4790 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637941 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637950 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637960 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637971 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637981 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.637995 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638022 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638033 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638043 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638053 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638082 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638097 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638111 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638124 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638137 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638149 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638185 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638197 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638207 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638218 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638229 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638240 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638251 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638263 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638274 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638346 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638359 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638373 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638385 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638396 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638406 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638417 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638428 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638438 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638448 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638458 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" 
seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638467 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638477 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638487 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638500 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638510 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638523 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 
11:57:01.638532 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638545 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638556 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638566 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638575 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638590 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638599 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638610 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638619 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638629 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638640 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638649 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638659 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638668 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638684 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638698 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638709 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638721 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638732 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638743 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638753 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638764 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638774 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638785 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638796 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638845 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638855 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638865 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638875 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638885 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638895 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" 
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638906 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638919 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638929 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638940 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638949 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638959 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638968 4790 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638978 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.638991 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639000 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639009 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639018 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639027 4790 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639036 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639044 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639053 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639062 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639072 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639085 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639096 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639105 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639115 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639125 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639136 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639148 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639158 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639169 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639181 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639193 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639204 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639215 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639228 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639239 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639249 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639260 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639270 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639280 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639289 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639299 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639308 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639318 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639330 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639342 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" 
seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639351 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639363 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639373 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639384 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639395 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639407 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639419 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639433 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639445 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639458 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639470 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639482 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639494 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639505 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639516 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639528 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639539 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639551 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639563 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639577 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639590 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639602 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639615 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639629 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.639639 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.642865 4790 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.642981 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643058 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643120 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643179 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643239 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643295 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643352 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643413 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643468 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643530 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643636 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643695 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643757 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643820 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643907 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.643966 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.644025 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.644094 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.644156 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.644212 4790 reconstruct.go:97] "Volume reconstruction finished"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.644268 4790 reconciler.go:26] "Reconciler: start to sync state"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.648139 4790 manager.go:324] Recovery completed
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.659759 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.664105 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.664164 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.664178 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.666794 4790 cpu_manager.go:225] "Starting CPU manager" policy="none"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.666846 4790 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.666878 4790 state_mem.go:36] "Initialized new in-memory state store"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.671307 4790 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.673761 4790 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.673877 4790 status_manager.go:217] "Starting to sync pod status with apiserver"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.674183 4790 kubelet.go:2335] "Starting kubelet main sync loop"
Apr 06 11:57:01 crc kubenswrapper[4790]: E0406 11:57:01.674234 4790 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Apr 06 11:57:01 crc kubenswrapper[4790]: W0406 11:57:01.676283 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Apr 06 11:57:01 crc kubenswrapper[4790]: E0406 11:57:01.676413 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.685714 4790 policy_none.go:49] "None policy: Start"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.686772 4790 memory_manager.go:170] "Starting memorymanager" policy="None"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.686806 4790 state_mem.go:35] "Initializing new in-memory state store"
Apr 06 11:57:01 crc kubenswrapper[4790]: E0406 11:57:01.718775 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.737030 4790 manager.go:334] "Starting Device Plugin manager"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.737111 4790 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.737126 4790 server.go:79] "Starting device plugin registration server"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.737585 4790 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.737607 4790 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.737839 4790 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.737907 4790 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.737913 4790 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 06 11:57:01 crc kubenswrapper[4790]: E0406 11:57:01.744562 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.775203 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.775299 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.776411 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.776448 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.776457 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.777309 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.777944 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.777991 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.778004 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.778072 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.778113 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.778183 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.778737 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.778771 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.779160 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.779183 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.779191 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.779330 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.779601 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.779634 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.779655 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.779729 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.779781 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.780112 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.780134 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.780142 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.780911 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.780939 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.780948 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.781076 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.781293 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.781323 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.781844 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.781869 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.781878 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.781855 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.781950 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.781967 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.782076 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.782096 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.782200 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.782275 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.782355 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.782559 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.782578 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.782622 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:01 crc kubenswrapper[4790]: E0406 11:57:01.821755 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="400ms"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.838527 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.839643 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.839677 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.839685 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.839710 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: E0406 11:57:01.840176 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846391 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846429 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846459 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846484 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846507 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846528 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846549 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846571 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846699 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846755 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846793 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846848 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846877 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846897 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.846917 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.947789 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.947879 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.947905 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.947950 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.947973 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.947985 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.947997 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.947992 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948048 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948085 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948056 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948059 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948085 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948067 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948138 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948154 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948169 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948182 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948197 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948200 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948211 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948185 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948264 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948289 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948309 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948375 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948470 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948498 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948506 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Apr 06 11:57:01 crc kubenswrapper[4790]: I0406 11:57:01.948597 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.040568 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.041713 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.041756 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:02 crc 
kubenswrapper[4790]: I0406 11:57:02.041767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.041809 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 06 11:57:02 crc kubenswrapper[4790]: E0406 11:57:02.042330 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.104126 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.117076 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.124748 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.141718 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.146881 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 11:57:02 crc kubenswrapper[4790]: W0406 11:57:02.147338 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-064b27a8621e0316968d08d560997f168f5ab6d26c08ff645d6eee819b516a28 WatchSource:0}: Error finding container 064b27a8621e0316968d08d560997f168f5ab6d26c08ff645d6eee819b516a28: Status 404 returned error can't find the container with id 064b27a8621e0316968d08d560997f168f5ab6d26c08ff645d6eee819b516a28 Apr 06 11:57:02 crc kubenswrapper[4790]: W0406 11:57:02.168084 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-6b9dc175454fdf58a70c9257565c92f5730b5a0b3055bcfed3c5bf121d69d8ae WatchSource:0}: Error finding container 6b9dc175454fdf58a70c9257565c92f5730b5a0b3055bcfed3c5bf121d69d8ae: Status 404 returned error can't find the container with id 6b9dc175454fdf58a70c9257565c92f5730b5a0b3055bcfed3c5bf121d69d8ae Apr 06 11:57:02 crc kubenswrapper[4790]: W0406 11:57:02.169749 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-0625bba6ea54e575b2c6e00451476436b8108b21930185b2bbc4eff000d1a28e WatchSource:0}: Error finding container 0625bba6ea54e575b2c6e00451476436b8108b21930185b2bbc4eff000d1a28e: Status 404 returned error can't find the container with id 0625bba6ea54e575b2c6e00451476436b8108b21930185b2bbc4eff000d1a28e Apr 06 11:57:02 crc kubenswrapper[4790]: E0406 11:57:02.223191 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection 
refused" interval="800ms" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.442998 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.444599 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.444632 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.444642 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.444666 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 06 11:57:02 crc kubenswrapper[4790]: E0406 11:57:02.445061 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc" Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.616693 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.677653 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6b9dc175454fdf58a70c9257565c92f5730b5a0b3055bcfed3c5bf121d69d8ae"} Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.678415 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e45e1a621585384898e4fbcfa48e69eda21220866b9cc83911058b0b607ace1d"} Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.678994 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5d032bcae4571533731ee8a4b186c666d78c8f785cda9618c383222e75fd64de"} Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.679653 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"064b27a8621e0316968d08d560997f168f5ab6d26c08ff645d6eee819b516a28"} Apr 06 11:57:02 crc kubenswrapper[4790]: I0406 11:57:02.681087 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0625bba6ea54e575b2c6e00451476436b8108b21930185b2bbc4eff000d1a28e"} Apr 06 11:57:02 crc kubenswrapper[4790]: W0406 11:57:02.700301 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 11:57:02 crc kubenswrapper[4790]: E0406 11:57:02.700434 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Apr 06 11:57:02 crc kubenswrapper[4790]: W0406 11:57:02.867053 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 11:57:02 crc kubenswrapper[4790]: E0406 11:57:02.867152 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Apr 06 11:57:03 crc kubenswrapper[4790]: E0406 11:57:03.024132 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="1.6s" Apr 06 11:57:03 crc kubenswrapper[4790]: W0406 11:57:03.045054 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 11:57:03 crc kubenswrapper[4790]: E0406 11:57:03.045158 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Apr 06 11:57:03 crc kubenswrapper[4790]: W0406 11:57:03.100697 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 11:57:03 crc 
kubenswrapper[4790]: E0406 11:57:03.100796 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.246003 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.249814 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.249904 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.249922 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.249962 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 06 11:57:03 crc kubenswrapper[4790]: E0406 11:57:03.250715 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.529690 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Apr 06 11:57:03 crc kubenswrapper[4790]: E0406 11:57:03.531119 4790 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.616661 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.685798 4790 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="512ecc4b2068d96f99fe484656f014fedae4852c537e0b0aa6ee670247e3877f" exitCode=0 Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.685896 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"512ecc4b2068d96f99fe484656f014fedae4852c537e0b0aa6ee670247e3877f"} Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.685975 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.687318 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.687358 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.687379 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.688892 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2582aeab7b9255af8891d3e76d41163036f00c578ec1d9a1223b1732d0941a3d"} Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.688915 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"666fed052055c6c6372d36ac938b4e27b97ec0b944a3eea7e7956fcd6d9f058a"} Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.688929 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5427bbbe8cc4fbd4c4b664767b3fbd65b58eb0b53e8f092a815090b9e38f31f2"} Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.688940 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"14a5d345b8c42edbe390c0b15dd0f1df9a3b8bb13468ce448c5685d9ca370f6f"} Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.688965 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.690052 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.690066 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.690074 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.690981 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="64c2df7050d41fffa7b054328c75ade50ab4344ad18f064b86bc7764768b82c5" exitCode=0 Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.691035 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"64c2df7050d41fffa7b054328c75ade50ab4344ad18f064b86bc7764768b82c5"} Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.691089 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.692145 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.692170 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.692180 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.693884 4790 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cb500af477dd38407374904b091e93bb3d5474c81cd1a146305650c732e1829e" exitCode=0 Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.693983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cb500af477dd38407374904b091e93bb3d5474c81cd1a146305650c732e1829e"} Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.694043 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.694049 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:03 crc kubenswrapper[4790]: 
I0406 11:57:03.694944 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.694974 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.694986 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.695290 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.695373 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.695389 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.696356 4790 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="8bc1c642463b1a6913bf9e86592b9996645d0e98e4daff09742352c3159c728e" exitCode=0 Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.696394 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"8bc1c642463b1a6913bf9e86592b9996645d0e98e4daff09742352c3159c728e"} Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.696447 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.697295 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.697337 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.697349 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:03 crc kubenswrapper[4790]: I0406 11:57:03.904452 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.617373 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 11:57:04 crc kubenswrapper[4790]: E0406 11:57:04.624951 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="3.2s" Apr 06 11:57:04 crc kubenswrapper[4790]: W0406 11:57:04.695811 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 11:57:04 crc kubenswrapper[4790]: E0406 11:57:04.695936 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.708571 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"43286ddd309fd46a3d6c104d273a7737af676a9fc12d58c5b2de11f9bad526be"} Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.708622 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"30056923772f422514c90950e54b7bcd14f25a556777a669012dbbb7dbcb5b6d"} Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.708633 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5ff3cd33d151386ab7f09a28ee0f37001f983d35e8389644ab342b12e21638f8"} Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.708707 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.710295 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.710331 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.710347 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.713614 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cdf7a3d1c5589a9e64ef1fd65cf6ed2acb8b0516211493e3b3f165baeefbb34b"} Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.713699 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c0daceb37a2d566e46d5b328928d7d33deb2ab21a100e5c8772652ac033af556"} Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.713719 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9ae6bd024abaa7c07bcbe388da96282ad0bde13566906b9a230225cd06623547"} Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.713735 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"24bc576f9fb36d6605babe769455de5838c74aab14a28cc391211e7ad6432e31"} Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.717494 4790 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="df16a3dfe7c2bb1357d34a814433a71a35c8b4f613b24fa03102f4670e5e9529" exitCode=0 Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.717576 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"df16a3dfe7c2bb1357d34a814433a71a35c8b4f613b24fa03102f4670e5e9529"} Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.717708 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.718696 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.718733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.718745 4790 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.719809 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1aed909b4f0a2d650660fd672a9684c97bfd210da8304226998a2331c236f267"} Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.719845 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.719848 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.720666 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.720699 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.720710 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.720742 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.720759 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.720767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.851706 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.853295 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.853351 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.853365 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:04 crc kubenswrapper[4790]: I0406 11:57:04.853400 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 06 11:57:04 crc kubenswrapper[4790]: E0406 11:57:04.854034 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc" Apr 06 11:57:05 crc kubenswrapper[4790]: E0406 11:57:05.067513 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18a3c2949a64a105 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.615341829 +0000 UTC m=+0.603084695,LastTimestamp:2026-04-06 11:57:01.615341829 +0000 UTC m=+0.603084695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:05 crc kubenswrapper[4790]: W0406 11:57:05.122648 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 
38.102.83.146:6443: connect: connection refused Apr 06 11:57:05 crc kubenswrapper[4790]: E0406 11:57:05.122782 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.298034 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.728310 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dc5e81b83801cde0aeca1938b8d69b0d1deaaa4493159eb9592fa0579424c894"} Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.728473 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.729939 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.729981 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.729994 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.732183 4790 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9430d8c8332252142607cc547bb4e8c4d7bb439e42c5e58ce8a0bee09da0c071" exitCode=0 Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.732411 4790 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.732662 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9430d8c8332252142607cc547bb4e8c4d7bb439e42c5e58ce8a0bee09da0c071"} Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.732745 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.732802 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.732879 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.734103 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.734136 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.734147 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.734824 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.734874 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.734887 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.735052 4790 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.735054 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.735144 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.735169 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.735097 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:05 crc kubenswrapper[4790]: I0406 11:57:05.735216 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:06 crc kubenswrapper[4790]: I0406 11:57:06.080919 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 11:57:06 crc kubenswrapper[4790]: I0406 11:57:06.742155 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 06 11:57:06 crc kubenswrapper[4790]: I0406 11:57:06.742207 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:06 crc kubenswrapper[4790]: I0406 11:57:06.742466 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"eae89bff69591b550eb993405739620388778466d0047029f0f0d31f84ab4c50"} Apr 06 11:57:06 crc kubenswrapper[4790]: I0406 11:57:06.742535 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7331ff563edd9732aa91c1532181b7ddf6f6ef9ef8ae03c14eb2e26b7cca5121"} 
Apr 06 11:57:06 crc kubenswrapper[4790]: I0406 11:57:06.742556 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"59ab55264ae4fa2d000a887fb69479863b696b999b31bb387dee6274deb5147b"} Apr 06 11:57:06 crc kubenswrapper[4790]: I0406 11:57:06.742784 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:06 crc kubenswrapper[4790]: I0406 11:57:06.743996 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:06 crc kubenswrapper[4790]: I0406 11:57:06.744059 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:06 crc kubenswrapper[4790]: I0406 11:57:06.744074 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:06 crc kubenswrapper[4790]: I0406 11:57:06.744974 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:06 crc kubenswrapper[4790]: I0406 11:57:06.745039 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:06 crc kubenswrapper[4790]: I0406 11:57:06.745062 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.171558 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.171734 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.172969 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.173012 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.173022 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.456763 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.736311 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.753544 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.753555 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b7b42ad26aace8746996fce1dc90f6bddd8026f4937f1cbe3f748110cee055aa"} Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.753673 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"76c267dff54aa243bfae8d94520e433f1b8e7f5a92417912519caa78b09bd87e"} Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.753602 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.753712 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.755183 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:07 crc kubenswrapper[4790]: 
I0406 11:57:07.755259 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.755283 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.755438 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.755471 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:07 crc kubenswrapper[4790]: I0406 11:57:07.755482 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:08 crc kubenswrapper[4790]: I0406 11:57:08.054588 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:08 crc kubenswrapper[4790]: I0406 11:57:08.056348 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:08 crc kubenswrapper[4790]: I0406 11:57:08.056395 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:08 crc kubenswrapper[4790]: I0406 11:57:08.056407 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:08 crc kubenswrapper[4790]: I0406 11:57:08.056438 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 06 11:57:08 crc kubenswrapper[4790]: I0406 11:57:08.400082 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 11:57:08 crc kubenswrapper[4790]: I0406 11:57:08.756183 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:08 
crc kubenswrapper[4790]: I0406 11:57:08.756225 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:08 crc kubenswrapper[4790]: I0406 11:57:08.757259 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:08 crc kubenswrapper[4790]: I0406 11:57:08.757311 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:08 crc kubenswrapper[4790]: I0406 11:57:08.757324 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:08 crc kubenswrapper[4790]: I0406 11:57:08.757319 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:08 crc kubenswrapper[4790]: I0406 11:57:08.757446 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:08 crc kubenswrapper[4790]: I0406 11:57:08.757458 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:09 crc kubenswrapper[4790]: I0406 11:57:09.759215 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:09 crc kubenswrapper[4790]: I0406 11:57:09.760106 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:09 crc kubenswrapper[4790]: I0406 11:57:09.760148 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:09 crc kubenswrapper[4790]: I0406 11:57:09.760162 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:10 crc kubenswrapper[4790]: I0406 11:57:10.008065 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-etcd/etcd-crc" Apr 06 11:57:10 crc kubenswrapper[4790]: I0406 11:57:10.008234 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:10 crc kubenswrapper[4790]: I0406 11:57:10.009528 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:10 crc kubenswrapper[4790]: I0406 11:57:10.009808 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:10 crc kubenswrapper[4790]: I0406 11:57:10.010031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:11 crc kubenswrapper[4790]: E0406 11:57:11.744789 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 06 11:57:12 crc kubenswrapper[4790]: I0406 11:57:12.402747 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 11:57:12 crc kubenswrapper[4790]: I0406 11:57:12.403153 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:12 crc kubenswrapper[4790]: I0406 11:57:12.404934 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:12 crc kubenswrapper[4790]: I0406 11:57:12.405023 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:12 crc kubenswrapper[4790]: I0406 11:57:12.405049 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:12 crc kubenswrapper[4790]: I0406 11:57:12.411422 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Apr 06 11:57:12 crc kubenswrapper[4790]: I0406 11:57:12.767114 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:12 crc kubenswrapper[4790]: I0406 11:57:12.768905 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:12 crc kubenswrapper[4790]: I0406 11:57:12.768995 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:12 crc kubenswrapper[4790]: I0406 11:57:12.769024 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:12 crc kubenswrapper[4790]: I0406 11:57:12.773868 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 11:57:13 crc kubenswrapper[4790]: I0406 11:57:13.132198 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 11:57:13 crc kubenswrapper[4790]: I0406 11:57:13.768798 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:13 crc kubenswrapper[4790]: I0406 11:57:13.770353 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:13 crc kubenswrapper[4790]: I0406 11:57:13.770394 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:13 crc kubenswrapper[4790]: I0406 11:57:13.770405 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:14 crc kubenswrapper[4790]: I0406 11:57:14.771469 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:14 crc kubenswrapper[4790]: I0406 
11:57:14.773172 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:14 crc kubenswrapper[4790]: I0406 11:57:14.773227 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:14 crc kubenswrapper[4790]: I0406 11:57:14.773245 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:15 crc kubenswrapper[4790]: W0406 11:57:15.237732 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Apr 06 11:57:15 crc kubenswrapper[4790]: I0406 11:57:15.237858 4790 trace.go:236] Trace[550130441]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Apr-2026 11:57:05.235) (total time: 10002ms): Apr 06 11:57:15 crc kubenswrapper[4790]: Trace[550130441]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:57:15.237) Apr 06 11:57:15 crc kubenswrapper[4790]: Trace[550130441]: [10.002003223s] [10.002003223s] END Apr 06 11:57:15 crc kubenswrapper[4790]: E0406 11:57:15.237890 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Apr 06 11:57:15 crc kubenswrapper[4790]: W0406 11:57:15.533927 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Apr 06 
11:57:15 crc kubenswrapper[4790]: I0406 11:57:15.534011 4790 trace.go:236] Trace[1173593800]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Apr-2026 11:57:05.531) (total time: 10002ms): Apr 06 11:57:15 crc kubenswrapper[4790]: Trace[1173593800]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:57:15.533) Apr 06 11:57:15 crc kubenswrapper[4790]: Trace[1173593800]: [10.002073777s] [10.002073777s] END Apr 06 11:57:15 crc kubenswrapper[4790]: E0406 11:57:15.534030 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Apr 06 11:57:15 crc kubenswrapper[4790]: I0406 11:57:15.618413 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Apr 06 11:57:15 crc kubenswrapper[4790]: I0406 11:57:15.924308 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Apr 06 11:57:15 crc kubenswrapper[4790]: I0406 11:57:15.924423 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Apr 06 
11:57:15 crc kubenswrapper[4790]: I0406 11:57:15.930227 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Apr 06 11:57:15 crc kubenswrapper[4790]: I0406 11:57:15.930325 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Apr 06 11:57:15 crc kubenswrapper[4790]: E0406 11:57:15.933061 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:15Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18a3c2949a64a105 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.615341829 +0000 UTC m=+0.603084695,LastTimestamp:2026-04-06 11:57:01.615341829 +0000 UTC m=+0.603084695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:15 crc kubenswrapper[4790]: E0406 11:57:15.933456 4790 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing 
request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Apr 06 11:57:15 crc kubenswrapper[4790]: W0406 11:57:15.935321 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:15Z is after 2026-02-23T05:33:13Z
Apr 06 11:57:15 crc kubenswrapper[4790]: E0406 11:57:15.935424 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Apr 06 11:57:15 crc kubenswrapper[4790]: E0406 11:57:15.936181 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:15Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Apr 06 11:57:15 crc kubenswrapper[4790]: E0406 11:57:15.938798 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:15Z is after 2026-02-23T05:33:13Z" node="crc"
Apr 06 11:57:15 crc kubenswrapper[4790]: W0406 11:57:15.939367 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:15Z is after 2026-02-23T05:33:13Z
Apr 06 11:57:15 crc kubenswrapper[4790]: E0406 11:57:15.939497 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Apr 06 11:57:16 crc kubenswrapper[4790]: I0406 11:57:16.086512 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]log ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]etcd ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-api-request-count-filter ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startkubeinformers ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-config-consumer ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-filter ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/start-apiextensions-informers ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld
Apr 06 11:57:16 crc kubenswrapper[4790]: [-]poststarthook/crd-informer-synced failed: reason withheld
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/start-system-namespaces-controller ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/start-cluster-authentication-info-controller ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/start-legacy-token-tracking-controller ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/start-service-ip-repair-controllers ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Apr 06 11:57:16 crc kubenswrapper[4790]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-config-producer ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/bootstrap-controller ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/start-kube-aggregator-informers ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/apiservice-status-local-available-controller ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/apiservice-status-remote-available-controller ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/apiservice-registration-controller ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/apiservice-wait-for-first-sync ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/apiservice-discovery-controller ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/kube-apiserver-autoregistration ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]autoregister-completion ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/apiservice-openapi-controller ok
Apr 06 11:57:16 crc kubenswrapper[4790]: [+]poststarthook/apiservice-openapiv3-controller ok
Apr 06 11:57:16 crc kubenswrapper[4790]: livez check failed
Apr 06 11:57:16 crc kubenswrapper[4790]: I0406 11:57:16.086587 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 06 11:57:16 crc kubenswrapper[4790]: I0406 11:57:16.133157 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Apr 06 11:57:16 crc kubenswrapper[4790]: I0406 11:57:16.133303 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Apr 06 11:57:16 crc kubenswrapper[4790]: I0406 11:57:16.622016 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:16Z is after 2026-02-23T05:33:13Z
Apr 06 11:57:16 crc kubenswrapper[4790]: I0406 11:57:16.781897 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Apr 06 11:57:16 crc kubenswrapper[4790]: I0406 11:57:16.785281 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dc5e81b83801cde0aeca1938b8d69b0d1deaaa4493159eb9592fa0579424c894" exitCode=255
Apr 06 11:57:16 crc kubenswrapper[4790]: I0406 11:57:16.785383 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dc5e81b83801cde0aeca1938b8d69b0d1deaaa4493159eb9592fa0579424c894"}
Apr 06 11:57:16 crc kubenswrapper[4790]: I0406 11:57:16.785719 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:16 crc kubenswrapper[4790]: I0406 11:57:16.787084 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:16 crc kubenswrapper[4790]: I0406 11:57:16.787143 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:16 crc kubenswrapper[4790]: I0406 11:57:16.787165 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:16 crc kubenswrapper[4790]: I0406 11:57:16.788115 4790 scope.go:117] "RemoveContainer" containerID="dc5e81b83801cde0aeca1938b8d69b0d1deaaa4493159eb9592fa0579424c894"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.184939 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.185213 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.186548 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.186601 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.186614 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.228042 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.621873 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:17Z is after 2026-02-23T05:33:13Z
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.790569 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.791822 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.794392 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3af1e86511ca47c6cacd98b86cade32c55499829cb272c0044ded199b5c23bd3" exitCode=255
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.794452 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3af1e86511ca47c6cacd98b86cade32c55499829cb272c0044ded199b5c23bd3"}
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.794597 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.794599 4790 scope.go:117] "RemoveContainer" containerID="dc5e81b83801cde0aeca1938b8d69b0d1deaaa4493159eb9592fa0579424c894"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.794777 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.796022 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.796080 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.796096 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.796314 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.796379 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.796404 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.797469 4790 scope.go:117] "RemoveContainer" containerID="3af1e86511ca47c6cacd98b86cade32c55499829cb272c0044ded199b5c23bd3"
Apr 06 11:57:17 crc kubenswrapper[4790]: E0406 11:57:17.797875 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Apr 06 11:57:17 crc kubenswrapper[4790]: I0406 11:57:17.816080 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Apr 06 11:57:18 crc kubenswrapper[4790]: I0406 11:57:18.114290 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 06 11:57:18 crc kubenswrapper[4790]: I0406 11:57:18.400992 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 06 11:57:18 crc kubenswrapper[4790]: I0406 11:57:18.620677 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:18Z is after 2026-02-23T05:33:13Z
Apr 06 11:57:18 crc kubenswrapper[4790]: I0406 11:57:18.801210 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Apr 06 11:57:18 crc kubenswrapper[4790]: I0406 11:57:18.806191 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:18 crc kubenswrapper[4790]: I0406 11:57:18.806191 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:18 crc kubenswrapper[4790]: I0406 11:57:18.807729 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:18 crc kubenswrapper[4790]: I0406 11:57:18.807779 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:18 crc kubenswrapper[4790]: I0406 11:57:18.807795 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:18 crc kubenswrapper[4790]: I0406 11:57:18.807849 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:18 crc kubenswrapper[4790]: I0406 11:57:18.807868 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:18 crc kubenswrapper[4790]: I0406 11:57:18.807880 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:18 crc kubenswrapper[4790]: I0406 11:57:18.808455 4790 scope.go:117] "RemoveContainer" containerID="3af1e86511ca47c6cacd98b86cade32c55499829cb272c0044ded199b5c23bd3"
Apr 06 11:57:18 crc kubenswrapper[4790]: E0406 11:57:18.808638 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Apr 06 11:57:19 crc kubenswrapper[4790]: W0406 11:57:19.328168 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:19Z is after 2026-02-23T05:33:13Z
Apr 06 11:57:19 crc kubenswrapper[4790]: E0406 11:57:19.328283 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Apr 06 11:57:19 crc kubenswrapper[4790]: I0406 11:57:19.627799 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:19Z is after 2026-02-23T05:33:13Z
Apr 06 11:57:19 crc kubenswrapper[4790]: I0406 11:57:19.808600 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:19 crc kubenswrapper[4790]: I0406 11:57:19.809640 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:19 crc kubenswrapper[4790]: I0406 11:57:19.809726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:19 crc kubenswrapper[4790]: I0406 11:57:19.809746 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:19 crc kubenswrapper[4790]: I0406 11:57:19.810908 4790 scope.go:117] "RemoveContainer" containerID="3af1e86511ca47c6cacd98b86cade32c55499829cb272c0044ded199b5c23bd3"
Apr 06 11:57:19 crc kubenswrapper[4790]: E0406 11:57:19.811237 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Apr 06 11:57:20 crc kubenswrapper[4790]: W0406 11:57:20.544190 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:20Z is after 2026-02-23T05:33:13Z
Apr 06 11:57:20 crc kubenswrapper[4790]: E0406 11:57:20.544317 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Apr 06 11:57:20 crc kubenswrapper[4790]: I0406 11:57:20.621000 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:20Z is after 2026-02-23T05:33:13Z
Apr 06 11:57:21 crc kubenswrapper[4790]: I0406 11:57:21.087681 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 06 11:57:21 crc kubenswrapper[4790]: I0406 11:57:21.087904 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:21 crc kubenswrapper[4790]: I0406 11:57:21.089272 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:21 crc kubenswrapper[4790]: I0406 11:57:21.089322 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:21 crc kubenswrapper[4790]: I0406 11:57:21.089333 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:21 crc kubenswrapper[4790]: I0406 11:57:21.090019 4790 scope.go:117] "RemoveContainer" containerID="3af1e86511ca47c6cacd98b86cade32c55499829cb272c0044ded199b5c23bd3"
Apr 06 11:57:21 crc kubenswrapper[4790]: E0406 11:57:21.090201 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Apr 06 11:57:21 crc kubenswrapper[4790]: I0406 11:57:21.092738 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 06 11:57:21 crc kubenswrapper[4790]: I0406 11:57:21.621139 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:57:21Z is after 2026-02-23T05:33:13Z
Apr 06 11:57:21 crc kubenswrapper[4790]: E0406 11:57:21.745008 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Apr 06 11:57:21 crc kubenswrapper[4790]: I0406 11:57:21.815281 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:21 crc kubenswrapper[4790]: I0406 11:57:21.816773 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:21 crc kubenswrapper[4790]: I0406 11:57:21.816890 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:21 crc kubenswrapper[4790]: I0406 11:57:21.816910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:21 crc kubenswrapper[4790]: I0406 11:57:21.818286 4790 scope.go:117] "RemoveContainer" containerID="3af1e86511ca47c6cacd98b86cade32c55499829cb272c0044ded199b5c23bd3"
Apr 06 11:57:21 crc kubenswrapper[4790]: E0406 11:57:21.818889 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Apr 06 11:57:22 crc kubenswrapper[4790]: I0406 11:57:22.339699 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:57:22 crc kubenswrapper[4790]: I0406 11:57:22.341648 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:57:22 crc kubenswrapper[4790]: I0406 11:57:22.341722 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:57:22 crc kubenswrapper[4790]: I0406 11:57:22.341750 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:57:22 crc kubenswrapper[4790]: I0406 11:57:22.341799 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Apr 06 11:57:22 crc kubenswrapper[4790]: E0406 11:57:22.344728 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Apr 06 11:57:22 crc kubenswrapper[4790]: E0406 11:57:22.344975 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Apr 06 11:57:22 crc kubenswrapper[4790]: I0406 11:57:22.623383 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 06 11:57:23 crc kubenswrapper[4790]: I0406 11:57:23.623004 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 06 11:57:24 crc kubenswrapper[4790]: I0406 11:57:24.598625 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Apr 06 11:57:24 crc kubenswrapper[4790]: I0406 11:57:24.623090 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 06 11:57:24 crc kubenswrapper[4790]: I0406 11:57:24.625156 4790 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Apr 06 11:57:24 crc kubenswrapper[4790]: W0406 11:57:24.868206 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Apr 06 11:57:24 crc kubenswrapper[4790]: E0406 11:57:24.868302 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Apr 06 11:57:25 crc kubenswrapper[4790]: I0406 11:57:25.623631 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 06 11:57:25 crc kubenswrapper[4790]: W0406 11:57:25.701663 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Apr 06 11:57:25 crc kubenswrapper[4790]: E0406 11:57:25.701762 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Apr 06 11:57:25 crc kubenswrapper[4790]: E0406 11:57:25.942449 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949a64a105 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.615341829 +0000 UTC m=+0.603084695,LastTimestamp:2026-04-06 11:57:01.615341829 +0000 UTC m=+0.603084695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 06 11:57:25 crc kubenswrapper[4790]: E0406 11:57:25.950483 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4d5928 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664147752 +0000 UTC m=+0.651890618,LastTimestamp:2026-04-06 11:57:01.664147752 +0000 UTC m=+0.651890618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 06 11:57:25 crc kubenswrapper[4790]: E0406 11:57:25.957185 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4dc0d8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664174296 +0000 UTC m=+0.651917162,LastTimestamp:2026-04-06 11:57:01.664174296 +0000 UTC m=+0.651917162,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 06 11:57:25 crc kubenswrapper[4790]: E0406 11:57:25.963210 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4de70a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664184074 +0000 UTC m=+0.651926940,LastTimestamp:2026-04-06 11:57:01.664184074 +0000 UTC m=+0.651926940,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 06 11:57:25 crc kubenswrapper[4790]: E0406 11:57:25.970265 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c294a1cc1968 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.739563368 +0000 UTC m=+0.727306234,LastTimestamp:2026-04-06 11:57:01.739563368 +0000 UTC m=+0.727306234,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 06 11:57:25 crc kubenswrapper[4790]: E0406 11:57:25.977039 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4d5928\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4d5928 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664147752 +0000 UTC m=+0.651890618,LastTimestamp:2026-04-06 11:57:01.776437678 +0000 UTC m=+0.764180544,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 06 11:57:25 crc kubenswrapper[4790]: E0406 11:57:25.982788 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4dc0d8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4dc0d8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664174296 +0000 UTC m=+0.651917162,LastTimestamp:2026-04-06 11:57:01.776454944 +0000 UTC m=+0.764197810,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 06 11:57:25 crc kubenswrapper[4790]: E0406 11:57:25.991204 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4de70a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4de70a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664184074 +0000 UTC m=+0.651926940,LastTimestamp:2026-04-06 11:57:01.776463303 +0000 UTC m=+0.764206169,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 06 11:57:25 crc kubenswrapper[4790]: E0406 11:57:25.999480 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4d5928\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4d5928 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664147752 +0000 UTC m=+0.651890618,LastTimestamp:2026-04-06 11:57:01.777964005 +0000 UTC m=+0.765706871,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.004157 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4dc0d8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4dc0d8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664174296 +0000 UTC m=+0.651917162,LastTimestamp:2026-04-06 11:57:01.777999147 +0000 UTC m=+0.765742013,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.009282 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4de70a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4de70a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664184074 +0000 UTC m=+0.651926940,LastTimestamp:2026-04-06 11:57:01.778010595 +0000 UTC m=+0.765753461,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.015162 4790 event.go:359]
"Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4d5928\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4d5928 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664147752 +0000 UTC m=+0.651890618,LastTimestamp:2026-04-06 11:57:01.779174408 +0000 UTC m=+0.766917274,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.021111 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4dc0d8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4dc0d8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664174296 +0000 UTC m=+0.651917162,LastTimestamp:2026-04-06 11:57:01.779188505 +0000 UTC m=+0.766931371,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.026823 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4de70a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4de70a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664184074 +0000 UTC m=+0.651926940,LastTimestamp:2026-04-06 11:57:01.779196004 +0000 UTC m=+0.766938870,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.033732 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4d5928\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4d5928 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664147752 +0000 UTC m=+0.651890618,LastTimestamp:2026-04-06 11:57:01.779625033 +0000 UTC m=+0.767367899,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.042116 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4dc0d8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4dc0d8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664174296 +0000 UTC m=+0.651917162,LastTimestamp:2026-04-06 11:57:01.779642299 +0000 UTC m=+0.767385165,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.047709 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4de70a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4de70a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664184074 +0000 UTC m=+0.651926940,LastTimestamp:2026-04-06 11:57:01.779662425 +0000 UTC m=+0.767405301,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.053732 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4d5928\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4d5928 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664147752 +0000 UTC m=+0.651890618,LastTimestamp:2026-04-06 11:57:01.780124887 +0000 UTC m=+0.767867753,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.058970 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4dc0d8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4dc0d8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664174296 +0000 UTC m=+0.651917162,LastTimestamp:2026-04-06 11:57:01.780139214 +0000 UTC m=+0.767882080,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.064647 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4de70a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4de70a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664184074 +0000 UTC 
m=+0.651926940,LastTimestamp:2026-04-06 11:57:01.780147742 +0000 UTC m=+0.767890608,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.070515 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4d5928\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4d5928 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664147752 +0000 UTC m=+0.651890618,LastTimestamp:2026-04-06 11:57:01.780925288 +0000 UTC m=+0.768668144,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.076131 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4dc0d8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4dc0d8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664174296 +0000 UTC m=+0.651917162,LastTimestamp:2026-04-06 11:57:01.780945103 +0000 UTC m=+0.768687969,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.083921 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4de70a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4de70a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664184074 +0000 UTC m=+0.651926940,LastTimestamp:2026-04-06 11:57:01.780953252 +0000 UTC m=+0.768696108,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.091434 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4d5928\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4d5928 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664147752 +0000 UTC m=+0.651890618,LastTimestamp:2026-04-06 11:57:01.781863229 +0000 UTC m=+0.769606095,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.097232 4790 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18a3c2949d4dc0d8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18a3c2949d4dc0d8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:01.664174296 +0000 UTC m=+0.651917162,LastTimestamp:2026-04-06 11:57:01.781875886 +0000 UTC m=+0.769618752,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.104521 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18a3c294ba95da95 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.155438741 +0000 UTC m=+1.143181637,LastTimestamp:2026-04-06 11:57:02.155438741 +0000 UTC m=+1.143181637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.110715 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c294ba962ea3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.155460259 +0000 UTC m=+1.143203135,LastTimestamp:2026-04-06 11:57:02.155460259 +0000 UTC m=+1.143203135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.115662 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c294babc00a0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.157938848 +0000 UTC m=+1.145681744,LastTimestamp:2026-04-06 11:57:02.157938848 +0000 UTC m=+1.145681744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.120488 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c294bc346ea6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.18260855 +0000 UTC m=+1.170351416,LastTimestamp:2026-04-06 11:57:02.18260855 +0000 UTC m=+1.170351416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.125515 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a3c294bc358cb8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.182681784 +0000 UTC m=+1.170424680,LastTimestamp:2026-04-06 11:57:02.182681784 +0000 UTC m=+1.170424680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.132252 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c294d9e3abd9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.680632281 +0000 UTC m=+1.668375147,LastTimestamp:2026-04-06 11:57:02.680632281 +0000 UTC m=+1.668375147,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: I0406 11:57:26.132795 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 06 11:57:26 crc kubenswrapper[4790]: I0406 11:57:26.132895 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.140302 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a3c294d9e46601 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.680679937 +0000 UTC m=+1.668422803,LastTimestamp:2026-04-06 11:57:02.680679937 +0000 UTC m=+1.668422803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.146281 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c294da073dcb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.682963403 +0000 UTC m=+1.670706269,LastTimestamp:2026-04-06 11:57:02.682963403 +0000 UTC m=+1.670706269,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.151894 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18a3c294da0c3837 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.683289655 +0000 UTC m=+1.671032521,LastTimestamp:2026-04-06 11:57:02.683289655 +0000 UTC m=+1.671032521,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.159041 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c294da5b9f82 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.688493442 +0000 UTC m=+1.676236308,LastTimestamp:2026-04-06 11:57:02.688493442 +0000 UTC m=+1.676236308,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.164915 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c294da8c07da openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.691665882 +0000 UTC m=+1.679408748,LastTimestamp:2026-04-06 11:57:02.691665882 +0000 UTC m=+1.679408748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.169703 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c294da9ed320 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.692897568 +0000 UTC m=+1.680640444,LastTimestamp:2026-04-06 11:57:02.692897568 +0000 UTC m=+1.680640444,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.175311 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a3c294dac5869a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.695433882 +0000 UTC m=+1.683176758,LastTimestamp:2026-04-06 11:57:02.695433882 +0000 UTC m=+1.683176758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.185422 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c294db13b2e2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.700557026 +0000 UTC m=+1.688299892,LastTimestamp:2026-04-06 11:57:02.700557026 +0000 UTC m=+1.688299892,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.191291 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18a3c294db37d5ef openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.702925295 +0000 UTC m=+1.690668161,LastTimestamp:2026-04-06 11:57:02.702925295 +0000 UTC 
m=+1.690668161,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.196472 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c294db6a17a3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.706218915 +0000 UTC m=+1.693961781,LastTimestamp:2026-04-06 11:57:02.706218915 +0000 UTC m=+1.693961781,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.200814 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c294ed90acd4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.010737364 +0000 UTC 
m=+1.998480270,LastTimestamp:2026-04-06 11:57:03.010737364 +0000 UTC m=+1.998480270,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.205186 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c294ee08189a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.018563738 +0000 UTC m=+2.006306644,LastTimestamp:2026-04-06 11:57:03.018563738 +0000 UTC m=+2.006306644,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.209017 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c294ee1fa135 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.020106037 +0000 UTC m=+2.007848943,LastTimestamp:2026-04-06 11:57:03.020106037 +0000 UTC m=+2.007848943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.213510 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c294fa1304ab openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.220606123 +0000 UTC m=+2.208348989,LastTimestamp:2026-04-06 11:57:03.220606123 +0000 UTC m=+2.208348989,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.219860 4790 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c294fad8b57d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.233561981 +0000 UTC m=+2.221304837,LastTimestamp:2026-04-06 11:57:03.233561981 +0000 UTC m=+2.221304837,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.223308 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c294faea117d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 
11:57:03.234699645 +0000 UTC m=+2.222442511,LastTimestamp:2026-04-06 11:57:03.234699645 +0000 UTC m=+2.222442511,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.228023 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c29506ca94f0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.433962736 +0000 UTC m=+2.421705602,LastTimestamp:2026-04-06 11:57:03.433962736 +0000 UTC m=+2.421705602,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.233382 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c2950745cf8e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.44203867 +0000 UTC m=+2.429781566,LastTimestamp:2026-04-06 11:57:03.44203867 +0000 UTC m=+2.429781566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.238003 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a3c29515fc1f1f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.688867615 +0000 UTC m=+2.676610481,LastTimestamp:2026-04-06 11:57:03.688867615 +0000 UTC m=+2.676610481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.243422 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c29516492022 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.693914146 +0000 UTC m=+2.681657012,LastTimestamp:2026-04-06 11:57:03.693914146 +0000 UTC m=+2.681657012,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.248235 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18a3c2951695f930 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.698950448 +0000 UTC m=+2.686693324,LastTimestamp:2026-04-06 
11:57:03.698950448 +0000 UTC m=+2.686693324,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.253623 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c2951696d99f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.699007903 +0000 UTC m=+2.686750809,LastTimestamp:2026-04-06 11:57:03.699007903 +0000 UTC m=+2.686750809,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.259133 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18a3c2952620ec8f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created 
container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.959714959 +0000 UTC m=+2.947457825,LastTimestamp:2026-04-06 11:57:03.959714959 +0000 UTC m=+2.947457825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.266435 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a3c295264de298 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.962661528 +0000 UTC m=+2.950404394,LastTimestamp:2026-04-06 11:57:03.962661528 +0000 UTC m=+2.950404394,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.270461 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295264f22c3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.962743491 +0000 UTC m=+2.950486357,LastTimestamp:2026-04-06 11:57:03.962743491 +0000 UTC m=+2.950486357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.275989 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c29526503635 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.962814005 +0000 UTC m=+2.950556871,LastTimestamp:2026-04-06 11:57:03.962814005 +0000 UTC m=+2.950556871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.280305 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18a3c29526f6bd31 openshift-machine-config-operator 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.973727537 +0000 UTC m=+2.961470413,LastTimestamp:2026-04-06 11:57:03.973727537 +0000 UTC m=+2.961470413,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.286425 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c295270d2846 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.975196742 +0000 UTC m=+2.962939608,LastTimestamp:2026-04-06 11:57:03.975196742 +0000 UTC m=+2.962939608,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.292108 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c295271c2ba8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.976180648 +0000 UTC m=+2.963923504,LastTimestamp:2026-04-06 11:57:03.976180648 +0000 UTC m=+2.963923504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.296795 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a3c295272a3646 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.97710087 +0000 UTC m=+2.964843736,LastTimestamp:2026-04-06 11:57:03.97710087 +0000 UTC m=+2.964843736,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.302068 4790 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a3c2952733b097 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.977722007 +0000 UTC m=+2.965464873,LastTimestamp:2026-04-06 11:57:03.977722007 +0000 UTC m=+2.965464873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.308448 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c2952741cb06 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.978646278 +0000 UTC m=+2.966389144,LastTimestamp:2026-04-06 11:57:03.978646278 +0000 UTC m=+2.966389144,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.318419 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c29531831bbd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.150698941 +0000 UTC m=+3.138441827,LastTimestamp:2026-04-06 11:57:04.150698941 +0000 UTC m=+3.138441827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.325400 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a3c2953197a2f0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.152044272 +0000 UTC 
m=+3.139787138,LastTimestamp:2026-04-06 11:57:04.152044272 +0000 UTC m=+3.139787138,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.331068 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c29532788474 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.166782068 +0000 UTC m=+3.154524934,LastTimestamp:2026-04-06 11:57:04.166782068 +0000 UTC m=+3.154524934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.338040 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c295328ca5dc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.16810134 +0000 UTC m=+3.155844216,LastTimestamp:2026-04-06 11:57:04.16810134 +0000 UTC m=+3.155844216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.343404 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a3c2953294880d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.168617997 +0000 UTC m=+3.156360863,LastTimestamp:2026-04-06 11:57:04.168617997 +0000 UTC m=+3.156360863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.347588 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a3c29532a60fc8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.169766856 +0000 UTC m=+3.157509722,LastTimestamp:2026-04-06 11:57:04.169766856 +0000 UTC m=+3.157509722,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.353023 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a3c2953df1863f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.359261759 +0000 UTC m=+3.347004635,LastTimestamp:2026-04-06 11:57:04.359261759 +0000 UTC m=+3.347004635,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.357969 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c2953e17808e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.36175067 +0000 UTC m=+3.349493576,LastTimestamp:2026-04-06 11:57:04.36175067 +0000 UTC m=+3.349493576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.362792 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18a3c2953f089c19 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.377551897 +0000 UTC m=+3.365294763,LastTimestamp:2026-04-06 11:57:04.377551897 +0000 UTC m=+3.365294763,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 
11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.369014 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c2953f3a3052 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.380801106 +0000 UTC m=+3.368543972,LastTimestamp:2026-04-06 11:57:04.380801106 +0000 UTC m=+3.368543972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.373614 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c2953f54bd90 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 
11:57:04.3825412 +0000 UTC m=+3.370284056,LastTimestamp:2026-04-06 11:57:04.3825412 +0000 UTC m=+3.370284056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.378805 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c2954b8e26a1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.587630241 +0000 UTC m=+3.575373127,LastTimestamp:2026-04-06 11:57:04.587630241 +0000 UTC m=+3.575373127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.383083 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c2954c86a73f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.603916095 +0000 UTC m=+3.591658971,LastTimestamp:2026-04-06 11:57:04.603916095 +0000 UTC m=+3.591658971,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.387136 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c2954c98df92 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.605110162 +0000 UTC m=+3.592853038,LastTimestamp:2026-04-06 11:57:04.605110162 +0000 UTC m=+3.592853038,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.394620 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c2955376d7d1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.720320465 +0000 UTC m=+3.708063331,LastTimestamp:2026-04-06 11:57:04.720320465 +0000 UTC m=+3.708063331,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.401928 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c295578463b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.78831711 +0000 UTC m=+3.776059976,LastTimestamp:2026-04-06 11:57:04.78831711 +0000 UTC m=+3.776059976,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.408135 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c29558b4d407 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.808268807 +0000 UTC m=+3.796011673,LastTimestamp:2026-04-06 11:57:04.808268807 +0000 UTC m=+3.796011673,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.414012 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295620769bd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.964676029 +0000 UTC m=+3.952418895,LastTimestamp:2026-04-06 11:57:04.964676029 +0000 UTC m=+3.952418895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.420361 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c29562b9f704 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.976377604 +0000 UTC m=+3.964120470,LastTimestamp:2026-04-06 11:57:04.976377604 +0000 UTC m=+3.964120470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.427144 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295901c1dff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:05.737784831 +0000 UTC m=+4.725527757,LastTimestamp:2026-04-06 11:57:05.737784831 +0000 UTC m=+4.725527757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.431544 4790 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c2959f793c73 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:05.995545715 +0000 UTC m=+4.983288621,LastTimestamp:2026-04-06 11:57:05.995545715 +0000 UTC m=+4.983288621,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.437718 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295a05100fa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:06.009686266 +0000 UTC m=+4.997429142,LastTimestamp:2026-04-06 11:57:06.009686266 +0000 UTC m=+4.997429142,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.443410 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295a0678dc9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:06.011164105 +0000 UTC m=+4.998907011,LastTimestamp:2026-04-06 11:57:06.011164105 +0000 UTC m=+4.998907011,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.448413 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295af9851eb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:06.266018283 +0000 UTC m=+5.253761149,LastTimestamp:2026-04-06 11:57:06.266018283 +0000 UTC m=+5.253761149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.453446 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295b069ce1f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:06.279747103 +0000 UTC m=+5.267489979,LastTimestamp:2026-04-06 11:57:06.279747103 +0000 UTC m=+5.267489979,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.457550 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295b07bbff4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:06.280923124 +0000 UTC m=+5.268666000,LastTimestamp:2026-04-06 11:57:06.280923124 +0000 UTC m=+5.268666000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.464239 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295beb5131b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:06.519560987 +0000 UTC m=+5.507303863,LastTimestamp:2026-04-06 11:57:06.519560987 +0000 UTC m=+5.507303863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.468497 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295bfa3bf89 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:06.535202697 +0000 UTC m=+5.522945573,LastTimestamp:2026-04-06 11:57:06.535202697 +0000 UTC m=+5.522945573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.473911 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295bfc145f0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:06.537137648 +0000 UTC m=+5.524880524,LastTimestamp:2026-04-06 11:57:06.537137648 +0000 UTC m=+5.524880524,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.481725 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295cfaaf2db openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:06.804110043 +0000 UTC m=+5.791852939,LastTimestamp:2026-04-06 11:57:06.804110043 +0000 UTC m=+5.791852939,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.490150 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295d11f3c52 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:06.828508242 +0000 UTC m=+5.816251138,LastTimestamp:2026-04-06 11:57:06.828508242 +0000 UTC m=+5.816251138,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.496349 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295d13ef2bd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:06.830586557 +0000 UTC m=+5.818329463,LastTimestamp:2026-04-06 11:57:06.830586557 +0000 UTC m=+5.818329463,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.503272 4790 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295e19e1f32 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:07.105259314 +0000 UTC m=+6.093002220,LastTimestamp:2026-04-06 11:57:07.105259314 +0000 UTC m=+6.093002220,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.508964 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18a3c295e30c0059 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:07.129237593 +0000 UTC m=+6.116980449,LastTimestamp:2026-04-06 11:57:07.129237593 +0000 UTC m=+6.116980449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.519054 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Apr 06 11:57:26 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-apiserver-crc.18a3c297ef477ac8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Apr 06 11:57:26 crc kubenswrapper[4790]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Apr 06 11:57:26 crc kubenswrapper[4790]: Apr 06 11:57:26 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:15.924396744 +0000 UTC m=+14.912139610,LastTimestamp:2026-04-06 11:57:15.924396744 +0000 UTC m=+14.912139610,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Apr 06 11:57:26 crc kubenswrapper[4790]: > Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.521059 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c297ef4894bd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:15.924468925 +0000 UTC m=+14.912211811,LastTimestamp:2026-04-06 11:57:15.924468925 +0000 UTC m=+14.912211811,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.526411 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18a3c297ef477ac8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Apr 06 11:57:26 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-apiserver-crc.18a3c297ef477ac8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Apr 06 11:57:26 crc kubenswrapper[4790]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Apr 06 11:57:26 crc kubenswrapper[4790]: Apr 06 11:57:26 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:15.924396744 +0000 UTC m=+14.912139610,LastTimestamp:2026-04-06 11:57:15.930295038 +0000 UTC m=+14.918037904,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Apr 06 11:57:26 crc kubenswrapper[4790]: > Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.531427 4790 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.18a3c297ef4894bd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c297ef4894bd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:15.924468925 +0000 UTC m=+14.912211811,LastTimestamp:2026-04-06 11:57:15.93036091 +0000 UTC m=+14.918103776,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.538228 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Apr 06 11:57:26 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-apiserver-crc.18a3c297f8f20731 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Apr 06 11:57:26 crc kubenswrapper[4790]: body: [+]ping ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]log ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]etcd ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Apr 06 11:57:26 crc 
kubenswrapper[4790]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-api-request-count-filter ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startkubeinformers ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-config-consumer ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-filter ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/start-apiextensions-informers ok Apr 06 11:57:26 crc kubenswrapper[4790]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Apr 06 11:57:26 crc kubenswrapper[4790]: [-]poststarthook/crd-informer-synced failed: reason withheld Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/start-system-namespaces-controller ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/start-cluster-authentication-info-controller ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/start-legacy-token-tracking-controller ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/start-service-ip-repair-controllers ok Apr 06 11:57:26 crc kubenswrapper[4790]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Apr 06 11:57:26 crc kubenswrapper[4790]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason 
withheld Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-config-producer ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/bootstrap-controller ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/start-kube-aggregator-informers ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/apiservice-status-local-available-controller ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/apiservice-status-remote-available-controller ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/apiservice-registration-controller ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/apiservice-wait-for-first-sync ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/apiservice-discovery-controller ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/kube-apiserver-autoregistration ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]autoregister-completion ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/apiservice-openapi-controller ok Apr 06 11:57:26 crc kubenswrapper[4790]: [+]poststarthook/apiservice-openapiv3-controller ok Apr 06 11:57:26 crc kubenswrapper[4790]: livez check failed Apr 06 11:57:26 crc kubenswrapper[4790]: Apr 06 11:57:26 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:16.086568753 +0000 UTC m=+15.074311619,LastTimestamp:2026-04-06 11:57:16.086568753 +0000 UTC m=+15.074311619,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Apr 06 11:57:26 crc kubenswrapper[4790]: > Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.547742 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c297f8f2a728 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:16.086609704 +0000 UTC m=+15.074352570,LastTimestamp:2026-04-06 11:57:16.086609704 +0000 UTC m=+15.074352570,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.554476 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Apr 06 11:57:26 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-controller-manager-crc.18a3c297fbba8055 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Apr 06 11:57:26 crc kubenswrapper[4790]: body: Apr 06 11:57:26 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:16.133261397 +0000 UTC m=+15.121004303,LastTimestamp:2026-04-06 11:57:16.133261397 +0000 UTC 
m=+15.121004303,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Apr 06 11:57:26 crc kubenswrapper[4790]: > Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.563379 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c297fbbbcaa7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:16.133345959 +0000 UTC m=+15.121088855,LastTimestamp:2026-04-06 11:57:16.133345959 +0000 UTC m=+15.121088855,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.571700 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18a3c2954c98df92\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18a3c2954c98df92 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:04.605110162 +0000 UTC m=+3.592853038,LastTimestamp:2026-04-06 11:57:16.789497744 +0000 UTC m=+15.777240650,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.582367 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18a3c297fbba8055\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Apr 06 11:57:26 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-controller-manager-crc.18a3c297fbba8055 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Apr 06 11:57:26 crc kubenswrapper[4790]: body: Apr 06 11:57:26 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:16.133261397 +0000 UTC m=+15.121004303,LastTimestamp:2026-04-06 11:57:26.132871892 +0000 UTC 
m=+25.120614768,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Apr 06 11:57:26 crc kubenswrapper[4790]: > Apr 06 11:57:26 crc kubenswrapper[4790]: E0406 11:57:26.589272 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18a3c297fbbbcaa7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c297fbbbcaa7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:16.133345959 +0000 UTC m=+15.121088855,LastTimestamp:2026-04-06 11:57:26.132932824 +0000 UTC m=+25.120675710,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:26 crc kubenswrapper[4790]: I0406 11:57:26.622561 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:27 crc kubenswrapper[4790]: I0406 11:57:27.622881 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot 
get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:28 crc kubenswrapper[4790]: I0406 11:57:28.620897 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:28 crc kubenswrapper[4790]: W0406 11:57:28.647324 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Apr 06 11:57:28 crc kubenswrapper[4790]: E0406 11:57:28.647416 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Apr 06 11:57:29 crc kubenswrapper[4790]: I0406 11:57:29.345263 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:29 crc kubenswrapper[4790]: I0406 11:57:29.347242 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:29 crc kubenswrapper[4790]: I0406 11:57:29.347326 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:29 crc kubenswrapper[4790]: I0406 11:57:29.347357 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:29 crc kubenswrapper[4790]: I0406 11:57:29.347409 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 06 11:57:29 crc kubenswrapper[4790]: E0406 11:57:29.352615 4790 
kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Apr 06 11:57:29 crc kubenswrapper[4790]: E0406 11:57:29.353551 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Apr 06 11:57:29 crc kubenswrapper[4790]: I0406 11:57:29.623484 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:30 crc kubenswrapper[4790]: W0406 11:57:30.331114 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:30 crc kubenswrapper[4790]: E0406 11:57:30.331961 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Apr 06 11:57:30 crc kubenswrapper[4790]: I0406 11:57:30.620582 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:31 crc kubenswrapper[4790]: I0406 11:57:31.622804 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:31 crc kubenswrapper[4790]: E0406 11:57:31.745371 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 06 11:57:32 crc kubenswrapper[4790]: I0406 11:57:32.623151 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:33 crc kubenswrapper[4790]: I0406 11:57:33.624200 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:33 crc kubenswrapper[4790]: I0406 11:57:33.866917 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:36998->192.168.126.11:10357: read: connection reset by peer" start-of-body= Apr 06 11:57:33 crc kubenswrapper[4790]: I0406 11:57:33.867036 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:36998->192.168.126.11:10357: read: connection reset by peer" Apr 06 11:57:33 crc kubenswrapper[4790]: I0406 11:57:33.867127 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 
11:57:33 crc kubenswrapper[4790]: I0406 11:57:33.867378 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:33 crc kubenswrapper[4790]: I0406 11:57:33.869201 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:33 crc kubenswrapper[4790]: I0406 11:57:33.869345 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:33 crc kubenswrapper[4790]: I0406 11:57:33.869414 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:33 crc kubenswrapper[4790]: I0406 11:57:33.870078 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"5427bbbe8cc4fbd4c4b664767b3fbd65b58eb0b53e8f092a815090b9e38f31f2"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Apr 06 11:57:33 crc kubenswrapper[4790]: I0406 11:57:33.870372 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://5427bbbe8cc4fbd4c4b664767b3fbd65b58eb0b53e8f092a815090b9e38f31f2" gracePeriod=30 Apr 06 11:57:33 crc kubenswrapper[4790]: E0406 11:57:33.876068 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Apr 06 11:57:33 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-controller-manager-crc.18a3c29c1cbe0158 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:36998->192.168.126.11:10357: read: connection reset by peer Apr 06 11:57:33 crc kubenswrapper[4790]: body: Apr 06 11:57:33 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:33.867008344 +0000 UTC m=+32.854751240,LastTimestamp:2026-04-06 11:57:33.867008344 +0000 UTC m=+32.854751240,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Apr 06 11:57:33 crc kubenswrapper[4790]: > Apr 06 11:57:33 crc kubenswrapper[4790]: E0406 11:57:33.883688 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c29c1cbf14a0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:36998->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:33.867078816 +0000 UTC m=+32.854821712,LastTimestamp:2026-04-06 11:57:33.867078816 +0000 UTC m=+32.854821712,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:33 crc kubenswrapper[4790]: E0406 11:57:33.891208 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c29c1cf0fcdc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:33.870349532 +0000 UTC m=+32.858092398,LastTimestamp:2026-04-06 11:57:33.870349532 +0000 UTC m=+32.858092398,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:34 crc kubenswrapper[4790]: E0406 11:57:34.405401 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18a3c294da9ed320\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c294da9ed320 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:02.692897568 +0000 UTC m=+1.680640444,LastTimestamp:2026-04-06 11:57:34.397042054 +0000 UTC m=+33.384784960,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:34 crc kubenswrapper[4790]: I0406 11:57:34.623235 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:34 crc kubenswrapper[4790]: E0406 11:57:34.654674 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18a3c294ed90acd4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c294ed90acd4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.010737364 +0000 UTC m=+1.998480270,LastTimestamp:2026-04-06 11:57:34.648986436 +0000 UTC m=+33.636729302,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:34 crc kubenswrapper[4790]: E0406 11:57:34.670210 4790 event.go:359] 
"Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18a3c294ee08189a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18a3c294ee08189a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:57:03.018563738 +0000 UTC m=+2.006306644,LastTimestamp:2026-04-06 11:57:34.661816592 +0000 UTC m=+33.649559468,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 11:57:34 crc kubenswrapper[4790]: I0406 11:57:34.863905 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Apr 06 11:57:34 crc kubenswrapper[4790]: I0406 11:57:34.864460 4790 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5427bbbe8cc4fbd4c4b664767b3fbd65b58eb0b53e8f092a815090b9e38f31f2" exitCode=255 Apr 06 11:57:34 crc kubenswrapper[4790]: I0406 11:57:34.864520 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5427bbbe8cc4fbd4c4b664767b3fbd65b58eb0b53e8f092a815090b9e38f31f2"} Apr 06 11:57:34 crc kubenswrapper[4790]: I0406 11:57:34.864596 4790 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"86d1b7c8d0d0d435b0c335d9e6690875a3f71d8c562db2d731ac0de6d1a26758"} Apr 06 11:57:34 crc kubenswrapper[4790]: I0406 11:57:34.864762 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:34 crc kubenswrapper[4790]: I0406 11:57:34.866329 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:34 crc kubenswrapper[4790]: I0406 11:57:34.866399 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:34 crc kubenswrapper[4790]: I0406 11:57:34.866414 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:35 crc kubenswrapper[4790]: I0406 11:57:35.622495 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:35 crc kubenswrapper[4790]: I0406 11:57:35.675454 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:35 crc kubenswrapper[4790]: I0406 11:57:35.677159 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:35 crc kubenswrapper[4790]: I0406 11:57:35.677211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:35 crc kubenswrapper[4790]: I0406 11:57:35.677230 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:35 crc kubenswrapper[4790]: I0406 11:57:35.678074 4790 scope.go:117] "RemoveContainer" 
containerID="3af1e86511ca47c6cacd98b86cade32c55499829cb272c0044ded199b5c23bd3" Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.353504 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.354959 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.355014 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.355025 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.355068 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 06 11:57:36 crc kubenswrapper[4790]: E0406 11:57:36.358673 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Apr 06 11:57:36 crc kubenswrapper[4790]: E0406 11:57:36.358786 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.621123 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.871689 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.872325 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.875127 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f48e3c86d27d0961019f0d669b0f8b0fddac97fa1bd3ba0640323b68fe45559d" exitCode=255 Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.875177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f48e3c86d27d0961019f0d669b0f8b0fddac97fa1bd3ba0640323b68fe45559d"} Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.875250 4790 scope.go:117] "RemoveContainer" containerID="3af1e86511ca47c6cacd98b86cade32c55499829cb272c0044ded199b5c23bd3" Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.875403 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.876671 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.876705 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.876740 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:36 crc kubenswrapper[4790]: I0406 11:57:36.877381 4790 scope.go:117] "RemoveContainer" containerID="f48e3c86d27d0961019f0d669b0f8b0fddac97fa1bd3ba0640323b68fe45559d" Apr 06 11:57:36 
crc kubenswrapper[4790]: E0406 11:57:36.877562 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 06 11:57:37 crc kubenswrapper[4790]: I0406 11:57:37.172032 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 11:57:37 crc kubenswrapper[4790]: I0406 11:57:37.172253 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:37 crc kubenswrapper[4790]: I0406 11:57:37.173419 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:37 crc kubenswrapper[4790]: I0406 11:57:37.173459 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:37 crc kubenswrapper[4790]: I0406 11:57:37.173472 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:37 crc kubenswrapper[4790]: I0406 11:57:37.621032 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:37 crc kubenswrapper[4790]: I0406 11:57:37.880568 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Apr 06 11:57:38 crc kubenswrapper[4790]: I0406 11:57:38.114498 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 11:57:38 crc kubenswrapper[4790]: I0406 11:57:38.114753 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:38 crc kubenswrapper[4790]: I0406 11:57:38.116423 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:38 crc kubenswrapper[4790]: I0406 11:57:38.116473 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:38 crc kubenswrapper[4790]: I0406 11:57:38.116485 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:38 crc kubenswrapper[4790]: I0406 11:57:38.117598 4790 scope.go:117] "RemoveContainer" containerID="f48e3c86d27d0961019f0d669b0f8b0fddac97fa1bd3ba0640323b68fe45559d" Apr 06 11:57:38 crc kubenswrapper[4790]: E0406 11:57:38.117828 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 06 11:57:38 crc kubenswrapper[4790]: I0406 11:57:38.400962 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 11:57:38 crc kubenswrapper[4790]: I0406 11:57:38.623252 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:38 crc kubenswrapper[4790]: I0406 11:57:38.887725 4790 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Apr 06 11:57:38 crc kubenswrapper[4790]: I0406 11:57:38.890371 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:38 crc kubenswrapper[4790]: I0406 11:57:38.890430 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:38 crc kubenswrapper[4790]: I0406 11:57:38.890498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:38 crc kubenswrapper[4790]: I0406 11:57:38.891470 4790 scope.go:117] "RemoveContainer" containerID="f48e3c86d27d0961019f0d669b0f8b0fddac97fa1bd3ba0640323b68fe45559d" Apr 06 11:57:38 crc kubenswrapper[4790]: E0406 11:57:38.891776 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 06 11:57:39 crc kubenswrapper[4790]: I0406 11:57:39.623586 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:40 crc kubenswrapper[4790]: I0406 11:57:40.621728 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:41 crc kubenswrapper[4790]: I0406 11:57:41.621484 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io 
"crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:41 crc kubenswrapper[4790]: E0406 11:57:41.746573 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 06 11:57:42 crc kubenswrapper[4790]: I0406 11:57:42.620319 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.132129 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.133004 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.134648 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.134719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.134738 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.140419 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 11:57:43 crc kubenswrapper[4790]: W0406 11:57:43.261160 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Apr 06 
11:57:43 crc kubenswrapper[4790]: E0406 11:57:43.261240 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.358784 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.361541 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.361595 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.361615 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.361654 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 06 11:57:43 crc kubenswrapper[4790]: E0406 11:57:43.362756 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Apr 06 11:57:43 crc kubenswrapper[4790]: E0406 11:57:43.372071 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.623384 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.900952 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.901867 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.901911 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:43 crc kubenswrapper[4790]: I0406 11:57:43.901922 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:44 crc kubenswrapper[4790]: I0406 11:57:44.624361 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:44 crc kubenswrapper[4790]: W0406 11:57:44.985314 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Apr 06 11:57:44 crc kubenswrapper[4790]: E0406 11:57:44.985401 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Apr 06 11:57:45 crc kubenswrapper[4790]: I0406 11:57:45.626735 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot 
get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:46 crc kubenswrapper[4790]: I0406 11:57:46.624436 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:47 crc kubenswrapper[4790]: I0406 11:57:47.177572 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 11:57:47 crc kubenswrapper[4790]: I0406 11:57:47.177765 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:47 crc kubenswrapper[4790]: I0406 11:57:47.179381 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:47 crc kubenswrapper[4790]: I0406 11:57:47.179418 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:47 crc kubenswrapper[4790]: I0406 11:57:47.179430 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:47 crc kubenswrapper[4790]: W0406 11:57:47.337331 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Apr 06 11:57:47 crc kubenswrapper[4790]: E0406 11:57:47.337406 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Apr 06 11:57:47 crc kubenswrapper[4790]: I0406 11:57:47.623306 4790 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:48 crc kubenswrapper[4790]: I0406 11:57:48.622755 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:49 crc kubenswrapper[4790]: I0406 11:57:49.623232 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:50 crc kubenswrapper[4790]: E0406 11:57:50.371442 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Apr 06 11:57:50 crc kubenswrapper[4790]: I0406 11:57:50.372402 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:50 crc kubenswrapper[4790]: I0406 11:57:50.374163 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:50 crc kubenswrapper[4790]: I0406 11:57:50.374213 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:50 crc kubenswrapper[4790]: I0406 11:57:50.374223 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:50 crc kubenswrapper[4790]: I0406 11:57:50.374254 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 06 11:57:50 crc kubenswrapper[4790]: E0406 
11:57:50.379066 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Apr 06 11:57:50 crc kubenswrapper[4790]: I0406 11:57:50.623333 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:51 crc kubenswrapper[4790]: W0406 11:57:51.406470 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:51 crc kubenswrapper[4790]: E0406 11:57:51.406566 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Apr 06 11:57:51 crc kubenswrapper[4790]: I0406 11:57:51.623878 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:51 crc kubenswrapper[4790]: E0406 11:57:51.747338 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 06 11:57:52 crc kubenswrapper[4790]: I0406 11:57:52.624670 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Apr 06 11:57:52 crc kubenswrapper[4790]: I0406 11:57:52.675434 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:52 crc kubenswrapper[4790]: I0406 11:57:52.677114 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:52 crc kubenswrapper[4790]: I0406 11:57:52.677172 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:52 crc kubenswrapper[4790]: I0406 11:57:52.677196 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:52 crc kubenswrapper[4790]: I0406 11:57:52.678176 4790 scope.go:117] "RemoveContainer" containerID="f48e3c86d27d0961019f0d669b0f8b0fddac97fa1bd3ba0640323b68fe45559d" Apr 06 11:57:52 crc kubenswrapper[4790]: E0406 11:57:52.678489 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 06 11:57:53 crc kubenswrapper[4790]: I0406 11:57:53.624746 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:54 crc kubenswrapper[4790]: I0406 11:57:54.622073 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:55 crc 
kubenswrapper[4790]: I0406 11:57:55.305536 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 11:57:55 crc kubenswrapper[4790]: I0406 11:57:55.305786 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:55 crc kubenswrapper[4790]: I0406 11:57:55.307726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:55 crc kubenswrapper[4790]: I0406 11:57:55.307789 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:55 crc kubenswrapper[4790]: I0406 11:57:55.307808 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:55 crc kubenswrapper[4790]: I0406 11:57:55.623943 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:56 crc kubenswrapper[4790]: I0406 11:57:56.623907 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:57 crc kubenswrapper[4790]: I0406 11:57:57.379490 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:57:57 crc kubenswrapper[4790]: E0406 11:57:57.380313 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Apr 06 11:57:57 crc kubenswrapper[4790]: 
I0406 11:57:57.381500 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:57:57 crc kubenswrapper[4790]: I0406 11:57:57.381683 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:57:57 crc kubenswrapper[4790]: I0406 11:57:57.381717 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:57:57 crc kubenswrapper[4790]: I0406 11:57:57.381776 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 06 11:57:57 crc kubenswrapper[4790]: E0406 11:57:57.389689 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Apr 06 11:57:57 crc kubenswrapper[4790]: I0406 11:57:57.624373 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:58 crc kubenswrapper[4790]: I0406 11:57:58.622166 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:57:59 crc kubenswrapper[4790]: I0406 11:57:59.623057 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:58:00 crc kubenswrapper[4790]: I0406 11:58:00.621485 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:58:01 crc kubenswrapper[4790]: I0406 11:58:01.621667 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:58:01 crc kubenswrapper[4790]: E0406 11:58:01.748300 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 06 11:58:02 crc kubenswrapper[4790]: I0406 11:58:02.623209 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:58:03 crc kubenswrapper[4790]: I0406 11:58:03.625119 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:58:04 crc kubenswrapper[4790]: E0406 11:58:04.388818 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Apr 06 11:58:04 crc kubenswrapper[4790]: I0406 11:58:04.389911 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:58:04 crc kubenswrapper[4790]: I0406 11:58:04.391636 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:04 crc kubenswrapper[4790]: I0406 11:58:04.391695 4790 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:04 crc kubenswrapper[4790]: I0406 11:58:04.391714 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:04 crc kubenswrapper[4790]: I0406 11:58:04.391753 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Apr 06 11:58:04 crc kubenswrapper[4790]: E0406 11:58:04.398486 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Apr 06 11:58:04 crc kubenswrapper[4790]: I0406 11:58:04.621903 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:58:05 crc kubenswrapper[4790]: I0406 11:58:05.618021 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 06 11:58:05 crc kubenswrapper[4790]: I0406 11:58:05.674979 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:58:05 crc kubenswrapper[4790]: I0406 11:58:05.676452 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:05 crc kubenswrapper[4790]: I0406 11:58:05.676514 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:05 crc kubenswrapper[4790]: I0406 11:58:05.676537 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:05 crc kubenswrapper[4790]: I0406 11:58:05.677780 4790 scope.go:117] 
"RemoveContainer" containerID="f48e3c86d27d0961019f0d669b0f8b0fddac97fa1bd3ba0640323b68fe45559d" Apr 06 11:58:06 crc kubenswrapper[4790]: I0406 11:58:06.399669 4790 csr.go:261] certificate signing request csr-jgmt6 is approved, waiting to be issued Apr 06 11:58:06 crc kubenswrapper[4790]: I0406 11:58:06.409058 4790 csr.go:257] certificate signing request csr-jgmt6 is issued Apr 06 11:58:06 crc kubenswrapper[4790]: I0406 11:58:06.422281 4790 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 06 11:58:06 crc kubenswrapper[4790]: I0406 11:58:06.503388 4790 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Apr 06 11:58:06 crc kubenswrapper[4790]: I0406 11:58:06.977590 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Apr 06 11:58:06 crc kubenswrapper[4790]: I0406 11:58:06.978381 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Apr 06 11:58:06 crc kubenswrapper[4790]: I0406 11:58:06.980687 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e6224db2cc7254075caa631d8da7467f0dcf0483a05a6236d99fa0b07232cd4b" exitCode=255 Apr 06 11:58:06 crc kubenswrapper[4790]: I0406 11:58:06.980732 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e6224db2cc7254075caa631d8da7467f0dcf0483a05a6236d99fa0b07232cd4b"} Apr 06 11:58:06 crc kubenswrapper[4790]: I0406 11:58:06.980774 4790 scope.go:117] "RemoveContainer" containerID="f48e3c86d27d0961019f0d669b0f8b0fddac97fa1bd3ba0640323b68fe45559d" Apr 06 11:58:06 crc kubenswrapper[4790]: I0406 11:58:06.980927 4790 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:58:06 crc kubenswrapper[4790]: I0406 11:58:06.984540 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:06 crc kubenswrapper[4790]: I0406 11:58:06.984570 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:06 crc kubenswrapper[4790]: I0406 11:58:06.984582 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:06 crc kubenswrapper[4790]: I0406 11:58:06.985241 4790 scope.go:117] "RemoveContainer" containerID="e6224db2cc7254075caa631d8da7467f0dcf0483a05a6236d99fa0b07232cd4b" Apr 06 11:58:06 crc kubenswrapper[4790]: E0406 11:58:06.985412 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 06 11:58:07 crc kubenswrapper[4790]: I0406 11:58:07.410311 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-20 09:56:04.123505638 +0000 UTC Apr 06 11:58:07 crc kubenswrapper[4790]: I0406 11:58:07.410373 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6189h57m56.713134865s for next certificate rotation Apr 06 11:58:07 crc kubenswrapper[4790]: I0406 11:58:07.985070 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Apr 06 11:58:08 crc 
kubenswrapper[4790]: I0406 11:58:08.114029 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 06 11:58:08 crc kubenswrapper[4790]: I0406 11:58:08.114231 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:58:08 crc kubenswrapper[4790]: I0406 11:58:08.116005 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:58:08 crc kubenswrapper[4790]: I0406 11:58:08.116064 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:58:08 crc kubenswrapper[4790]: I0406 11:58:08.116076 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:58:08 crc kubenswrapper[4790]: I0406 11:58:08.116769 4790 scope.go:117] "RemoveContainer" containerID="e6224db2cc7254075caa631d8da7467f0dcf0483a05a6236d99fa0b07232cd4b"
Apr 06 11:58:08 crc kubenswrapper[4790]: E0406 11:58:08.117003 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Apr 06 11:58:08 crc kubenswrapper[4790]: I0406 11:58:08.400703 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 06 11:58:08 crc kubenswrapper[4790]: I0406 11:58:08.990337 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:58:08 crc kubenswrapper[4790]: I0406 11:58:08.991386 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:58:08 crc kubenswrapper[4790]: I0406 11:58:08.991428 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:58:08 crc kubenswrapper[4790]: I0406 11:58:08.991441 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:58:08 crc kubenswrapper[4790]: I0406 11:58:08.992119 4790 scope.go:117] "RemoveContainer" containerID="e6224db2cc7254075caa631d8da7467f0dcf0483a05a6236d99fa0b07232cd4b"
Apr 06 11:58:08 crc kubenswrapper[4790]: E0406 11:58:08.992296 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.399450 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.400718 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.400758 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.400769 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.400900 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.412406 4790 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.412992 4790 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Apr 06 11:58:11 crc kubenswrapper[4790]: E0406 11:58:11.413043 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.417412 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.417470 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.417493 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.417525 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.417552 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:11Z","lastTransitionTime":"2026-04-06T11:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Apr 06 11:58:11 crc kubenswrapper[4790]: E0406 11:58:11.441542 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"981c1c09-3ef4-430f-bdd7-a6309afbd803\\\",\\\"systemUUID\\\":\\\"81d93d39-022f-479c-bd72-2a6f59eabad5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.452136 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.452176 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.452190 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.452211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.452226 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:11Z","lastTransitionTime":"2026-04-06T11:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Apr 06 11:58:11 crc kubenswrapper[4790]: E0406 11:58:11.473904 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"981c1c09-3ef4-430f-bdd7-a6309afbd803\\\",\\\"systemUUID\\\":\\\"81d93d39-022f-479c-bd72-2a6f59eabad5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.481114 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.481185 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.481206 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.481233 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.481260 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:11Z","lastTransitionTime":"2026-04-06T11:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Apr 06 11:58:11 crc kubenswrapper[4790]: E0406 11:58:11.492243 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"981c1c09-3ef4-430f-bdd7-a6309afbd803\\\",\\\"systemUUID\\\":\\\"81d93d39-022f-479c-bd72-2a6f59eabad5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.499717 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.499763 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.499776 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.499798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:11 crc kubenswrapper[4790]: I0406 11:58:11.499811 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:11Z","lastTransitionTime":"2026-04-06T11:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:11 crc kubenswrapper[4790]: E0406 11:58:11.510637 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"981c1c09-3ef4-430f-bdd7-a6309afbd803\\\",\\\"systemUUID\\\":\\\"81d93d39-022f-479c-bd72-2a6f59eabad5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 06 11:58:11 crc kubenswrapper[4790]: E0406 11:58:11.511032 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 06 11:58:11 crc kubenswrapper[4790]: E0406 11:58:11.511078 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:11 crc kubenswrapper[4790]: E0406 11:58:11.611441 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:11 crc kubenswrapper[4790]: E0406 11:58:11.712549 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:11 crc kubenswrapper[4790]: E0406 11:58:11.749340 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 06 11:58:11 crc kubenswrapper[4790]: E0406 11:58:11.812689 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:11 crc kubenswrapper[4790]: E0406 11:58:11.913525 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:12 crc kubenswrapper[4790]: E0406 11:58:12.014521 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:12 crc kubenswrapper[4790]: E0406 11:58:12.115155 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:12 crc kubenswrapper[4790]: E0406 11:58:12.216127 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:12 crc kubenswrapper[4790]: 
E0406 11:58:12.317152 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:12 crc kubenswrapper[4790]: E0406 11:58:12.418303 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:12 crc kubenswrapper[4790]: E0406 11:58:12.518800 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:12 crc kubenswrapper[4790]: E0406 11:58:12.619420 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:12 crc kubenswrapper[4790]: E0406 11:58:12.720217 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:12 crc kubenswrapper[4790]: E0406 11:58:12.820661 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:12 crc kubenswrapper[4790]: E0406 11:58:12.921469 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:13 crc kubenswrapper[4790]: E0406 11:58:13.021913 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:13 crc kubenswrapper[4790]: E0406 11:58:13.122430 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:13 crc kubenswrapper[4790]: E0406 11:58:13.223590 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:13 crc kubenswrapper[4790]: E0406 11:58:13.324348 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:13 crc kubenswrapper[4790]: E0406 11:58:13.425488 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Apr 06 11:58:13 crc kubenswrapper[4790]: E0406 11:58:13.525889 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:13 crc kubenswrapper[4790]: E0406 11:58:13.626230 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:13 crc kubenswrapper[4790]: E0406 11:58:13.726402 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:13 crc kubenswrapper[4790]: E0406 11:58:13.826755 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:13 crc kubenswrapper[4790]: E0406 11:58:13.927937 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:14 crc kubenswrapper[4790]: E0406 11:58:14.028466 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:14 crc kubenswrapper[4790]: E0406 11:58:14.129261 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:14 crc kubenswrapper[4790]: E0406 11:58:14.230450 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:14 crc kubenswrapper[4790]: E0406 11:58:14.330893 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:14 crc kubenswrapper[4790]: E0406 11:58:14.431876 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:14 crc kubenswrapper[4790]: E0406 11:58:14.532719 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:14 crc kubenswrapper[4790]: E0406 11:58:14.633262 4790 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Apr 06 11:58:14 crc kubenswrapper[4790]: E0406 11:58:14.733950 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:14 crc kubenswrapper[4790]: E0406 11:58:14.834666 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:14 crc kubenswrapper[4790]: E0406 11:58:14.935814 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:15 crc kubenswrapper[4790]: E0406 11:58:15.036760 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:15 crc kubenswrapper[4790]: E0406 11:58:15.137568 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:15 crc kubenswrapper[4790]: E0406 11:58:15.238717 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:15 crc kubenswrapper[4790]: E0406 11:58:15.339103 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:15 crc kubenswrapper[4790]: E0406 11:58:15.440219 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:15 crc kubenswrapper[4790]: E0406 11:58:15.540709 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:15 crc kubenswrapper[4790]: E0406 11:58:15.641483 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:15 crc kubenswrapper[4790]: E0406 11:58:15.742316 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:15 crc kubenswrapper[4790]: E0406 11:58:15.843399 4790 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:15 crc kubenswrapper[4790]: E0406 11:58:15.943730 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:16 crc kubenswrapper[4790]: E0406 11:58:16.043921 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:16 crc kubenswrapper[4790]: E0406 11:58:16.144680 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:16 crc kubenswrapper[4790]: E0406 11:58:16.245777 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:16 crc kubenswrapper[4790]: E0406 11:58:16.346383 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:16 crc kubenswrapper[4790]: E0406 11:58:16.447227 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:16 crc kubenswrapper[4790]: E0406 11:58:16.547971 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:16 crc kubenswrapper[4790]: E0406 11:58:16.649683 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:16 crc kubenswrapper[4790]: E0406 11:58:16.749901 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:16 crc kubenswrapper[4790]: E0406 11:58:16.850848 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:16 crc kubenswrapper[4790]: E0406 11:58:16.951497 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:17 crc 
kubenswrapper[4790]: E0406 11:58:17.051641 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:17 crc kubenswrapper[4790]: E0406 11:58:17.152066 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:17 crc kubenswrapper[4790]: E0406 11:58:17.252860 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:17 crc kubenswrapper[4790]: E0406 11:58:17.353162 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:17 crc kubenswrapper[4790]: E0406 11:58:17.453402 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:17 crc kubenswrapper[4790]: E0406 11:58:17.554325 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:17 crc kubenswrapper[4790]: E0406 11:58:17.655238 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:17 crc kubenswrapper[4790]: E0406 11:58:17.756311 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:17 crc kubenswrapper[4790]: E0406 11:58:17.857244 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:17 crc kubenswrapper[4790]: E0406 11:58:17.958156 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:18 crc kubenswrapper[4790]: E0406 11:58:18.058963 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:18 crc kubenswrapper[4790]: E0406 11:58:18.160121 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:18 crc kubenswrapper[4790]: E0406 11:58:18.260561 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:18 crc kubenswrapper[4790]: E0406 11:58:18.360819 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:18 crc kubenswrapper[4790]: E0406 11:58:18.461921 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:18 crc kubenswrapper[4790]: E0406 11:58:18.563094 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:18 crc kubenswrapper[4790]: E0406 11:58:18.663726 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:18 crc kubenswrapper[4790]: E0406 11:58:18.764004 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:18 crc kubenswrapper[4790]: E0406 11:58:18.864475 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:18 crc kubenswrapper[4790]: E0406 11:58:18.965445 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:19 crc kubenswrapper[4790]: E0406 11:58:19.065968 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:19 crc kubenswrapper[4790]: E0406 11:58:19.166221 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:19 crc kubenswrapper[4790]: E0406 11:58:19.266794 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:19 crc kubenswrapper[4790]: E0406 11:58:19.367624 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:19 crc kubenswrapper[4790]: E0406 11:58:19.468220 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:19 crc kubenswrapper[4790]: E0406 11:58:19.569454 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:19 crc kubenswrapper[4790]: E0406 11:58:19.669717 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:19 crc kubenswrapper[4790]: E0406 11:58:19.770067 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:19 crc kubenswrapper[4790]: E0406 11:58:19.870874 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:19 crc kubenswrapper[4790]: E0406 11:58:19.971334 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:20 crc kubenswrapper[4790]: E0406 11:58:20.072195 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:20 crc kubenswrapper[4790]: E0406 11:58:20.173445 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:20 crc kubenswrapper[4790]: E0406 11:58:20.274244 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:20 crc kubenswrapper[4790]: E0406 11:58:20.375215 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:20 crc kubenswrapper[4790]: E0406 11:58:20.475503 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:20 crc kubenswrapper[4790]: E0406 11:58:20.576994 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:20 crc kubenswrapper[4790]: E0406 11:58:20.677202 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:20 crc kubenswrapper[4790]: E0406 11:58:20.778093 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:20 crc kubenswrapper[4790]: E0406 11:58:20.878424 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:20 crc kubenswrapper[4790]: E0406 11:58:20.979625 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.080355 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.180568 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.280719 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.381918 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.482394 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.582685 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.655596 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.665243 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.665302 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.665315 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.665336 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.665351 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:21Z","lastTransitionTime":"2026-04-06T11:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.675512 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.676780 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.676844 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.676859 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.679542 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"981c1c09-3ef4-430f-bdd7-a6309afbd803\\\",\\\"systemUUID\\\":\\\"81d93d39-022f-479c-bd72-2a6f59eabad5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.689646 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.689865 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.689891 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.689925 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.689952 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:21Z","lastTransitionTime":"2026-04-06T11:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.703097 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"981c1c09-3ef4-430f-bdd7-a6309afbd803\\\",\\\"systemUUID\\\":\\\"81d93d39-022f-479c-bd72-2a6f59eabad5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.710744 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.710797 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.710819 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.710885 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.710909 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:21Z","lastTransitionTime":"2026-04-06T11:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.726082 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"981c1c09-3ef4-430f-bdd7-a6309afbd803\\\",\\\"systemUUID\\\":\\\"81d93d39-022f-479c-bd72-2a6f59eabad5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.734360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.734416 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.734433 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.734465 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:21 crc kubenswrapper[4790]: I0406 11:58:21.734485 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:21Z","lastTransitionTime":"2026-04-06T11:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.748121 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"981c1c09-3ef4-430f-bdd7-a6309afbd803\\\",\\\"systemUUID\\\":\\\"81d93d39-022f-479c-bd72-2a6f59eabad5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.748255 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.748291 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.750253 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.848699 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:21 crc kubenswrapper[4790]: E0406 11:58:21.949950 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:22 crc kubenswrapper[4790]: E0406 11:58:22.054746 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:22 crc kubenswrapper[4790]: E0406 11:58:22.155299 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:22 crc kubenswrapper[4790]: E0406 11:58:22.256092 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:22 crc kubenswrapper[4790]: E0406 11:58:22.356755 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:22 crc kubenswrapper[4790]: E0406 11:58:22.457450 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:22 crc kubenswrapper[4790]: 
I0406 11:58:22.507125 4790 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Apr 06 11:58:22 crc kubenswrapper[4790]: E0406 11:58:22.558685 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:22 crc kubenswrapper[4790]: E0406 11:58:22.659409 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:22 crc kubenswrapper[4790]: E0406 11:58:22.759868 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:22 crc kubenswrapper[4790]: E0406 11:58:22.861068 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:22 crc kubenswrapper[4790]: E0406 11:58:22.961448 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:23 crc kubenswrapper[4790]: E0406 11:58:23.062550 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:23 crc kubenswrapper[4790]: E0406 11:58:23.163051 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:23 crc kubenswrapper[4790]: E0406 11:58:23.264024 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:23 crc kubenswrapper[4790]: E0406 11:58:23.365159 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:23 crc kubenswrapper[4790]: E0406 11:58:23.465488 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:23 crc kubenswrapper[4790]: E0406 11:58:23.565915 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Apr 06 11:58:23 crc kubenswrapper[4790]: E0406 11:58:23.666307 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:23 crc kubenswrapper[4790]: E0406 11:58:23.766940 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:23 crc kubenswrapper[4790]: E0406 11:58:23.867244 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:23 crc kubenswrapper[4790]: I0406 11:58:23.882610 4790 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Apr 06 11:58:23 crc kubenswrapper[4790]: E0406 11:58:23.968343 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:24 crc kubenswrapper[4790]: E0406 11:58:24.069409 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:24 crc kubenswrapper[4790]: E0406 11:58:24.169723 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:24 crc kubenswrapper[4790]: E0406 11:58:24.270744 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:24 crc kubenswrapper[4790]: E0406 11:58:24.371565 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:24 crc kubenswrapper[4790]: E0406 11:58:24.472549 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:24 crc kubenswrapper[4790]: E0406 11:58:24.573287 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:24 crc kubenswrapper[4790]: E0406 11:58:24.674051 4790 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Apr 06 11:58:24 crc kubenswrapper[4790]: I0406 11:58:24.675456 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:58:24 crc kubenswrapper[4790]: I0406 11:58:24.676665 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:24 crc kubenswrapper[4790]: I0406 11:58:24.676731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:24 crc kubenswrapper[4790]: I0406 11:58:24.676761 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:24 crc kubenswrapper[4790]: I0406 11:58:24.677752 4790 scope.go:117] "RemoveContainer" containerID="e6224db2cc7254075caa631d8da7467f0dcf0483a05a6236d99fa0b07232cd4b" Apr 06 11:58:24 crc kubenswrapper[4790]: E0406 11:58:24.678139 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 06 11:58:24 crc kubenswrapper[4790]: E0406 11:58:24.774412 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:24 crc kubenswrapper[4790]: E0406 11:58:24.874545 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:24 crc kubenswrapper[4790]: E0406 11:58:24.974910 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:25 crc kubenswrapper[4790]: E0406 11:58:25.075054 4790 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Apr 06 11:58:25 crc kubenswrapper[4790]: E0406 11:58:25.175596 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:25 crc kubenswrapper[4790]: E0406 11:58:25.276339 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:25 crc kubenswrapper[4790]: E0406 11:58:25.377026 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:25 crc kubenswrapper[4790]: E0406 11:58:25.477608 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:25 crc kubenswrapper[4790]: E0406 11:58:25.578100 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:25 crc kubenswrapper[4790]: E0406 11:58:25.678903 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:25 crc kubenswrapper[4790]: E0406 11:58:25.780095 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:25 crc kubenswrapper[4790]: E0406 11:58:25.882884 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:25 crc kubenswrapper[4790]: E0406 11:58:25.983552 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:26 crc kubenswrapper[4790]: E0406 11:58:26.084886 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:26 crc kubenswrapper[4790]: E0406 11:58:26.185716 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:26 crc kubenswrapper[4790]: E0406 11:58:26.286781 4790 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:26 crc kubenswrapper[4790]: E0406 11:58:26.387853 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:26 crc kubenswrapper[4790]: E0406 11:58:26.488202 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:26 crc kubenswrapper[4790]: E0406 11:58:26.589074 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:26 crc kubenswrapper[4790]: E0406 11:58:26.690020 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:26 crc kubenswrapper[4790]: E0406 11:58:26.791201 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:26 crc kubenswrapper[4790]: E0406 11:58:26.891623 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:26 crc kubenswrapper[4790]: E0406 11:58:26.991745 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:27 crc kubenswrapper[4790]: E0406 11:58:27.092417 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:27 crc kubenswrapper[4790]: E0406 11:58:27.193412 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:27 crc kubenswrapper[4790]: E0406 11:58:27.293786 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:27 crc kubenswrapper[4790]: E0406 11:58:27.393977 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:27 crc 
kubenswrapper[4790]: E0406 11:58:27.494783 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:27 crc kubenswrapper[4790]: E0406 11:58:27.595743 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:27 crc kubenswrapper[4790]: E0406 11:58:27.696951 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:27 crc kubenswrapper[4790]: E0406 11:58:27.797092 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:27 crc kubenswrapper[4790]: E0406 11:58:27.898217 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:27 crc kubenswrapper[4790]: E0406 11:58:27.998358 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:28 crc kubenswrapper[4790]: E0406 11:58:28.098972 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:28 crc kubenswrapper[4790]: E0406 11:58:28.199252 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:28 crc kubenswrapper[4790]: E0406 11:58:28.299975 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:28 crc kubenswrapper[4790]: E0406 11:58:28.400792 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:28 crc kubenswrapper[4790]: E0406 11:58:28.501239 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:28 crc kubenswrapper[4790]: E0406 11:58:28.601963 4790 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Apr 06 11:58:28 crc kubenswrapper[4790]: I0406 11:58:28.674473 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Apr 06 11:58:28 crc kubenswrapper[4790]: I0406 11:58:28.675804 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:28 crc kubenswrapper[4790]: I0406 11:58:28.675845 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:28 crc kubenswrapper[4790]: I0406 11:58:28.675853 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:28 crc kubenswrapper[4790]: E0406 11:58:28.702632 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:28 crc kubenswrapper[4790]: E0406 11:58:28.802801 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:28 crc kubenswrapper[4790]: E0406 11:58:28.904002 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.004505 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.104948 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.205409 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.266048 4790 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.307668 4790 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.307706 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.307717 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.307733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.307745 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:29Z","lastTransitionTime":"2026-04-06T11:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.409591 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.409619 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.409627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.409639 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.409648 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:29Z","lastTransitionTime":"2026-04-06T11:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.512190 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.512236 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.512270 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.512289 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.512302 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:29Z","lastTransitionTime":"2026-04-06T11:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.615219 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.615274 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.615293 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.615317 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.615338 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:29Z","lastTransitionTime":"2026-04-06T11:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.660087 4790 apiserver.go:52] "Watching apiserver" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.667706 4790 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.668120 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.668642 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.668725 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.668913 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.669233 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.669311 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.669329 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.669469 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.669738 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.669868 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.672145 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.672221 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.672316 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.672519 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.672889 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.672976 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.673623 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.674254 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.674801 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.697353 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.715378 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.717622 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.717712 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.717731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.717752 4790 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.717767 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:29Z","lastTransitionTime":"2026-04-06T11:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.719486 4790 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.726957 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.735911 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.744621 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752083 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752131 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752151 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") 
pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752167 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752185 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752200 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752219 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752235 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752252 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752269 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752285 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752300 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752320 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752385 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752406 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752422 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752437 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752454 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752471 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 
11:58:29.752486 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752500 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752517 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752532 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752547 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752562 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752582 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752601 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752615 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752633 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752631 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752639 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752915 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752923 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753080 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753121 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.752649 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753194 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753217 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753217 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753236 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753252 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753266 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753282 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753299 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753318 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753340 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753359 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753364 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753400 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753419 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753474 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753492 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753493 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753510 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753530 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753549 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753567 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753582 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753646 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753663 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753677 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753695 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753711 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753725 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753741 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753756 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753772 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753793 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753813 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753852 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753880 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753902 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.753911 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754016 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754056 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754094 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754126 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754161 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754194 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754201 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754228 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754270 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754291 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754311 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754327 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754345 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754363 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754381 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754400 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754418 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754437 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754456 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754473 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754489 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754503 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754522 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754538 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754553 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754568 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754584 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754599 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754614 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754629 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754644 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754662 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754678 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754693 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754711 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754730 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754747 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754762 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754787 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754802 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754817 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754851 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754872 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754888 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754907 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754922 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754938 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754954 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754972 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754989 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755006 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755022 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755037 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755054 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755070 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755087 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755103 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755119 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755137 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755152 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755167 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755183 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755198 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755214 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755231 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755246 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755262 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755277 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755293 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755333 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755351 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755368 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755386 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755402 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755420 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755449 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755468 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755484 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755502 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755518 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755535 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755552 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755569 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755586 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755602 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755623 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755640 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755656 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755672 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755688 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755705 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755722 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755738 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755758 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755775 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755792 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755811 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755844 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755872 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755895 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755919 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755943 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755966 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755987 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756010 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756034 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756059 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756082 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756104 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756127 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756149 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756172 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756196 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756218 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756319 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756347 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756373 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756398 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756417 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756434 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756451 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756469 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756490 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Apr 06 11:58:29 crc kubenswrapper[4790]:
I0406 11:58:29.756507 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756527 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756546 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756564 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756583 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756601 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") 
pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756629 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756653 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756676 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756701 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756748 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756777 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756804 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756825 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756862 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756889 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756909 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756927 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756931 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756970 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757188 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757265 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757286 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757337 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757434 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757460 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757480 4790 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757500 4790 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757519 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757539 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757560 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757582 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757601 4790 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757621 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757642 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.758740 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.759354 4790 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754236 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754415 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.761729 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754492 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754618 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754626 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754679 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754919 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754925 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754953 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.754803 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.755919 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756018 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756032 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756025 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756053 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756246 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756334 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756372 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756397 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756540 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756556 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756668 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756735 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.756755 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757006 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757480 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757652 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757819 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757869 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.757916 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.758112 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.758180 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.758197 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.760235 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.760734 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.760799 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.760874 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.760969 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.762242 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.761110 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.761538 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.761557 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.761581 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.761673 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.763267 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.763560 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.764101 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:30.264079343 +0000 UTC m=+89.251822279 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.764096 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.764210 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.764388 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.764391 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.764815 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.765031 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.765719 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.766097 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.766221 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.767052 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.767090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.766816 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.768058 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.768084 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.768238 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.768281 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.768315 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.768756 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.768845 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.768889 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:30.268876433 +0000 UTC m=+89.256619299 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.769605 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.769618 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.769847 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.769981 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.770085 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.769418 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.770262 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.770721 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.770945 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.771052 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.771301 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.771331 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.771459 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.771741 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.772009 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.772300 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.772665 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.772750 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.773128 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.773434 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.773662 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.773927 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.774172 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.774329 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.774564 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.774727 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.775135 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.775233 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.775637 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). 
InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.775722 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:30.275548839 +0000 UTC m=+89.263291705 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.776065 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.775404 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.776224 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.776459 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.776681 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.776596 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.776912 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.777163 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.777246 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.777355 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.778039 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.778174 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.778288 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.778497 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.778913 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.778946 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.779165 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.779185 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.779294 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.779381 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.779606 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.779982 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.780013 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.780954 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.787772 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.788167 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.788199 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.788241 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.788434 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.788533 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.788633 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.788770 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:30.288749919 +0000 UTC m=+89.276492795 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.788542 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.789351 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.789602 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.789657 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.789657 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.789923 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.790078 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.790146 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.790314 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.790370 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.790611 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.790632 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.790643 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:29 crc kubenswrapper[4790]: E0406 11:58:29.790689 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:30.290673647 +0000 UTC m=+89.278416513 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.791053 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.791456 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.791378 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.791175 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.791304 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.791315 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.792087 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.793049 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.793445 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.793712 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.793997 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.794229 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.794317 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.797246 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.797249 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.797335 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.797544 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.799916 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.800486 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.800517 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.801245 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.801279 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.801669 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.801286 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.801934 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.802125 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.802773 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.802856 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.802911 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.803046 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.803339 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.803474 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.804161 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.804687 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.805006 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.805164 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.805504 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.805656 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.805707 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.805903 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.807056 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.807184 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.807231 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.807241 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.807313 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.807647 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.807760 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.807958 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.808035 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.808083 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.808115 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.808292 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.808345 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.808729 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.808946 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.819094 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.821321 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.821354 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.821362 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.821374 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.821384 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:29Z","lastTransitionTime":"2026-04-06T11:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.822424 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.830717 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.835625 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.858868 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.858931 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859031 4790 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859044 4790 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859071 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859084 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Apr 06 
11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859077 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859094 4790 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859143 4790 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859152 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859161 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859171 4790 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859180 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 
crc kubenswrapper[4790]: I0406 11:58:29.859188 4790 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859197 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859205 4790 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859214 4790 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859224 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859231 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859239 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859246 4790 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859254 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859257 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859262 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859290 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859323 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859334 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" 
DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859348 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859360 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859370 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859379 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859389 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859399 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859408 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859416 4790 
reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859425 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859435 4790 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859444 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859453 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859461 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859470 4790 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859480 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859488 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859497 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859506 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859514 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859523 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859533 4790 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859542 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859552 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859560 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859569 4790 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859578 4790 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859587 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859595 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859603 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node 
\"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859613 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859621 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859630 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859638 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859647 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859655 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859663 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc 
kubenswrapper[4790]: I0406 11:58:29.859671 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859680 4790 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859688 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859695 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859704 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859712 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859720 4790 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859750 4790 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859760 4790 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859769 4790 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859777 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859785 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859793 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859801 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859810 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859819 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859845 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859853 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859862 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859870 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859877 4790 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859885 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: 
I0406 11:58:29.859892 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859900 4790 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859908 4790 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859917 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859925 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859933 4790 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859941 4790 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859950 4790 reconciler_common.go:293] "Volume detached for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859958 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859967 4790 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859975 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859984 4790 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.859992 4790 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860001 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860010 4790 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860018 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860027 4790 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860058 4790 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860066 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860074 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860082 4790 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860090 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860101 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860108 4790 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860116 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860124 4790 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860133 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860141 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860150 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc 
kubenswrapper[4790]: I0406 11:58:29.860158 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860166 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860174 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860182 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860191 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860199 4790 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860208 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860216 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860224 4790 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860233 4790 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860241 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860250 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860258 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860266 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860276 4790 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc 
kubenswrapper[4790]: I0406 11:58:29.860286 4790 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860294 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860312 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860320 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860328 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860336 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860343 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860352 4790 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860360 4790 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860368 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860377 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860386 4790 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860394 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860401 4790 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860410 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860419 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860426 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860435 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860443 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860450 4790 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860458 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860467 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860475 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860482 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860490 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860498 4790 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860506 4790 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860514 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860522 4790 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on 
node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860529 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860537 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860547 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860555 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860562 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860570 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860577 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: 
I0406 11:58:29.860585 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860592 4790 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860607 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860615 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860624 4790 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860632 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860640 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860648 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860656 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860664 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860672 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860680 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860689 4790 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860697 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860706 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc 
kubenswrapper[4790]: I0406 11:58:29.860714 4790 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860722 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860731 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860739 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860747 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860755 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.860763 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.924175 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.924248 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.924268 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.924294 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.924315 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:29Z","lastTransitionTime":"2026-04-06T11:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:29 crc kubenswrapper[4790]: I0406 11:58:29.996253 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.004513 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.014060 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.026857 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.026903 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.026920 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.026943 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.026961 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:30Z","lastTransitionTime":"2026-04-06T11:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.053585 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1ac0959214bd4e09ff358a6311d8c74ab55c167a6222af3800944f08241bc8cc"} Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.054926 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"77807972bf58f96256e695cf80efc738143bea09ae38fab279b503bf9cef1274"} Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.055989 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5be35e3791cbfb02996024e5035dd3ad7d8a273405bb8178942a32658d49cf74"} Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.129391 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.129769 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.129779 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.129793 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.129802 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:30Z","lastTransitionTime":"2026-04-06T11:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.231663 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.231697 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.231706 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.231720 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.231729 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:30Z","lastTransitionTime":"2026-04-06T11:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.264973 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:30 crc kubenswrapper[4790]: E0406 11:58:30.265134 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:31.265112801 +0000 UTC m=+90.252855677 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.334027 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.334061 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.334070 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.334083 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:30 crc kubenswrapper[4790]: 
I0406 11:58:30.334092 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:30Z","lastTransitionTime":"2026-04-06T11:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.366153 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.366208 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.366226 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.366244 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:30 crc kubenswrapper[4790]: E0406 11:58:30.366396 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 06 11:58:30 crc kubenswrapper[4790]: E0406 11:58:30.366415 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 06 11:58:30 crc kubenswrapper[4790]: E0406 11:58:30.366409 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Apr 06 11:58:30 crc kubenswrapper[4790]: E0406 11:58:30.366457 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 06 11:58:30 crc kubenswrapper[4790]: E0406 11:58:30.366508 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:31.366490941 +0000 UTC m=+90.354233807 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Apr 06 11:58:30 crc kubenswrapper[4790]: E0406 11:58:30.366425 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:30 crc kubenswrapper[4790]: E0406 11:58:30.366877 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 06 11:58:30 crc kubenswrapper[4790]: E0406 11:58:30.366902 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 06 11:58:30 crc kubenswrapper[4790]: E0406 11:58:30.366924 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:30 crc kubenswrapper[4790]: E0406 11:58:30.367578 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:31.366573123 +0000 UTC m=+90.354316009 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 06 11:58:30 crc kubenswrapper[4790]: E0406 11:58:30.367652 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:31.36763692 +0000 UTC m=+90.355379876 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:30 crc kubenswrapper[4790]: E0406 11:58:30.367677 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:31.367668281 +0000 UTC m=+90.355411247 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.437854 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.437923 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.437940 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.437962 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.437977 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:30Z","lastTransitionTime":"2026-04-06T11:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.545474 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.545514 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.545524 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.545539 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.545550 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:30Z","lastTransitionTime":"2026-04-06T11:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.648658 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.648702 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.648716 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.648733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.648745 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:30Z","lastTransitionTime":"2026-04-06T11:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.751760 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.751806 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.751817 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.751865 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.751883 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:30Z","lastTransitionTime":"2026-04-06T11:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.854185 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.854238 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.854247 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.854264 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.854274 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:30Z","lastTransitionTime":"2026-04-06T11:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.957214 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.957281 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.957299 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.957322 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:30 crc kubenswrapper[4790]: I0406 11:58:30.957338 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:30Z","lastTransitionTime":"2026-04-06T11:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.059854 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.059897 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.059908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.059932 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.059947 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:31Z","lastTransitionTime":"2026-04-06T11:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.061474 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d040798c407b93513cb9c8523c47cfc037d0e4c6537cf799aa52fac2c2af6df6"} Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.061520 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0eec965e4e27816916638cfde28bdf81bf4d7336eef8d3c29736480eac8bf97b"} Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.062781 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"43a95c77926cab3be2c438d60d35e9e0d65ca564c72ca3f04a44860058d3548a"} Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.086887 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.100615 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.118153 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d040798c407b93513cb9c8523c47cfc037d0e4c6537cf799aa52fac2c2af6df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-06T11:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eec965e4e27816916638cfde28bdf81bf4d7336eef8d3c29736480eac8bf97b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-06T11:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.135275 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.148729 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.161773 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.161812 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.161823 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.161861 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.161872 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:31Z","lastTransitionTime":"2026-04-06T11:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.162318 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.176226 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.189267 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.201348 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.214053 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a95c77926cab3be2c438d60d35e9e0d65ca564c72ca3f04a44860058d3548a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-06T11:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.226046 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d040798c407b93513cb9c8523c47cfc037d0e4c6537cf799aa52fac2c2af6df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-06T11:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://0eec965e4e27816916638cfde28bdf81bf4d7336eef8d3c29736480eac8bf97b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-06T11:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.239479 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.264323 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.264359 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.264366 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.264380 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.264388 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:31Z","lastTransitionTime":"2026-04-06T11:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.274689 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.274878 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:33.274856367 +0000 UTC m=+92.262599243 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.367076 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.367111 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.367123 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.367142 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.367154 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:31Z","lastTransitionTime":"2026-04-06T11:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.376061 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.376116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.376136 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.376156 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.376258 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.376302 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:33.376288849 +0000 UTC m=+92.364031715 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.376644 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.376664 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.376674 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.376699 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-04-06 11:58:33.376692309 +0000 UTC m=+92.364435175 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.376737 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.376759 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:33.376754401 +0000 UTC m=+92.364497267 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.376794 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.376802 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.376808 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.376845 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:33.376820112 +0000 UTC m=+92.364562978 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.469729 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.469770 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.469785 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.469805 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.469818 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:31Z","lastTransitionTime":"2026-04-06T11:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.572186 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.572228 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.572239 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.572255 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.572269 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:31Z","lastTransitionTime":"2026-04-06T11:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.674602 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.674690 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.674708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.674728 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.674741 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.674787 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.674745 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:31Z","lastTransitionTime":"2026-04-06T11:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.674863 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.675046 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.675113 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:31 crc kubenswrapper[4790]: E0406 11:58:31.675193 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.680596 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.681398 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.682527 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.683321 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.684491 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.685178 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.685818 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.686843 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.687480 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.688577 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.688888 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.689069 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.691124 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.691581 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.692093 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.692990 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.693476 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.694356 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.694755 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.695324 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.696273 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.696697 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.697725 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.698148 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.699166 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.699549 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.700132 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.701178 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.701631 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.702700 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.703153 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.703291 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.704124 4790 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.704227 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.706637 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Apr 06 
11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.707623 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.708025 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.709644 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.710271 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.711153 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.711763 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.712775 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.713355 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Apr 06 
11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.714750 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.715535 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.716619 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.716951 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.717160 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.718055 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.718549 4790 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.719967 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.720436 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.721253 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.721692 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.722620 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.723391 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.723864 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.730074 4790 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a95c77926cab3be2c438d60d35e9e0d65ca564c72ca3f04a44860058d3548a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-06T11:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.741968 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d040798c407b93513cb9c8523c47cfc037d0e4c6537cf799aa52fac2c2af6df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-06T11:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eec965e4e27816916638cfde28bdf81bf4d7336eef8d3c29736480eac8bf97b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-04-06T11:58:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.753544 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:31Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.778206 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.778244 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.778254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.778272 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.778283 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:31Z","lastTransitionTime":"2026-04-06T11:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.880980 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.881027 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.881040 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.881060 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.881075 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:31Z","lastTransitionTime":"2026-04-06T11:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.983457 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.983520 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.983535 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.983554 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:31 crc kubenswrapper[4790]: I0406 11:58:31.983567 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:31Z","lastTransitionTime":"2026-04-06T11:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.085396 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.085445 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.085463 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.085482 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.085498 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:32Z","lastTransitionTime":"2026-04-06T11:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.093037 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.093085 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.093101 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.093117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.093128 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:32Z","lastTransitionTime":"2026-04-06T11:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:32 crc kubenswrapper[4790]: E0406 11:58:32.106025 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"981c1c09-3ef4-430f-bdd7-a6309afbd803\\\",\\\"systemUUID\\\":\\\"81d93d39-022f-479c-bd72-2a6f59eabad5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:32Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.109743 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.109795 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.109807 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.109853 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.109871 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:32Z","lastTransitionTime":"2026-04-06T11:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:32 crc kubenswrapper[4790]: E0406 11:58:32.123486 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"981c1c09-3ef4-430f-bdd7-a6309afbd803\\\",\\\"systemUUID\\\":\\\"81d93d39-022f-479c-bd72-2a6f59eabad5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:32Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.127350 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.127412 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.127428 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.127447 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.127460 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:32Z","lastTransitionTime":"2026-04-06T11:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:32 crc kubenswrapper[4790]: E0406 11:58:32.142466 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"981c1c09-3ef4-430f-bdd7-a6309afbd803\\\",\\\"systemUUID\\\":\\\"81d93d39-022f-479c-bd72-2a6f59eabad5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:32Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.146773 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.146884 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.146918 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.146941 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.146960 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:32Z","lastTransitionTime":"2026-04-06T11:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:32 crc kubenswrapper[4790]: E0406 11:58:32.165536 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"981c1c09-3ef4-430f-bdd7-a6309afbd803\\\",\\\"systemUUID\\\":\\\"81d93d39-022f-479c-bd72-2a6f59eabad5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:32Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.170193 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.170264 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.170292 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.170323 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.170345 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:32Z","lastTransitionTime":"2026-04-06T11:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:32 crc kubenswrapper[4790]: E0406 11:58:32.186444 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-04-06T11:58:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"981c1c09-3ef4-430f-bdd7-a6309afbd803\\\",\\\"systemUUID\\\":\\\"81d93d39-022f-479c-bd72-2a6f59eabad5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-04-06T11:58:32Z is after 2025-08-24T17:21:41Z" Apr 06 11:58:32 crc kubenswrapper[4790]: E0406 11:58:32.186574 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.188230 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.188250 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.188258 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.188271 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.188280 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:32Z","lastTransitionTime":"2026-04-06T11:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.290852 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.290886 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.290894 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.290907 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.290916 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:32Z","lastTransitionTime":"2026-04-06T11:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.393107 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.393156 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.393170 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.393185 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.393199 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:32Z","lastTransitionTime":"2026-04-06T11:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.495879 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.495958 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.495975 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.495999 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.496030 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:32Z","lastTransitionTime":"2026-04-06T11:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.598578 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.598622 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.598634 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.598650 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.598659 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:32Z","lastTransitionTime":"2026-04-06T11:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.701236 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.701286 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.701299 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.701315 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.701328 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:32Z","lastTransitionTime":"2026-04-06T11:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.804237 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.804276 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.804288 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.804305 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.804322 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:32Z","lastTransitionTime":"2026-04-06T11:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.907451 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.907504 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.907519 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.907537 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:32 crc kubenswrapper[4790]: I0406 11:58:32.907549 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:32Z","lastTransitionTime":"2026-04-06T11:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.009504 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.009552 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.009565 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.009582 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.009593 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:33Z","lastTransitionTime":"2026-04-06T11:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.069756 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5186118447aba3ec36131bef86e9cf87f8a5176b63d345354449c8f5a02e3904"} Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.111925 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.111962 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.111972 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.111988 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.111997 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:33Z","lastTransitionTime":"2026-04-06T11:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.214343 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.214397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.214408 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.214420 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.214431 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:33Z","lastTransitionTime":"2026-04-06T11:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.291559 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.291781 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-04-06 11:58:37.291740916 +0000 UTC m=+96.279483782 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.317340 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.317374 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.317384 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.317398 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.317407 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:33Z","lastTransitionTime":"2026-04-06T11:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.392849 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.392895 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.392918 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.392940 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.393045 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.393059 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.393069 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.393086 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.393127 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:37.393110817 +0000 UTC m=+96.380853683 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.393208 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.393243 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:37.39322338 +0000 UTC m=+96.380966246 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.393206 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.393313 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-04-06 11:58:37.393294342 +0000 UTC m=+96.381037278 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.393249 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.393340 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.393376 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:37.393367803 +0000 UTC m=+96.381110869 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.420072 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.420122 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.420134 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.420153 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.420166 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:33Z","lastTransitionTime":"2026-04-06T11:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.522083 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.522127 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.522139 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.522155 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.522167 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:33Z","lastTransitionTime":"2026-04-06T11:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.624871 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.624918 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.624933 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.624949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.624964 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:33Z","lastTransitionTime":"2026-04-06T11:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.674654 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.674682 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.674780 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.674796 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.674892 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 06 11:58:33 crc kubenswrapper[4790]: E0406 11:58:33.674946 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.727477 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.727525 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.727538 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.727559 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.727570 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:33Z","lastTransitionTime":"2026-04-06T11:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.829476 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.829514 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.829526 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.829541 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.829552 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:33Z","lastTransitionTime":"2026-04-06T11:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.932288 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.932321 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.932329 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.932343 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:33 crc kubenswrapper[4790]: I0406 11:58:33.932352 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:33Z","lastTransitionTime":"2026-04-06T11:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.034086 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.034114 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.034122 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.034138 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.034148 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:34Z","lastTransitionTime":"2026-04-06T11:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.136472 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.136519 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.136531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.136551 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.136562 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:34Z","lastTransitionTime":"2026-04-06T11:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.238745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.238787 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.238799 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.238814 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.238854 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:34Z","lastTransitionTime":"2026-04-06T11:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.341388 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.341441 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.341451 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.341466 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.341480 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:34Z","lastTransitionTime":"2026-04-06T11:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.444204 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.444238 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.444246 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.444263 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.444275 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:34Z","lastTransitionTime":"2026-04-06T11:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.546376 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.546449 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.546462 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.546479 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.546492 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:34Z","lastTransitionTime":"2026-04-06T11:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.649328 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.649369 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.649378 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.649391 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.649400 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:34Z","lastTransitionTime":"2026-04-06T11:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.751965 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.752050 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.752064 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.752081 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.752092 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:34Z","lastTransitionTime":"2026-04-06T11:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.854497 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.854551 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.854566 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.854587 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.854604 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:34Z","lastTransitionTime":"2026-04-06T11:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.957055 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.957094 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.957104 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.957120 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:34 crc kubenswrapper[4790]: I0406 11:58:34.957132 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:34Z","lastTransitionTime":"2026-04-06T11:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.059627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.059672 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.059682 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.059697 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.059708 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:35Z","lastTransitionTime":"2026-04-06T11:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.162099 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.162155 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.162165 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.162178 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.162188 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:35Z","lastTransitionTime":"2026-04-06T11:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.264977 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.265020 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.265031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.265046 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.265061 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:35Z","lastTransitionTime":"2026-04-06T11:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.367515 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.367551 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.367559 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.367580 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.367598 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:35Z","lastTransitionTime":"2026-04-06T11:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.469458 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.469505 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.469580 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.469627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.469640 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:35Z","lastTransitionTime":"2026-04-06T11:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.572542 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.572608 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.572619 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.572658 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.572669 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:35Z","lastTransitionTime":"2026-04-06T11:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.674366 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.674387 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.674541 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.674568 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.674596 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.674604 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.674619 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.674652 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:35Z","lastTransitionTime":"2026-04-06T11:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:35 crc kubenswrapper[4790]: E0406 11:58:35.674662 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 06 11:58:35 crc kubenswrapper[4790]: E0406 11:58:35.674750 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 06 11:58:35 crc kubenswrapper[4790]: E0406 11:58:35.674890 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.689850 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.690122 4790 scope.go:117] "RemoveContainer" containerID="e6224db2cc7254075caa631d8da7467f0dcf0483a05a6236d99fa0b07232cd4b" Apr 06 11:58:35 crc kubenswrapper[4790]: E0406 11:58:35.690445 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.777701 4790 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.777777 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.777800 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.777866 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.777884 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:35Z","lastTransitionTime":"2026-04-06T11:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.880313 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.880353 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.880380 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.880397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.880407 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:35Z","lastTransitionTime":"2026-04-06T11:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.982621 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.982655 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.982663 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.982680 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:35 crc kubenswrapper[4790]: I0406 11:58:35.982689 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:35Z","lastTransitionTime":"2026-04-06T11:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:36 crc kubenswrapper[4790]: I0406 11:58:36.077122 4790 scope.go:117] "RemoveContainer" containerID="e6224db2cc7254075caa631d8da7467f0dcf0483a05a6236d99fa0b07232cd4b" Apr 06 11:58:36 crc kubenswrapper[4790]: E0406 11:58:36.077297 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Apr 06 11:58:36 crc kubenswrapper[4790]: I0406 11:58:36.084760 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:36 crc kubenswrapper[4790]: I0406 11:58:36.084809 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:36 crc kubenswrapper[4790]: I0406 11:58:36.084845 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:36 crc kubenswrapper[4790]: I0406 11:58:36.084868 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:36 crc kubenswrapper[4790]: I0406 11:58:36.084882 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:36Z","lastTransitionTime":"2026-04-06T11:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.314639 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.314695 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.314706 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.314727 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.314739 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:37Z","lastTransitionTime":"2026-04-06T11:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.329279 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.329500 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-04-06 11:58:45.329471633 +0000 UTC m=+104.317214519 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.417460 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.417507 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.417521 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.417541 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.417552 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:37Z","lastTransitionTime":"2026-04-06T11:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.430455 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.430525 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.430571 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.430632 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.430651 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.430707 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.430744 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.430748 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.430800 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.430758 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.430728 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:45.430707421 +0000 UTC m=+104.418450307 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.430858 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.431037 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:45.431015188 +0000 UTC m=+104.418758064 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.431038 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.431063 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-04-06 11:58:45.431054759 +0000 UTC m=+104.418797635 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.431087 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:45.43107368 +0000 UTC m=+104.418816556 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.519887 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.519939 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.519953 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.519971 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 
11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.519981 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:37Z","lastTransitionTime":"2026-04-06T11:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.621938 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.621997 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.622016 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.622038 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.622056 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:37Z","lastTransitionTime":"2026-04-06T11:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.674546 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.674683 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.674712 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.674742 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.674788 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 06 11:58:37 crc kubenswrapper[4790]: E0406 11:58:37.674924 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.724791 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.724868 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.724877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.724894 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:37 crc kubenswrapper[4790]: I0406 11:58:37.724905 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:37Z","lastTransitionTime":"2026-04-06T11:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.338390 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.338421 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.338428 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.338441 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.338451 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:38Z","lastTransitionTime":"2026-04-06T11:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.440394 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.440435 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.440447 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.440463 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.440473 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:38Z","lastTransitionTime":"2026-04-06T11:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.543060 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.543116 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.543127 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.543162 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.543174 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:38Z","lastTransitionTime":"2026-04-06T11:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.645650 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.645692 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.645711 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.645729 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.645743 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:38Z","lastTransitionTime":"2026-04-06T11:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.747729 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.747757 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.747766 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.747778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.747787 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:38Z","lastTransitionTime":"2026-04-06T11:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.849977 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.850013 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.850021 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.850038 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.850048 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:38Z","lastTransitionTime":"2026-04-06T11:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.951950 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.951982 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.951991 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.952007 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:38 crc kubenswrapper[4790]: I0406 11:58:38.952020 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:38Z","lastTransitionTime":"2026-04-06T11:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.054993 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.055032 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.055041 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.055054 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.055064 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:39Z","lastTransitionTime":"2026-04-06T11:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.157684 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.157713 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.157721 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.157733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.157742 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:39Z","lastTransitionTime":"2026-04-06T11:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.260482 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.260535 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.260547 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.260563 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.260575 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:39Z","lastTransitionTime":"2026-04-06T11:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.362880 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.362908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.362916 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.362927 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.362936 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:39Z","lastTransitionTime":"2026-04-06T11:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.383890 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-fxqdg"] Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.384615 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-fxqdg" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.388617 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.390064 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.390208 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.403215 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-dskdf"] Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.403617 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.407520 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-9p96t"] Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.408078 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-764h8"] Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.408878 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.409023 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.412424 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.412420 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.412700 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.412745 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.412752 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.412912 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.412939 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.413081 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.414758 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.415056 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.415077 4790 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.416770 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.422768 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5c5h9"] Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.423900 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.430201 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.430265 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.430351 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.430393 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.430561 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.430731 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.430756 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.447504 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4vlf\" (UniqueName: \"kubernetes.io/projected/f2355ff5-ea31-472c-9ad9-b4f9f108849a-kube-api-access-s4vlf\") pod \"node-resolver-fxqdg\" (UID: \"f2355ff5-ea31-472c-9ad9-b4f9f108849a\") " pod="openshift-dns/node-resolver-fxqdg" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.447552 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2355ff5-ea31-472c-9ad9-b4f9f108849a-hosts-file\") pod \"node-resolver-fxqdg\" (UID: \"f2355ff5-ea31-472c-9ad9-b4f9f108849a\") " pod="openshift-dns/node-resolver-fxqdg" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.470730 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.470769 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.470783 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.470804 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.470817 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:39Z","lastTransitionTime":"2026-04-06T11:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548623 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-node-log\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548670 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-cni-bin\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548690 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-var-lib-openvswitch\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548719 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/37199c14-02b6-4c25-be2a-674701cf0382-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548738 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-cnibin\") pod \"multus-dskdf\" (UID: 
\"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548756 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-ovnkube-config\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548778 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9f5e33f8-0490-4219-8c40-526903de8e6f-mcd-auth-proxy-config\") pod \"machine-config-daemon-9p96t\" (UID: \"9f5e33f8-0490-4219-8c40-526903de8e6f\") " pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548799 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-multus-cni-dir\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548820 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-multus-conf-dir\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548866 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-run-multus-certs\") pod 
\"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548888 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-systemd-units\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548911 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2355ff5-ea31-472c-9ad9-b4f9f108849a-hosts-file\") pod \"node-resolver-fxqdg\" (UID: \"f2355ff5-ea31-472c-9ad9-b4f9f108849a\") " pod="openshift-dns/node-resolver-fxqdg" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548934 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37199c14-02b6-4c25-be2a-674701cf0382-cnibin\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548955 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-var-lib-kubelet\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.548985 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-multus-daemon-config\") pod \"multus-dskdf\" (UID: 
\"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549006 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-systemd\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549020 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2355ff5-ea31-472c-9ad9-b4f9f108849a-hosts-file\") pod \"node-resolver-fxqdg\" (UID: \"f2355ff5-ea31-472c-9ad9-b4f9f108849a\") " pod="openshift-dns/node-resolver-fxqdg" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549027 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f5e33f8-0490-4219-8c40-526903de8e6f-proxy-tls\") pod \"machine-config-daemon-9p96t\" (UID: \"9f5e33f8-0490-4219-8c40-526903de8e6f\") " pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549085 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-multus-socket-dir-parent\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549114 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-run-netns\") pod \"multus-dskdf\" (UID: 
\"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549160 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-var-lib-cni-multus\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549242 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-964ck\" (UniqueName: \"kubernetes.io/projected/9f5e33f8-0490-4219-8c40-526903de8e6f-kube-api-access-964ck\") pod \"machine-config-daemon-9p96t\" (UID: \"9f5e33f8-0490-4219-8c40-526903de8e6f\") " pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549298 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-etc-openvswitch\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549337 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-ovnkube-script-lib\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549360 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-etc-kubernetes\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549375 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20b621ff-e3f4-40ce-9c77-2292304e36af-ovn-node-metrics-cert\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549392 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-run-ovn-kubernetes\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549405 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37199c14-02b6-4c25-be2a-674701cf0382-cni-binary-copy\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549438 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-cni-binary-copy\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549466 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-run-k8s-cni-cncf-io\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549489 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37199c14-02b6-4c25-be2a-674701cf0382-system-cni-dir\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549504 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgpxf\" (UniqueName: \"kubernetes.io/projected/37199c14-02b6-4c25-be2a-674701cf0382-kube-api-access-bgpxf\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549522 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p78n6\" (UniqueName: \"kubernetes.io/projected/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-kube-api-access-p78n6\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549541 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-slash\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549571 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4vlf\" (UniqueName: \"kubernetes.io/projected/f2355ff5-ea31-472c-9ad9-b4f9f108849a-kube-api-access-s4vlf\") pod \"node-resolver-fxqdg\" (UID: \"f2355ff5-ea31-472c-9ad9-b4f9f108849a\") " pod="openshift-dns/node-resolver-fxqdg" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549586 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-ovn\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549600 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-log-socket\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549614 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-env-overrides\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549640 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-run-netns\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549663 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5cd\" (UniqueName: \"kubernetes.io/projected/20b621ff-e3f4-40ce-9c77-2292304e36af-kube-api-access-rk5cd\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549679 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37199c14-02b6-4c25-be2a-674701cf0382-os-release\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549696 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-system-cni-dir\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549711 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-openvswitch\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549732 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-os-release\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549747 
4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9f5e33f8-0490-4219-8c40-526903de8e6f-rootfs\") pod \"machine-config-daemon-9p96t\" (UID: \"9f5e33f8-0490-4219-8c40-526903de8e6f\") " pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549760 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-var-lib-cni-bin\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-hostroot\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549806 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-kubelet\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549853 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37199c14-02b6-4c25-be2a-674701cf0382-tuning-conf-dir\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 
11:58:39.549885 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-cni-netd\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.549908 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.568873 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4vlf\" (UniqueName: \"kubernetes.io/projected/f2355ff5-ea31-472c-9ad9-b4f9f108849a-kube-api-access-s4vlf\") pod \"node-resolver-fxqdg\" (UID: \"f2355ff5-ea31-472c-9ad9-b4f9f108849a\") " pod="openshift-dns/node-resolver-fxqdg" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.573648 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.574112 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.574123 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.574136 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.574146 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:39Z","lastTransitionTime":"2026-04-06T11:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.608698 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-sq4sm"] Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.609005 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sq4sm" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.610455 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.610504 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.610589 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.610909 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.650875 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-run-k8s-cni-cncf-io\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.650915 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-run-ovn-kubernetes\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.650936 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37199c14-02b6-4c25-be2a-674701cf0382-cni-binary-copy\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.650957 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-cni-binary-copy\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.650981 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37199c14-02b6-4c25-be2a-674701cf0382-system-cni-dir\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651006 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgpxf\" (UniqueName: \"kubernetes.io/projected/37199c14-02b6-4c25-be2a-674701cf0382-kube-api-access-bgpxf\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651011 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-run-ovn-kubernetes\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651026 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p78n6\" (UniqueName: \"kubernetes.io/projected/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-kube-api-access-p78n6\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651046 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-slash\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651059 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37199c14-02b6-4c25-be2a-674701cf0382-system-cni-dir\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651073 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-run-k8s-cni-cncf-io\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651100 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-ovn\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651115 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-slash\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-ovn\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651168 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-log-socket\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651194 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-env-overrides\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651218 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5cd\" (UniqueName: \"kubernetes.io/projected/20b621ff-e3f4-40ce-9c77-2292304e36af-kube-api-access-rk5cd\") pod \"ovnkube-node-5c5h9\" 
(UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651257 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-run-netns\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651280 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37199c14-02b6-4c25-be2a-674701cf0382-os-release\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651302 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-system-cni-dir\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651323 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-os-release\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651337 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-run-netns\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 
crc kubenswrapper[4790]: I0406 11:58:39.651345 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-openvswitch\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651378 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-openvswitch\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651409 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37199c14-02b6-4c25-be2a-674701cf0382-os-release\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651415 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37199c14-02b6-4c25-be2a-674701cf0382-tuning-conf-dir\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651439 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9f5e33f8-0490-4219-8c40-526903de8e6f-rootfs\") pod \"machine-config-daemon-9p96t\" (UID: \"9f5e33f8-0490-4219-8c40-526903de8e6f\") " pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 
11:58:39.651885 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-var-lib-cni-bin\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651911 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-log-socket\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651930 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-hostroot\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651966 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-hostroot\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651882 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9f5e33f8-0490-4219-8c40-526903de8e6f-rootfs\") pod \"machine-config-daemon-9p96t\" (UID: \"9f5e33f8-0490-4219-8c40-526903de8e6f\") " pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651978 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-var-lib-cni-bin\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652023 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-system-cni-dir\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.651994 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-os-release\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652071 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-kubelet\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652162 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37199c14-02b6-4c25-be2a-674701cf0382-cni-binary-copy\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652198 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-kubelet\") pod \"ovnkube-node-5c5h9\" (UID: 
\"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652250 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-cni-binary-copy\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652255 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-cni-netd\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652278 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-cni-netd\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652252 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-env-overrides\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652318 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652350 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-cni-bin\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652405 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-node-log\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652408 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652436 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-cnibin\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652450 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-cni-bin\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 
11:58:39.652461 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-var-lib-openvswitch\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652478 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-node-log\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652499 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/37199c14-02b6-4c25-be2a-674701cf0382-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652530 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-cnibin\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652537 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-run-multus-certs\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652563 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-var-lib-openvswitch\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652568 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-ovnkube-config\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652628 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9f5e33f8-0490-4219-8c40-526903de8e6f-mcd-auth-proxy-config\") pod \"machine-config-daemon-9p96t\" (UID: \"9f5e33f8-0490-4219-8c40-526903de8e6f\") " pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652663 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-multus-cni-dir\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652692 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-multus-conf-dir\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652725 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-systemd-units\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652748 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-var-lib-kubelet\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652780 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37199c14-02b6-4c25-be2a-674701cf0382-cnibin\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652814 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-systemd\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652878 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-multus-daemon-config\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652917 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f5e33f8-0490-4219-8c40-526903de8e6f-proxy-tls\") pod 
\"machine-config-daemon-9p96t\" (UID: \"9f5e33f8-0490-4219-8c40-526903de8e6f\") " pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652949 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-multus-socket-dir-parent\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.652982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-run-netns\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.653006 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-var-lib-cni-multus\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.653042 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-964ck\" (UniqueName: \"kubernetes.io/projected/9f5e33f8-0490-4219-8c40-526903de8e6f-kube-api-access-964ck\") pod \"machine-config-daemon-9p96t\" (UID: \"9f5e33f8-0490-4219-8c40-526903de8e6f\") " pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.653074 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-etc-openvswitch\") pod \"ovnkube-node-5c5h9\" (UID: 
\"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.653105 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-ovnkube-script-lib\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.653128 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/37199c14-02b6-4c25-be2a-674701cf0382-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.653134 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-etc-kubernetes\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.653165 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-etc-kubernetes\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.653173 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20b621ff-e3f4-40ce-9c77-2292304e36af-ovn-node-metrics-cert\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.653205 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-run-multus-certs\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.653725 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-ovnkube-config\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.654333 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9f5e33f8-0490-4219-8c40-526903de8e6f-mcd-auth-proxy-config\") pod \"machine-config-daemon-9p96t\" (UID: \"9f5e33f8-0490-4219-8c40-526903de8e6f\") " pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.654553 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-multus-cni-dir\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.654570 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-var-lib-kubelet\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.654575 
4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-multus-conf-dir\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.654600 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-var-lib-cni-multus\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.654622 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37199c14-02b6-4c25-be2a-674701cf0382-cnibin\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.654606 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37199c14-02b6-4c25-be2a-674701cf0382-tuning-conf-dir\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.654609 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-etc-openvswitch\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.654638 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-systemd-units\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.654644 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-systemd\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.654690 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-host-run-netns\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.654699 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-multus-socket-dir-parent\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.655229 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-ovnkube-script-lib\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.655271 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-multus-daemon-config\") pod \"multus-dskdf\" (UID: 
\"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.659210 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20b621ff-e3f4-40ce-9c77-2292304e36af-ovn-node-metrics-cert\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.661114 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f5e33f8-0490-4219-8c40-526903de8e6f-proxy-tls\") pod \"machine-config-daemon-9p96t\" (UID: \"9f5e33f8-0490-4219-8c40-526903de8e6f\") " pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.671165 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5cd\" (UniqueName: \"kubernetes.io/projected/20b621ff-e3f4-40ce-9c77-2292304e36af-kube-api-access-rk5cd\") pod \"ovnkube-node-5c5h9\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.671983 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p78n6\" (UniqueName: \"kubernetes.io/projected/d912ce2d-76e2-4f0a-ae77-91adf71ddfc0-kube-api-access-p78n6\") pod \"multus-dskdf\" (UID: \"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0\") " pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.674632 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.674678 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:39 crc kubenswrapper[4790]: E0406 11:58:39.674801 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.674889 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:39 crc kubenswrapper[4790]: E0406 11:58:39.675050 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 06 11:58:39 crc kubenswrapper[4790]: E0406 11:58:39.674949 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.674901 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgpxf\" (UniqueName: \"kubernetes.io/projected/37199c14-02b6-4c25-be2a-674701cf0382-kube-api-access-bgpxf\") pod \"multus-additional-cni-plugins-764h8\" (UID: \"37199c14-02b6-4c25-be2a-674701cf0382\") " pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.675252 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-964ck\" (UniqueName: \"kubernetes.io/projected/9f5e33f8-0490-4219-8c40-526903de8e6f-kube-api-access-964ck\") pod \"machine-config-daemon-9p96t\" (UID: \"9f5e33f8-0490-4219-8c40-526903de8e6f\") " pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.676890 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.676994 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.677293 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.677383 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.677735 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:39Z","lastTransitionTime":"2026-04-06T11:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.706676 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fxqdg" Apr 06 11:58:39 crc kubenswrapper[4790]: W0406 11:58:39.721474 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2355ff5_ea31_472c_9ad9_b4f9f108849a.slice/crio-06fc0654be72fd66c9cb8d8f45fc1d7998420add7e71bc2f2050b33a0c6a57ef WatchSource:0}: Error finding container 06fc0654be72fd66c9cb8d8f45fc1d7998420add7e71bc2f2050b33a0c6a57ef: Status 404 returned error can't find the container with id 06fc0654be72fd66c9cb8d8f45fc1d7998420add7e71bc2f2050b33a0c6a57ef Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.730986 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dskdf" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.742277 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-764h8" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.742938 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt"] Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.743317 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.745840 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.746200 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Apr 06 11:58:39 crc kubenswrapper[4790]: W0406 11:58:39.747070 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd912ce2d_76e2_4f0a_ae77_91adf71ddfc0.slice/crio-284821fac84cd0f0d3c01389ec426dfacb8b8c8eb236c6f14f24c48d6a91adf4 WatchSource:0}: Error finding container 284821fac84cd0f0d3c01389ec426dfacb8b8c8eb236c6f14f24c48d6a91adf4: Status 404 returned error can't find the container with id 284821fac84cd0f0d3c01389ec426dfacb8b8c8eb236c6f14f24c48d6a91adf4 Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.752455 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.754286 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7vw6\" (UniqueName: \"kubernetes.io/projected/542f030d-a49b-4885-be73-8274bed222f1-kube-api-access-l7vw6\") pod \"node-ca-sq4sm\" (UID: \"542f030d-a49b-4885-be73-8274bed222f1\") " pod="openshift-image-registry/node-ca-sq4sm" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.754327 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/542f030d-a49b-4885-be73-8274bed222f1-host\") pod \"node-ca-sq4sm\" (UID: \"542f030d-a49b-4885-be73-8274bed222f1\") " pod="openshift-image-registry/node-ca-sq4sm" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.754386 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/542f030d-a49b-4885-be73-8274bed222f1-serviceca\") pod \"node-ca-sq4sm\" (UID: \"542f030d-a49b-4885-be73-8274bed222f1\") " pod="openshift-image-registry/node-ca-sq4sm" Apr 06 11:58:39 crc kubenswrapper[4790]: W0406 11:58:39.755152 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37199c14_02b6_4c25_be2a_674701cf0382.slice/crio-159dc71e8105b49f7bce8979e3739c9ad671bd5fbf400ce497239780e1f5d5e5 WatchSource:0}: Error finding container 159dc71e8105b49f7bce8979e3739c9ad671bd5fbf400ce497239780e1f5d5e5: Status 404 returned error can't find the container with id 159dc71e8105b49f7bce8979e3739c9ad671bd5fbf400ce497239780e1f5d5e5 Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.759389 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.763399 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-qkf8s"] Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.763747 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:39 crc kubenswrapper[4790]: E0406 11:58:39.763796 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkf8s" podUID="934f4d5f-3670-40da-b496-8b9f9f25fc0b" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.781323 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.781365 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.781376 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.781392 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.781403 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:39Z","lastTransitionTime":"2026-04-06T11:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:39 crc kubenswrapper[4790]: W0406 11:58:39.782195 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f5e33f8_0490_4219_8c40_526903de8e6f.slice/crio-63a14e100a80279cf0d775be03245235b94a6a50829d588c4ef677f6ec0066b7 WatchSource:0}: Error finding container 63a14e100a80279cf0d775be03245235b94a6a50829d588c4ef677f6ec0066b7: Status 404 returned error can't find the container with id 63a14e100a80279cf0d775be03245235b94a6a50829d588c4ef677f6ec0066b7 Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.855179 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc9a262d-87e9-4d9b-8064-489483debe3f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-98vqt\" (UID: \"bc9a262d-87e9-4d9b-8064-489483debe3f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.855247 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bc9a262d-87e9-4d9b-8064-489483debe3f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-98vqt\" (UID: \"bc9a262d-87e9-4d9b-8064-489483debe3f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.855300 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/542f030d-a49b-4885-be73-8274bed222f1-serviceca\") pod \"node-ca-sq4sm\" (UID: \"542f030d-a49b-4885-be73-8274bed222f1\") " pod="openshift-image-registry/node-ca-sq4sm" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.855327 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg2dd\" (UniqueName: \"kubernetes.io/projected/bc9a262d-87e9-4d9b-8064-489483debe3f-kube-api-access-gg2dd\") pod \"ovnkube-control-plane-749d76644c-98vqt\" (UID: \"bc9a262d-87e9-4d9b-8064-489483debe3f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.855349 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csw6j\" (UniqueName: \"kubernetes.io/projected/934f4d5f-3670-40da-b496-8b9f9f25fc0b-kube-api-access-csw6j\") pod \"network-metrics-daemon-qkf8s\" (UID: \"934f4d5f-3670-40da-b496-8b9f9f25fc0b\") " pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.855383 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7vw6\" (UniqueName: \"kubernetes.io/projected/542f030d-a49b-4885-be73-8274bed222f1-kube-api-access-l7vw6\") pod \"node-ca-sq4sm\" (UID: \"542f030d-a49b-4885-be73-8274bed222f1\") " pod="openshift-image-registry/node-ca-sq4sm" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.855405 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bc9a262d-87e9-4d9b-8064-489483debe3f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-98vqt\" (UID: \"bc9a262d-87e9-4d9b-8064-489483debe3f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.855431 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs\") pod \"network-metrics-daemon-qkf8s\" (UID: \"934f4d5f-3670-40da-b496-8b9f9f25fc0b\") " 
pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.855454 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/542f030d-a49b-4885-be73-8274bed222f1-host\") pod \"node-ca-sq4sm\" (UID: \"542f030d-a49b-4885-be73-8274bed222f1\") " pod="openshift-image-registry/node-ca-sq4sm" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.855508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/542f030d-a49b-4885-be73-8274bed222f1-host\") pod \"node-ca-sq4sm\" (UID: \"542f030d-a49b-4885-be73-8274bed222f1\") " pod="openshift-image-registry/node-ca-sq4sm" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.856588 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/542f030d-a49b-4885-be73-8274bed222f1-serviceca\") pod \"node-ca-sq4sm\" (UID: \"542f030d-a49b-4885-be73-8274bed222f1\") " pod="openshift-image-registry/node-ca-sq4sm" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.875731 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7vw6\" (UniqueName: \"kubernetes.io/projected/542f030d-a49b-4885-be73-8274bed222f1-kube-api-access-l7vw6\") pod \"node-ca-sq4sm\" (UID: \"542f030d-a49b-4885-be73-8274bed222f1\") " pod="openshift-image-registry/node-ca-sq4sm" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.883011 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.883040 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.883048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.883061 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.883070 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:39Z","lastTransitionTime":"2026-04-06T11:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.920189 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sq4sm" Apr 06 11:58:39 crc kubenswrapper[4790]: W0406 11:58:39.942664 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod542f030d_a49b_4885_be73_8274bed222f1.slice/crio-59fe7110bd9204755e9e7bbe83c2e8053e5ea5155c7b667baadcd09ba93482fe WatchSource:0}: Error finding container 59fe7110bd9204755e9e7bbe83c2e8053e5ea5155c7b667baadcd09ba93482fe: Status 404 returned error can't find the container with id 59fe7110bd9204755e9e7bbe83c2e8053e5ea5155c7b667baadcd09ba93482fe Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.956651 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs\") pod \"network-metrics-daemon-qkf8s\" (UID: \"934f4d5f-3670-40da-b496-8b9f9f25fc0b\") " pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.956714 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/bc9a262d-87e9-4d9b-8064-489483debe3f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-98vqt\" (UID: \"bc9a262d-87e9-4d9b-8064-489483debe3f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.956755 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bc9a262d-87e9-4d9b-8064-489483debe3f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-98vqt\" (UID: \"bc9a262d-87e9-4d9b-8064-489483debe3f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.956805 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg2dd\" (UniqueName: \"kubernetes.io/projected/bc9a262d-87e9-4d9b-8064-489483debe3f-kube-api-access-gg2dd\") pod \"ovnkube-control-plane-749d76644c-98vqt\" (UID: \"bc9a262d-87e9-4d9b-8064-489483debe3f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.956859 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csw6j\" (UniqueName: \"kubernetes.io/projected/934f4d5f-3670-40da-b496-8b9f9f25fc0b-kube-api-access-csw6j\") pod \"network-metrics-daemon-qkf8s\" (UID: \"934f4d5f-3670-40da-b496-8b9f9f25fc0b\") " pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.956896 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bc9a262d-87e9-4d9b-8064-489483debe3f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-98vqt\" (UID: \"bc9a262d-87e9-4d9b-8064-489483debe3f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" Apr 06 11:58:39 crc 
kubenswrapper[4790]: E0406 11:58:39.957015 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 06 11:58:39 crc kubenswrapper[4790]: E0406 11:58:39.957063 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs podName:934f4d5f-3670-40da-b496-8b9f9f25fc0b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:40.457049407 +0000 UTC m=+99.444792273 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs") pod "network-metrics-daemon-qkf8s" (UID: "934f4d5f-3670-40da-b496-8b9f9f25fc0b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.957519 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bc9a262d-87e9-4d9b-8064-489483debe3f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-98vqt\" (UID: \"bc9a262d-87e9-4d9b-8064-489483debe3f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.958301 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bc9a262d-87e9-4d9b-8064-489483debe3f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-98vqt\" (UID: \"bc9a262d-87e9-4d9b-8064-489483debe3f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.960591 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc9a262d-87e9-4d9b-8064-489483debe3f-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-98vqt\" (UID: \"bc9a262d-87e9-4d9b-8064-489483debe3f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.974903 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg2dd\" (UniqueName: \"kubernetes.io/projected/bc9a262d-87e9-4d9b-8064-489483debe3f-kube-api-access-gg2dd\") pod \"ovnkube-control-plane-749d76644c-98vqt\" (UID: \"bc9a262d-87e9-4d9b-8064-489483debe3f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.975342 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csw6j\" (UniqueName: \"kubernetes.io/projected/934f4d5f-3670-40da-b496-8b9f9f25fc0b-kube-api-access-csw6j\") pod \"network-metrics-daemon-qkf8s\" (UID: \"934f4d5f-3670-40da-b496-8b9f9f25fc0b\") " pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.987092 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.987124 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.987132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.987146 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:39 crc kubenswrapper[4790]: I0406 11:58:39.987155 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:39Z","lastTransitionTime":"2026-04-06T11:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.063372 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" Apr 06 11:58:40 crc kubenswrapper[4790]: W0406 11:58:40.075481 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc9a262d_87e9_4d9b_8064_489483debe3f.slice/crio-62fa0b930bf69aa77729656ac93987426f90ebff863e0816984da8614833453a WatchSource:0}: Error finding container 62fa0b930bf69aa77729656ac93987426f90ebff863e0816984da8614833453a: Status 404 returned error can't find the container with id 62fa0b930bf69aa77729656ac93987426f90ebff863e0816984da8614833453a Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.087070 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sq4sm" event={"ID":"542f030d-a49b-4885-be73-8274bed222f1","Type":"ContainerStarted","Data":"59fe7110bd9204755e9e7bbe83c2e8053e5ea5155c7b667baadcd09ba93482fe"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.088616 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" event={"ID":"bc9a262d-87e9-4d9b-8064-489483debe3f","Type":"ContainerStarted","Data":"62fa0b930bf69aa77729656ac93987426f90ebff863e0816984da8614833453a"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.088894 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.088934 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.088945 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.088959 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.088968 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:40Z","lastTransitionTime":"2026-04-06T11:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.091245 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"757aef10c0682229ea06c5e71d7d5444fe84c2dcd81f14cf08fcb20202141c81"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.091284 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"63a14e100a80279cf0d775be03245235b94a6a50829d588c4ef677f6ec0066b7"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.092420 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dskdf" event={"ID":"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0","Type":"ContainerStarted","Data":"85983ff10b47fb8052c360f6676528d359e3f3bc9376a17b6137543728ee190b"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.092446 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dskdf" 
event={"ID":"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0","Type":"ContainerStarted","Data":"284821fac84cd0f0d3c01389ec426dfacb8b8c8eb236c6f14f24c48d6a91adf4"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.097404 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fxqdg" event={"ID":"f2355ff5-ea31-472c-9ad9-b4f9f108849a","Type":"ContainerStarted","Data":"1c62dc73f2d4a48bc04b9422a3ee271cd2aff5f2054cf78ceba399556a4cbf84"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.097457 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fxqdg" event={"ID":"f2355ff5-ea31-472c-9ad9-b4f9f108849a","Type":"ContainerStarted","Data":"06fc0654be72fd66c9cb8d8f45fc1d7998420add7e71bc2f2050b33a0c6a57ef"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.099873 4790 generic.go:334] "Generic (PLEG): container finished" podID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerID="3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d" exitCode=0 Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.099919 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerDied","Data":"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.099936 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerStarted","Data":"dd68895988583af3c2840fed35c366a402a23711879ff2077e1804c4643d70b7"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.104353 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-764h8" event={"ID":"37199c14-02b6-4c25-be2a-674701cf0382","Type":"ContainerStarted","Data":"4e39f196f6524c84677d6c9d2d8467e15057680140b1ad693452fa62ae9bc5f0"} Apr 
06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.104430 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-764h8" event={"ID":"37199c14-02b6-4c25-be2a-674701cf0382","Type":"ContainerStarted","Data":"159dc71e8105b49f7bce8979e3739c9ad671bd5fbf400ce497239780e1f5d5e5"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.111107 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dskdf" podStartSLOduration=64.111079092 podStartE2EDuration="1m4.111079092s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:40.109615726 +0000 UTC m=+99.097358592" watchObservedRunningTime="2026-04-06 11:58:40.111079092 +0000 UTC m=+99.098821968" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.180239 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fxqdg" podStartSLOduration=64.180221128 podStartE2EDuration="1m4.180221128s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:40.179790428 +0000 UTC m=+99.167533294" watchObservedRunningTime="2026-04-06 11:58:40.180221128 +0000 UTC m=+99.167963994" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.190991 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.191017 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.191028 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:40 crc 
kubenswrapper[4790]: I0406 11:58:40.191043 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.191052 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:40Z","lastTransitionTime":"2026-04-06T11:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.293910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.294390 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.294400 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.294414 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.294423 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:40Z","lastTransitionTime":"2026-04-06T11:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.398572 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.398622 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.398634 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.398653 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.398666 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:40Z","lastTransitionTime":"2026-04-06T11:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.464379 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs\") pod \"network-metrics-daemon-qkf8s\" (UID: \"934f4d5f-3670-40da-b496-8b9f9f25fc0b\") " pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:40 crc kubenswrapper[4790]: E0406 11:58:40.464560 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 06 11:58:40 crc kubenswrapper[4790]: E0406 11:58:40.464652 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs podName:934f4d5f-3670-40da-b496-8b9f9f25fc0b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:41.464631968 +0000 UTC m=+100.452374844 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs") pod "network-metrics-daemon-qkf8s" (UID: "934f4d5f-3670-40da-b496-8b9f9f25fc0b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.502256 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.502331 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.502346 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.502371 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.502383 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:40Z","lastTransitionTime":"2026-04-06T11:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.608977 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.609035 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.609050 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.609071 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.609087 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:40Z","lastTransitionTime":"2026-04-06T11:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.711615 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.711668 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.711681 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.711698 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.711709 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:40Z","lastTransitionTime":"2026-04-06T11:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.813759 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.813806 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.813816 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.813848 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.813858 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:40Z","lastTransitionTime":"2026-04-06T11:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.916700 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.916731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.916739 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.916753 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.916767 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:40Z","lastTransitionTime":"2026-04-06T11:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:40 crc kubenswrapper[4790]: I0406 11:58:40.954054 4790 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.019901 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.019947 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.019958 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.019974 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.019992 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:41Z","lastTransitionTime":"2026-04-06T11:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.108550 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sq4sm" event={"ID":"542f030d-a49b-4885-be73-8274bed222f1","Type":"ContainerStarted","Data":"4aca0a5ae3eeebfaff32eb264684d8e0fc317eda4058379c754c7b10db5b2378"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.111773 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" event={"ID":"bc9a262d-87e9-4d9b-8064-489483debe3f","Type":"ContainerStarted","Data":"f54bb3aae5be5b74f0af24b28a31dd61a32fcad679ff4adb25efcfc75998e77c"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.111857 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" event={"ID":"bc9a262d-87e9-4d9b-8064-489483debe3f","Type":"ContainerStarted","Data":"57f6508c8373a030d72eb3b8254938dcd1cd6aa0f617fd485eec0dad2578d73c"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.115174 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"77f29fd4f9528d691ffc1c8fac82303831e099f5c9a2f18f9b0c4e42a0e4364b"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.117970 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerStarted","Data":"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.118010 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" 
event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerStarted","Data":"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.118027 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerStarted","Data":"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.118042 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerStarted","Data":"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.118059 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerStarted","Data":"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.120052 4790 generic.go:334] "Generic (PLEG): container finished" podID="37199c14-02b6-4c25-be2a-674701cf0382" containerID="4e39f196f6524c84677d6c9d2d8467e15057680140b1ad693452fa62ae9bc5f0" exitCode=0 Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.120086 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-764h8" event={"ID":"37199c14-02b6-4c25-be2a-674701cf0382","Type":"ContainerDied","Data":"4e39f196f6524c84677d6c9d2d8467e15057680140b1ad693452fa62ae9bc5f0"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.121680 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sq4sm" podStartSLOduration=65.121666501 podStartE2EDuration="1m5.121666501s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:41.12081825 +0000 UTC m=+100.108561116" watchObservedRunningTime="2026-04-06 11:58:41.121666501 +0000 UTC m=+100.109409377" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.121969 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.121989 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.122000 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.122014 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.122025 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:41Z","lastTransitionTime":"2026-04-06T11:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.134809 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-98vqt" podStartSLOduration=64.134788128 podStartE2EDuration="1m4.134788128s" podCreationTimestamp="2026-04-06 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:41.133956688 +0000 UTC m=+100.121699574" watchObservedRunningTime="2026-04-06 11:58:41.134788128 +0000 UTC m=+100.122531004" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.174728 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podStartSLOduration=65.174709495 podStartE2EDuration="1m5.174709495s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:41.152235624 +0000 UTC m=+100.139978510" watchObservedRunningTime="2026-04-06 11:58:41.174709495 +0000 UTC m=+100.162452361" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.223532 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.223570 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.223579 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.223592 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.223602 4790 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:41Z","lastTransitionTime":"2026-04-06T11:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.325620 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.326248 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.326261 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.326278 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.326293 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:41Z","lastTransitionTime":"2026-04-06T11:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.429769 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.429844 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.429859 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.429882 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.429896 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:41Z","lastTransitionTime":"2026-04-06T11:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.476575 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs\") pod \"network-metrics-daemon-qkf8s\" (UID: \"934f4d5f-3670-40da-b496-8b9f9f25fc0b\") " pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:41 crc kubenswrapper[4790]: E0406 11:58:41.476695 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 06 11:58:41 crc kubenswrapper[4790]: E0406 11:58:41.476750 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs podName:934f4d5f-3670-40da-b496-8b9f9f25fc0b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:43.476736924 +0000 UTC m=+102.464479790 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs") pod "network-metrics-daemon-qkf8s" (UID: "934f4d5f-3670-40da-b496-8b9f9f25fc0b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.533562 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.533612 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.533626 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.533649 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.533665 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:41Z","lastTransitionTime":"2026-04-06T11:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.638435 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.638485 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.638496 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.638544 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.638564 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:41Z","lastTransitionTime":"2026-04-06T11:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.674571 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.674629 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.674594 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.674582 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:41 crc kubenswrapper[4790]: E0406 11:58:41.675588 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkf8s" podUID="934f4d5f-3670-40da-b496-8b9f9f25fc0b" Apr 06 11:58:41 crc kubenswrapper[4790]: E0406 11:58:41.675678 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 06 11:58:41 crc kubenswrapper[4790]: E0406 11:58:41.675726 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 06 11:58:41 crc kubenswrapper[4790]: E0406 11:58:41.675875 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.741387 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.741477 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.741497 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.741531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.741554 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:41Z","lastTransitionTime":"2026-04-06T11:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.844444 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.844480 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.844492 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.844506 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.844516 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:41Z","lastTransitionTime":"2026-04-06T11:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.947286 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.947990 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.948032 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.948061 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:41 crc kubenswrapper[4790]: I0406 11:58:41.948077 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:41Z","lastTransitionTime":"2026-04-06T11:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.050887 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.050925 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.050936 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.050952 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.050961 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:42Z","lastTransitionTime":"2026-04-06T11:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.127083 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerStarted","Data":"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d"} Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.128931 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-764h8" event={"ID":"37199c14-02b6-4c25-be2a-674701cf0382","Type":"ContainerStarted","Data":"d0567d4fd55cb93359ee1e46eb8803b1aa9b61015c18a219636ebb954c7c9138"} Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.153372 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.153435 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.153448 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.153468 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.153483 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:42Z","lastTransitionTime":"2026-04-06T11:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.256516 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.256587 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.256619 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.256635 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.256645 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:42Z","lastTransitionTime":"2026-04-06T11:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.358948 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.359208 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.359350 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.359447 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.359536 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:42Z","lastTransitionTime":"2026-04-06T11:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.427991 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.428124 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.428199 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.428265 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.428331 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-04-06T11:58:42Z","lastTransitionTime":"2026-04-06T11:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.467785 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6"] Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.468208 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.469996 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.470014 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.470261 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.470971 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.588377 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8259f220-d398-4301-a083-e791a3d29afe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.588686 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8259f220-d398-4301-a083-e791a3d29afe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.588796 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/8259f220-d398-4301-a083-e791a3d29afe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.588999 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8259f220-d398-4301-a083-e791a3d29afe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.589108 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8259f220-d398-4301-a083-e791a3d29afe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.684108 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.691341 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8259f220-d398-4301-a083-e791a3d29afe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.691388 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8259f220-d398-4301-a083-e791a3d29afe-serving-cert\") pod 
\"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.691414 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8259f220-d398-4301-a083-e791a3d29afe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.691445 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8259f220-d398-4301-a083-e791a3d29afe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.691462 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8259f220-d398-4301-a083-e791a3d29afe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.691521 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8259f220-d398-4301-a083-e791a3d29afe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.691570 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8259f220-d398-4301-a083-e791a3d29afe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.692170 4790 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.692649 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8259f220-d398-4301-a083-e791a3d29afe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.698767 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8259f220-d398-4301-a083-e791a3d29afe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.708662 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8259f220-d398-4301-a083-e791a3d29afe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vzzl6\" (UID: \"8259f220-d398-4301-a083-e791a3d29afe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: I0406 11:58:42.780325 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" Apr 06 11:58:42 crc kubenswrapper[4790]: W0406 11:58:42.795538 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8259f220_d398_4301_a083_e791a3d29afe.slice/crio-9aa9251f5c36cd43713d6b37b5f4baeff124cf37a56c2a73152c4d5c7977e47d WatchSource:0}: Error finding container 9aa9251f5c36cd43713d6b37b5f4baeff124cf37a56c2a73152c4d5c7977e47d: Status 404 returned error can't find the container with id 9aa9251f5c36cd43713d6b37b5f4baeff124cf37a56c2a73152c4d5c7977e47d Apr 06 11:58:43 crc kubenswrapper[4790]: I0406 11:58:43.134414 4790 generic.go:334] "Generic (PLEG): container finished" podID="37199c14-02b6-4c25-be2a-674701cf0382" containerID="d0567d4fd55cb93359ee1e46eb8803b1aa9b61015c18a219636ebb954c7c9138" exitCode=0 Apr 06 11:58:43 crc kubenswrapper[4790]: I0406 11:58:43.134493 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-764h8" event={"ID":"37199c14-02b6-4c25-be2a-674701cf0382","Type":"ContainerDied","Data":"d0567d4fd55cb93359ee1e46eb8803b1aa9b61015c18a219636ebb954c7c9138"} Apr 06 11:58:43 crc kubenswrapper[4790]: I0406 11:58:43.136162 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" event={"ID":"8259f220-d398-4301-a083-e791a3d29afe","Type":"ContainerStarted","Data":"9aa9251f5c36cd43713d6b37b5f4baeff124cf37a56c2a73152c4d5c7977e47d"} Apr 06 11:58:43 crc kubenswrapper[4790]: I0406 11:58:43.498542 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs\") pod \"network-metrics-daemon-qkf8s\" (UID: \"934f4d5f-3670-40da-b496-8b9f9f25fc0b\") " pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:43 crc kubenswrapper[4790]: 
E0406 11:58:43.499102 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 06 11:58:43 crc kubenswrapper[4790]: E0406 11:58:43.499182 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs podName:934f4d5f-3670-40da-b496-8b9f9f25fc0b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:47.499165082 +0000 UTC m=+106.486907948 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs") pod "network-metrics-daemon-qkf8s" (UID: "934f4d5f-3670-40da-b496-8b9f9f25fc0b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 06 11:58:43 crc kubenswrapper[4790]: I0406 11:58:43.676537 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:43 crc kubenswrapper[4790]: I0406 11:58:43.676596 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:43 crc kubenswrapper[4790]: E0406 11:58:43.676679 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 06 11:58:43 crc kubenswrapper[4790]: I0406 11:58:43.676537 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:43 crc kubenswrapper[4790]: E0406 11:58:43.676816 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 06 11:58:43 crc kubenswrapper[4790]: I0406 11:58:43.676886 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:43 crc kubenswrapper[4790]: E0406 11:58:43.676942 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 06 11:58:43 crc kubenswrapper[4790]: E0406 11:58:43.677133 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qkf8s" podUID="934f4d5f-3670-40da-b496-8b9f9f25fc0b" Apr 06 11:58:44 crc kubenswrapper[4790]: I0406 11:58:44.141672 4790 generic.go:334] "Generic (PLEG): container finished" podID="37199c14-02b6-4c25-be2a-674701cf0382" containerID="88454fdfa727da261ad29256a1cd8be3bb81cb81ba0c967800cef19ab9f53a49" exitCode=0 Apr 06 11:58:44 crc kubenswrapper[4790]: I0406 11:58:44.141750 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-764h8" event={"ID":"37199c14-02b6-4c25-be2a-674701cf0382","Type":"ContainerDied","Data":"88454fdfa727da261ad29256a1cd8be3bb81cb81ba0c967800cef19ab9f53a49"} Apr 06 11:58:44 crc kubenswrapper[4790]: I0406 11:58:44.147471 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerStarted","Data":"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a"} Apr 06 11:58:44 crc kubenswrapper[4790]: I0406 11:58:44.149251 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" event={"ID":"8259f220-d398-4301-a083-e791a3d29afe","Type":"ContainerStarted","Data":"9d8380b15b4100983f8a78d431a7030dffb0e3c043d0b39d2faf920f4aa081aa"} Apr 06 11:58:45 crc kubenswrapper[4790]: I0406 11:58:45.155435 4790 generic.go:334] "Generic (PLEG): container finished" podID="37199c14-02b6-4c25-be2a-674701cf0382" containerID="a95255d7f738c4295a49f676c6c9e3833aded1fdc3e9bcfa33ef32208b1ad88c" exitCode=0 Apr 06 11:58:45 crc kubenswrapper[4790]: I0406 11:58:45.155518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-764h8" event={"ID":"37199c14-02b6-4c25-be2a-674701cf0382","Type":"ContainerDied","Data":"a95255d7f738c4295a49f676c6c9e3833aded1fdc3e9bcfa33ef32208b1ad88c"} Apr 06 11:58:45 crc kubenswrapper[4790]: I0406 11:58:45.173997 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vzzl6" podStartSLOduration=69.1739772 podStartE2EDuration="1m9.1739772s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:44.182123191 +0000 UTC m=+103.169866057" watchObservedRunningTime="2026-04-06 11:58:45.1739772 +0000 UTC m=+104.161720076" Apr 06 11:58:45 crc kubenswrapper[4790]: I0406 11:58:45.418322 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.418581 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.418565636 +0000 UTC m=+120.406308502 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:45 crc kubenswrapper[4790]: I0406 11:58:45.519274 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:45 crc kubenswrapper[4790]: I0406 11:58:45.519333 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:45 crc kubenswrapper[4790]: I0406 11:58:45.519354 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:45 crc kubenswrapper[4790]: I0406 11:58:45.519375 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.519444 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.519451 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.519484 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.519472776 +0000 UTC m=+120.507215642 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.519490 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.519503 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-04-06 11:59:01.519490176 +0000 UTC m=+120.507233052 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.519517 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.519532 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.519567 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.519582 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.519571 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.519558138 +0000 UTC m=+120.507301014 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.519592 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.519620 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.519611519 +0000 UTC m=+120.507354395 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 06 11:58:45 crc kubenswrapper[4790]: I0406 11:58:45.674623 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:45 crc kubenswrapper[4790]: I0406 11:58:45.674678 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:45 crc kubenswrapper[4790]: I0406 11:58:45.674732 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:45 crc kubenswrapper[4790]: I0406 11:58:45.674805 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.674983 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.675111 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkf8s" podUID="934f4d5f-3670-40da-b496-8b9f9f25fc0b" Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.675197 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 06 11:58:45 crc kubenswrapper[4790]: E0406 11:58:45.675379 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 06 11:58:46 crc kubenswrapper[4790]: I0406 11:58:46.165041 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerStarted","Data":"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8"} Apr 06 11:58:46 crc kubenswrapper[4790]: I0406 11:58:46.165941 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:46 crc kubenswrapper[4790]: I0406 11:58:46.165956 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:46 crc kubenswrapper[4790]: I0406 11:58:46.172556 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-764h8" event={"ID":"37199c14-02b6-4c25-be2a-674701cf0382","Type":"ContainerStarted","Data":"18011a82f6ba43741a25c16c08c355fbd05f7b4cba5c33fd35476109d582e8cc"} Apr 06 11:58:46 crc kubenswrapper[4790]: I0406 11:58:46.203747 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:46 crc kubenswrapper[4790]: I0406 11:58:46.208903 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" 
podStartSLOduration=70.208887186 podStartE2EDuration="1m10.208887186s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:46.190275702 +0000 UTC m=+105.178018578" watchObservedRunningTime="2026-04-06 11:58:46.208887186 +0000 UTC m=+105.196630042" Apr 06 11:58:47 crc kubenswrapper[4790]: I0406 11:58:47.179798 4790 generic.go:334] "Generic (PLEG): container finished" podID="37199c14-02b6-4c25-be2a-674701cf0382" containerID="18011a82f6ba43741a25c16c08c355fbd05f7b4cba5c33fd35476109d582e8cc" exitCode=0 Apr 06 11:58:47 crc kubenswrapper[4790]: I0406 11:58:47.179898 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-764h8" event={"ID":"37199c14-02b6-4c25-be2a-674701cf0382","Type":"ContainerDied","Data":"18011a82f6ba43741a25c16c08c355fbd05f7b4cba5c33fd35476109d582e8cc"} Apr 06 11:58:47 crc kubenswrapper[4790]: I0406 11:58:47.180466 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:47 crc kubenswrapper[4790]: I0406 11:58:47.253377 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:58:47 crc kubenswrapper[4790]: I0406 11:58:47.540445 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs\") pod \"network-metrics-daemon-qkf8s\" (UID: \"934f4d5f-3670-40da-b496-8b9f9f25fc0b\") " pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:47 crc kubenswrapper[4790]: E0406 11:58:47.540674 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 06 11:58:47 crc kubenswrapper[4790]: E0406 
11:58:47.540993 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs podName:934f4d5f-3670-40da-b496-8b9f9f25fc0b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:55.54097447 +0000 UTC m=+114.528717336 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs") pod "network-metrics-daemon-qkf8s" (UID: "934f4d5f-3670-40da-b496-8b9f9f25fc0b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 06 11:58:47 crc kubenswrapper[4790]: I0406 11:58:47.675256 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:47 crc kubenswrapper[4790]: I0406 11:58:47.675345 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:47 crc kubenswrapper[4790]: I0406 11:58:47.675458 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:47 crc kubenswrapper[4790]: E0406 11:58:47.675461 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 06 11:58:47 crc kubenswrapper[4790]: I0406 11:58:47.675474 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:47 crc kubenswrapper[4790]: E0406 11:58:47.675648 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 06 11:58:47 crc kubenswrapper[4790]: E0406 11:58:47.675728 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkf8s" podUID="934f4d5f-3670-40da-b496-8b9f9f25fc0b" Apr 06 11:58:47 crc kubenswrapper[4790]: I0406 11:58:47.675978 4790 scope.go:117] "RemoveContainer" containerID="e6224db2cc7254075caa631d8da7467f0dcf0483a05a6236d99fa0b07232cd4b" Apr 06 11:58:47 crc kubenswrapper[4790]: E0406 11:58:47.675986 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 06 11:58:48 crc kubenswrapper[4790]: I0406 11:58:48.185266 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Apr 06 11:58:48 crc kubenswrapper[4790]: I0406 11:58:48.188233 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e9aad00153e278b6f37e60029f6be3d297b5319c288bb6ebf91d3037d8f15382"} Apr 06 11:58:48 crc kubenswrapper[4790]: I0406 11:58:48.189357 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 11:58:48 crc kubenswrapper[4790]: I0406 11:58:48.192761 4790 generic.go:334] "Generic (PLEG): container finished" podID="37199c14-02b6-4c25-be2a-674701cf0382" containerID="f1f6e0af9a9270d2f7416be3b1babb8626e11c414ca8bfb56235cbf79742626d" exitCode=0 Apr 06 11:58:48 crc kubenswrapper[4790]: I0406 11:58:48.193602 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-764h8" event={"ID":"37199c14-02b6-4c25-be2a-674701cf0382","Type":"ContainerDied","Data":"f1f6e0af9a9270d2f7416be3b1babb8626e11c414ca8bfb56235cbf79742626d"} Apr 06 11:58:48 crc kubenswrapper[4790]: I0406 11:58:48.240314 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=13.240294988 podStartE2EDuration="13.240294988s" podCreationTimestamp="2026-04-06 11:58:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:48.211272834 +0000 UTC m=+107.199015700" watchObservedRunningTime="2026-04-06 11:58:48.240294988 +0000 UTC 
m=+107.228037854" Apr 06 11:58:48 crc kubenswrapper[4790]: I0406 11:58:48.267851 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qkf8s"] Apr 06 11:58:48 crc kubenswrapper[4790]: I0406 11:58:48.267976 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:48 crc kubenswrapper[4790]: E0406 11:58:48.268057 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkf8s" podUID="934f4d5f-3670-40da-b496-8b9f9f25fc0b" Apr 06 11:58:49 crc kubenswrapper[4790]: I0406 11:58:49.201920 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-764h8" event={"ID":"37199c14-02b6-4c25-be2a-674701cf0382","Type":"ContainerStarted","Data":"16ddace61b2d2df68ce88a2e02e1fd7338f16e8b008fb0d30e155a2f67c83ecc"} Apr 06 11:58:49 crc kubenswrapper[4790]: I0406 11:58:49.221887 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-764h8" podStartSLOduration=73.221871932 podStartE2EDuration="1m13.221871932s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:49.221155794 +0000 UTC m=+108.208898660" watchObservedRunningTime="2026-04-06 11:58:49.221871932 +0000 UTC m=+108.209614798" Apr 06 11:58:49 crc kubenswrapper[4790]: I0406 11:58:49.674658 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:49 crc kubenswrapper[4790]: I0406 11:58:49.674790 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:49 crc kubenswrapper[4790]: E0406 11:58:49.674849 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Apr 06 11:58:49 crc kubenswrapper[4790]: I0406 11:58:49.674872 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:49 crc kubenswrapper[4790]: I0406 11:58:49.674702 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:49 crc kubenswrapper[4790]: E0406 11:58:49.674971 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Apr 06 11:58:49 crc kubenswrapper[4790]: E0406 11:58:49.675045 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Apr 06 11:58:49 crc kubenswrapper[4790]: E0406 11:58:49.675092 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qkf8s" podUID="934f4d5f-3670-40da-b496-8b9f9f25fc0b" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.622614 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.622773 4790 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.698160 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.698988 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.699019 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.699008 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.701282 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.701478 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.701550 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.701687 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.703862 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.703876 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.717260 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9f69"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.717641 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.722247 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.722441 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.722559 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.722677 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.723018 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.723258 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.723758 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.724270 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.725523 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fr5xt"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.726200 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.726550 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sl7ql"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.727345 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.727407 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ql4f9"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.727820 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ql4f9"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.728195 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.728143 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.735748 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.736400 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.738139 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.740954 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.741351 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.741420 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.744566 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.744789 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.746276 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-75kvr"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.746734 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-75kvr"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.747265 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.747499 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.747923 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.748005 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.748228 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.748966 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.749003 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.749084 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.749163 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.749185 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.749287 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.749487 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.749664 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.749694 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.749727 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.749665 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.749810 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.749852 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.749993 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.750095 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.750110 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.750222 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.750333 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.750034 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.750038 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.750065 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.750801 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.751309 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.752567 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.752775 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.752868 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.752946 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.752986 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.753081 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.753106 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.753215 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.753229 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.752788 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.765646 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.766055 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.766107 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.766962 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.784124 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.784887 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.786025 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.786276 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.787019 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.788075 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.788182 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.788300 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.788455 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.788906 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.789168 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.789267 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.790673 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rm8wl"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.790759 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.791479 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rm8wl"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.792158 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.793252 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.793938 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.803371 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-prb62"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.793982 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.803320 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.806858 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.807801 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.810300 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.810976 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.811105 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-dkg9c"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.811143 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.811458 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.811524 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.812816 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.813747 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.814275 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dkg9c"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.823997 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.841300 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-l7zk6"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.843038 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.843136 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.843248 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.843619 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.845093 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.846137 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.846812 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.847747 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.848074 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.848642 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.843050 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.853268 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.853607 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.856632 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.857306 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-742t2"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.857585 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.857740 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rglnr"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.858250 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-499ss"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.858598 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.858697 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.858942 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rglnr"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.859214 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.859231 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-895qp"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.859286 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.859454 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.860156 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.860446 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-895qp"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.860474 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gzwvz"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.860786 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.860937 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.861156 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.861304 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.861349 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.861438 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.862003 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.862144 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.862365 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.862647 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ggnlt"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.862710 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.862779 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.863965 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x74dz"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.864074 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.864092 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ggnlt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.864162 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.864184 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.864557 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.864596 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.864642 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x74dz"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.867458 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.867912 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.868256 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.868452 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.868505 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.869026 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.869127 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.876728 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.878003 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.879045 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.879669 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.879904 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.880377 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.880745 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kp5kl"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.881244 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.881453 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.881473 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.881984 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.882469 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.883792 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.885845 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vh9v9"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.886234 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vh9v9"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.886921 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-zh6m9"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.887723 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sl7ql"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.887732 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zh6m9"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.888755 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.896278 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fr5xt"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.897932 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9f69"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.900408 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.903235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-image-import-ca\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.903298 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d787bca-06ac-4e8b-9797-6b25ebbcc706-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ct8gb\" (UID: \"4d787bca-06ac-4e8b-9797-6b25ebbcc706\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.903301 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb"]
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.903353 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-serving-cert\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.903416 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-audit-policies\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.903447 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.903480 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.903511 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-config\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.903535 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-etcd-client\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.903564 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c0a5144-8147-439e-a8ab-38fa28c0d96c-auth-proxy-config\") pod \"machine-approver-56656f9798-69lmn\" (UID: \"3c0a5144-8147-439e-a8ab-38fa28c0d96c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.903593 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a88e6363-c095-4450-98d2-18808abd10a9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.903783 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.903915 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a88e6363-c095-4450-98d2-18808abd10a9-encryption-config\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.903995 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lk6h\" (UniqueName: \"kubernetes.io/projected/a88e6363-c095-4450-98d2-18808abd10a9-kube-api-access-5lk6h\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904060 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d787bca-06ac-4e8b-9797-6b25ebbcc706-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ct8gb\" (UID: \"4d787bca-06ac-4e8b-9797-6b25ebbcc706\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb"
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904083 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a88e6363-c095-4450-98d2-18808abd10a9-audit-dir\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: 
\"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904184 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv48j\" (UniqueName: \"kubernetes.io/projected/4d787bca-06ac-4e8b-9797-6b25ebbcc706-kube-api-access-qv48j\") pod \"openshift-apiserver-operator-796bbdcf4f-ct8gb\" (UID: \"4d787bca-06ac-4e8b-9797-6b25ebbcc706\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904213 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904322 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-client-ca\") pod \"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904384 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w44zq\" (UniqueName: \"kubernetes.io/projected/b4a6f2c5-27d9-42d3-8856-3517265316cf-kube-api-access-w44zq\") pod \"dns-operator-744455d44c-rm8wl\" (UID: \"b4a6f2c5-27d9-42d3-8856-3517265316cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-rm8wl" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904409 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904464 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53544b57-db90-4281-a2ee-ba4ceceb1605-client-ca\") pod \"route-controller-manager-6576b87f9c-9t8th\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904485 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53544b57-db90-4281-a2ee-ba4ceceb1605-serving-cert\") pod \"route-controller-manager-6576b87f9c-9t8th\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904520 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c0a5144-8147-439e-a8ab-38fa28c0d96c-config\") pod \"machine-approver-56656f9798-69lmn\" (UID: \"3c0a5144-8147-439e-a8ab-38fa28c0d96c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904540 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4a6f2c5-27d9-42d3-8856-3517265316cf-metrics-tls\") pod \"dns-operator-744455d44c-rm8wl\" 
(UID: \"b4a6f2c5-27d9-42d3-8856-3517265316cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-rm8wl" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904559 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-audit\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904728 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a88e6363-c095-4450-98d2-18808abd10a9-serving-cert\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904793 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqhlm\" (UniqueName: \"kubernetes.io/projected/5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e-kube-api-access-tqhlm\") pod \"machine-api-operator-5694c8668f-fr5xt\" (UID: \"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904856 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a88e6363-c095-4450-98d2-18808abd10a9-audit-policies\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904875 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a88e6363-c095-4450-98d2-18808abd10a9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904895 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r477f\" (UniqueName: \"kubernetes.io/projected/426ca9b3-5c58-4b68-a29e-207207bf6897-kube-api-access-r477f\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.904976 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e5a800d-f198-4dbe-8c7b-a84e6c130041-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fbq9z\" (UID: \"6e5a800d-f198-4dbe-8c7b-a84e6c130041\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.905015 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p7x5\" (UniqueName: \"kubernetes.io/projected/3c0a5144-8147-439e-a8ab-38fa28c0d96c-kube-api-access-6p7x5\") pod \"machine-approver-56656f9798-69lmn\" (UID: \"3c0a5144-8147-439e-a8ab-38fa28c0d96c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.905040 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-etcd-serving-ca\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" 
Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.905059 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/426ca9b3-5c58-4b68-a29e-207207bf6897-node-pullsecrets\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.905147 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4610d751-50bd-42d4-a947-1c494bcb4096-audit-dir\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.905186 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53544b57-db90-4281-a2ee-ba4ceceb1605-config\") pod \"route-controller-manager-6576b87f9c-9t8th\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.905215 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n44tx\" (UniqueName: \"kubernetes.io/projected/dc2f1a83-c329-47b9-98d3-08104ce2323c-kube-api-access-n44tx\") pod \"downloads-7954f5f757-ql4f9\" (UID: \"dc2f1a83-c329-47b9-98d3-08104ce2323c\") " pod="openshift-console/downloads-7954f5f757-ql4f9" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.905349 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-config\") pod 
\"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.905390 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhtgf\" (UniqueName: \"kubernetes.io/projected/6e5a800d-f198-4dbe-8c7b-a84e6c130041-kube-api-access-mhtgf\") pod \"cluster-samples-operator-665b6dd947-fbq9z\" (UID: \"6e5a800d-f198-4dbe-8c7b-a84e6c130041\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.905716 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906303 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lpct\" (UniqueName: \"kubernetes.io/projected/4610d751-50bd-42d4-a947-1c494bcb4096-kube-api-access-5lpct\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906341 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chng2\" (UniqueName: \"kubernetes.io/projected/53544b57-db90-4281-a2ee-ba4ceceb1605-kube-api-access-chng2\") pod \"route-controller-manager-6576b87f9c-9t8th\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906375 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906400 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906469 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/426ca9b3-5c58-4b68-a29e-207207bf6897-audit-dir\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906496 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906516 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3c0a5144-8147-439e-a8ab-38fa28c0d96c-machine-approver-tls\") pod \"machine-approver-56656f9798-69lmn\" (UID: \"3c0a5144-8147-439e-a8ab-38fa28c0d96c\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906537 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-serving-cert\") pod \"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906563 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906584 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a88e6363-c095-4450-98d2-18808abd10a9-etcd-client\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906606 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h6ch\" (UniqueName: \"kubernetes.io/projected/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-kube-api-access-7h6ch\") pod \"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906632 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-encryption-config\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906669 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fr5xt\" (UID: \"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906696 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906719 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e-config\") pod \"machine-api-operator-5694c8668f-fr5xt\" (UID: \"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906744 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906769 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906790 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e-images\") pod \"machine-api-operator-5694c8668f-fr5xt\" (UID: \"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.906810 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.907872 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ql4f9"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.908929 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-499ss"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.909779 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rm8wl"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.911225 4790 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.912309 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-75kvr"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.913285 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rglnr"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.915346 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.916444 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ggnlt"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.917303 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.918167 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-742t2"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.918965 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.919797 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kp5kl"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.920535 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.920681 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.921473 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gzwvz"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.922495 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.925138 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.925166 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.925175 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-prb62"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.935391 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-l7zk6"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.935442 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lklpx"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.935962 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dkg9c"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.936037 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.938170 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.942984 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.944196 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.944721 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.945023 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.945649 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.947053 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.947713 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.949030 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x74dz"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.949280 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.950383 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9ff6v"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.951338 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9ff6v" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.952005 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9ff6v"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.952738 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vh9v9"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.953628 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gz7ps"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.954896 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fpjz5"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.955230 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.955678 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fpjz5"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.955725 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fpjz5" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.957223 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gz7ps"] Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.960877 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Apr 06 11:58:51 crc kubenswrapper[4790]: I0406 11:58:51.981229 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.000775 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.007753 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a88e6363-c095-4450-98d2-18808abd10a9-etcd-client\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.007793 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h6ch\" (UniqueName: \"kubernetes.io/projected/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-kube-api-access-7h6ch\") pod \"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.007816 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a3adfac4-b370-4f06-a19e-640252120515-apiservice-cert\") pod \"packageserver-d55dfcdfc-gq5tc\" (UID: \"a3adfac4-b370-4f06-a19e-640252120515\") 
" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.007848 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fec884c6-276b-4230-a41c-3375cbc2104b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rnq82\" (UID: \"fec884c6-276b-4230-a41c-3375cbc2104b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.007871 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.007890 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfec50d0-fe3d-45f4-afdb-cba2fc415878-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.007907 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-encryption-config\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.007924 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1d3bc18a-ca66-44f2-9667-48dd85b638fe-config\") pod \"kube-apiserver-operator-766d6c64bb-jqzrz\" (UID: \"1d3bc18a-ca66-44f2-9667-48dd85b638fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.007945 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fr5xt\" (UID: \"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.007960 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e14e3a43-8228-4129-a69e-0ecfa3a7114c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ggnlt\" (UID: \"e14e3a43-8228-4129-a69e-0ecfa3a7114c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ggnlt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.007978 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4029e155-0c45-49cb-a25b-ddb1f768a88f-config\") pod \"service-ca-operator-777779d784-hrvkd\" (UID: \"4029e155-0c45-49cb-a25b-ddb1f768a88f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.008003 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.008020 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e-config\") pod \"machine-api-operator-5694c8668f-fr5xt\" (UID: \"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.008123 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb424d82-17ec-4515-903c-96156334ca08-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwmq6\" (UID: \"eb424d82-17ec-4515-903c-96156334ca08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.008929 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl47j\" (UniqueName: \"kubernetes.io/projected/e14e3a43-8228-4129-a69e-0ecfa3a7114c-kube-api-access-nl47j\") pod \"multus-admission-controller-857f4d67dd-ggnlt\" (UID: \"e14e3a43-8228-4129-a69e-0ecfa3a7114c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ggnlt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.009033 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/743439e2-53c8-4cda-b960-2448c1fb2941-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jfjdg\" (UID: \"743439e2-53c8-4cda-b960-2448c1fb2941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.009068 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.009107 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.009129 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e-images\") pod \"machine-api-operator-5694c8668f-fr5xt\" (UID: \"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.009147 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.010207 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e-config\") pod \"machine-api-operator-5694c8668f-fr5xt\" (UID: \"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.013499 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-encryption-config\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.013485 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.009166 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.013649 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fec884c6-276b-4230-a41c-3375cbc2104b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rnq82\" (UID: \"fec884c6-276b-4230-a41c-3375cbc2104b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.013706 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-image-import-ca\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.013746 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d787bca-06ac-4e8b-9797-6b25ebbcc706-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ct8gb\" (UID: \"4d787bca-06ac-4e8b-9797-6b25ebbcc706\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.013779 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-serving-cert\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.013812 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-audit-policies\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.013869 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.013979 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/743439e2-53c8-4cda-b960-2448c1fb2941-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jfjdg\" (UID: \"743439e2-53c8-4cda-b960-2448c1fb2941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014039 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv8gh\" (UniqueName: \"kubernetes.io/projected/e42a375c-23c0-471a-8a17-20a03aabc3d4-kube-api-access-gv8gh\") pod \"collect-profiles-29591265-cs56k\" (UID: \"e42a375c-23c0-471a-8a17-20a03aabc3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014081 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/392745ae-c47f-49a2-a1fc-33e160f86b8f-certs\") pod \"machine-config-server-zh6m9\" (UID: \"392745ae-c47f-49a2-a1fc-33e160f86b8f\") " pod="openshift-machine-config-operator/machine-config-server-zh6m9" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014086 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014155 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfec50d0-fe3d-45f4-afdb-cba2fc415878-config\") pod \"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014193 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c0a5144-8147-439e-a8ab-38fa28c0d96c-auth-proxy-config\") pod \"machine-approver-56656f9798-69lmn\" (UID: \"3c0a5144-8147-439e-a8ab-38fa28c0d96c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014229 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a88e6363-c095-4450-98d2-18808abd10a9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014266 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd7km\" (UniqueName: \"kubernetes.io/projected/47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70-kube-api-access-nd7km\") pod \"package-server-manager-789f6589d5-5d75s\" (UID: \"47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014371 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e42a375c-23c0-471a-8a17-20a03aabc3d4-config-volume\") pod \"collect-profiles-29591265-cs56k\" (UID: \"e42a375c-23c0-471a-8a17-20a03aabc3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014407 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-config\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014441 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-etcd-client\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014483 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014522 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a88e6363-c095-4450-98d2-18808abd10a9-encryption-config\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: 
I0406 11:58:52.014553 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lk6h\" (UniqueName: \"kubernetes.io/projected/a88e6363-c095-4450-98d2-18808abd10a9-kube-api-access-5lk6h\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014590 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fec884c6-276b-4230-a41c-3375cbc2104b-config\") pod \"kube-controller-manager-operator-78b949d7b-rnq82\" (UID: \"fec884c6-276b-4230-a41c-3375cbc2104b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014631 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d787bca-06ac-4e8b-9797-6b25ebbcc706-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ct8gb\" (UID: \"4d787bca-06ac-4e8b-9797-6b25ebbcc706\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014666 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a88e6363-c095-4450-98d2-18808abd10a9-audit-dir\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014756 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv48j\" (UniqueName: \"kubernetes.io/projected/4d787bca-06ac-4e8b-9797-6b25ebbcc706-kube-api-access-qv48j\") pod \"openshift-apiserver-operator-796bbdcf4f-ct8gb\" (UID: 
\"4d787bca-06ac-4e8b-9797-6b25ebbcc706\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014790 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-client-ca\") pod \"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014870 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014903 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w44zq\" (UniqueName: \"kubernetes.io/projected/b4a6f2c5-27d9-42d3-8856-3517265316cf-kube-api-access-w44zq\") pod \"dns-operator-744455d44c-rm8wl\" (UID: \"b4a6f2c5-27d9-42d3-8856-3517265316cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-rm8wl" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014938 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014979 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"signing-key\" (UniqueName: \"kubernetes.io/secret/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-key\") pod \"service-ca-9c57cc56f-kp5kl\" (UID: \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.014979 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015018 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfdvb\" (UniqueName: \"kubernetes.io/projected/bfec50d0-fe3d-45f4-afdb-cba2fc415878-kube-api-access-kfdvb\") pod \"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015055 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c0a5144-8147-439e-a8ab-38fa28c0d96c-config\") pod \"machine-approver-56656f9798-69lmn\" (UID: \"3c0a5144-8147-439e-a8ab-38fa28c0d96c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015084 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53544b57-db90-4281-a2ee-ba4ceceb1605-client-ca\") pod \"route-controller-manager-6576b87f9c-9t8th\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:52 
crc kubenswrapper[4790]: I0406 11:58:52.015117 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53544b57-db90-4281-a2ee-ba4ceceb1605-serving-cert\") pod \"route-controller-manager-6576b87f9c-9t8th\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015151 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d3bc18a-ca66-44f2-9667-48dd85b638fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jqzrz\" (UID: \"1d3bc18a-ca66-44f2-9667-48dd85b638fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015153 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a88e6363-c095-4450-98d2-18808abd10a9-etcd-client\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015185 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4a6f2c5-27d9-42d3-8856-3517265316cf-metrics-tls\") pod \"dns-operator-744455d44c-rm8wl\" (UID: \"b4a6f2c5-27d9-42d3-8856-3517265316cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-rm8wl" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015217 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvnfs\" (UniqueName: \"kubernetes.io/projected/35f420a5-ec2f-4d37-94ea-af000df33824-kube-api-access-lvnfs\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-x74dz\" (UID: \"35f420a5-ec2f-4d37-94ea-af000df33824\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x74dz" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015263 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqhlm\" (UniqueName: \"kubernetes.io/projected/5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e-kube-api-access-tqhlm\") pod \"machine-api-operator-5694c8668f-fr5xt\" (UID: \"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015297 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jlkp\" (UniqueName: \"kubernetes.io/projected/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-kube-api-access-2jlkp\") pod \"service-ca-9c57cc56f-kp5kl\" (UID: \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015333 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-audit\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015367 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a88e6363-c095-4450-98d2-18808abd10a9-serving-cert\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015395 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5d75s\" (UID: \"47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015433 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-cabundle\") pod \"service-ca-9c57cc56f-kp5kl\" (UID: \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015465 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfec50d0-fe3d-45f4-afdb-cba2fc415878-service-ca-bundle\") pod \"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015498 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r477f\" (UniqueName: \"kubernetes.io/projected/426ca9b3-5c58-4b68-a29e-207207bf6897-kube-api-access-r477f\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015524 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a88e6363-c095-4450-98d2-18808abd10a9-audit-policies\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015559 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88e6363-c095-4450-98d2-18808abd10a9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015593 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/35f420a5-ec2f-4d37-94ea-af000df33824-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-x74dz\" (UID: \"35f420a5-ec2f-4d37-94ea-af000df33824\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x74dz" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015631 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e5a800d-f198-4dbe-8c7b-a84e6c130041-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fbq9z\" (UID: \"6e5a800d-f198-4dbe-8c7b-a84e6c130041\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p7x5\" (UniqueName: \"kubernetes.io/projected/3c0a5144-8147-439e-a8ab-38fa28c0d96c-kube-api-access-6p7x5\") pod \"machine-approver-56656f9798-69lmn\" (UID: \"3c0a5144-8147-439e-a8ab-38fa28c0d96c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015694 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d3bc18a-ca66-44f2-9667-48dd85b638fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jqzrz\" (UID: \"1d3bc18a-ca66-44f2-9667-48dd85b638fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015727 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4029e155-0c45-49cb-a25b-ddb1f768a88f-serving-cert\") pod \"service-ca-operator-777779d784-hrvkd\" (UID: \"4029e155-0c45-49cb-a25b-ddb1f768a88f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015774 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e-images\") pod \"machine-api-operator-5694c8668f-fr5xt\" (UID: \"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015789 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/426ca9b3-5c58-4b68-a29e-207207bf6897-node-pullsecrets\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015848 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-etcd-serving-ca\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 
crc kubenswrapper[4790]: I0406 11:58:52.015904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e42a375c-23c0-471a-8a17-20a03aabc3d4-secret-volume\") pod \"collect-profiles-29591265-cs56k\" (UID: \"e42a375c-23c0-471a-8a17-20a03aabc3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015955 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4610d751-50bd-42d4-a947-1c494bcb4096-audit-dir\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015985 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53544b57-db90-4281-a2ee-ba4ceceb1605-config\") pod \"route-controller-manager-6576b87f9c-9t8th\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016019 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb424d82-17ec-4515-903c-96156334ca08-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwmq6\" (UID: \"eb424d82-17ec-4515-903c-96156334ca08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016055 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnh67\" (UniqueName: 
\"kubernetes.io/projected/743439e2-53c8-4cda-b960-2448c1fb2941-kube-api-access-dnh67\") pod \"openshift-controller-manager-operator-756b6f6bc6-jfjdg\" (UID: \"743439e2-53c8-4cda-b960-2448c1fb2941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016097 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n44tx\" (UniqueName: \"kubernetes.io/projected/dc2f1a83-c329-47b9-98d3-08104ce2323c-kube-api-access-n44tx\") pod \"downloads-7954f5f757-ql4f9\" (UID: \"dc2f1a83-c329-47b9-98d3-08104ce2323c\") " pod="openshift-console/downloads-7954f5f757-ql4f9" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016130 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-config\") pod \"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016160 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhtgf\" (UniqueName: \"kubernetes.io/projected/6e5a800d-f198-4dbe-8c7b-a84e6c130041-kube-api-access-mhtgf\") pod \"cluster-samples-operator-665b6dd947-fbq9z\" (UID: \"6e5a800d-f198-4dbe-8c7b-a84e6c130041\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016196 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/392745ae-c47f-49a2-a1fc-33e160f86b8f-node-bootstrap-token\") pod \"machine-config-server-zh6m9\" (UID: \"392745ae-c47f-49a2-a1fc-33e160f86b8f\") " 
pod="openshift-machine-config-operator/machine-config-server-zh6m9" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016227 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfec50d0-fe3d-45f4-afdb-cba2fc415878-serving-cert\") pod \"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016260 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gzmp\" (UniqueName: \"kubernetes.io/projected/4029e155-0c45-49cb-a25b-ddb1f768a88f-kube-api-access-8gzmp\") pod \"service-ca-operator-777779d784-hrvkd\" (UID: \"4029e155-0c45-49cb-a25b-ddb1f768a88f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016316 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lpct\" (UniqueName: \"kubernetes.io/projected/4610d751-50bd-42d4-a947-1c494bcb4096-kube-api-access-5lpct\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016356 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chng2\" (UniqueName: \"kubernetes.io/projected/53544b57-db90-4281-a2ee-ba4ceceb1605-kube-api-access-chng2\") pod \"route-controller-manager-6576b87f9c-9t8th\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016374 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-config\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016393 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5ld5\" (UniqueName: \"kubernetes.io/projected/a3adfac4-b370-4f06-a19e-640252120515-kube-api-access-t5ld5\") pod \"packageserver-d55dfcdfc-gq5tc\" (UID: \"a3adfac4-b370-4f06-a19e-640252120515\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016430 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016467 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016504 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3adfac4-b370-4f06-a19e-640252120515-webhook-cert\") pod \"packageserver-d55dfcdfc-gq5tc\" (UID: \"a3adfac4-b370-4f06-a19e-640252120515\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016530 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016539 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/426ca9b3-5c58-4b68-a29e-207207bf6897-audit-dir\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016580 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/426ca9b3-5c58-4b68-a29e-207207bf6897-audit-dir\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.016612 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5l7p\" (UniqueName: \"kubernetes.io/projected/eb424d82-17ec-4515-903c-96156334ca08-kube-api-access-r5l7p\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwmq6\" (UID: \"eb424d82-17ec-4515-903c-96156334ca08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.015735 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-image-import-ca\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.017291 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-audit-policies\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.017401 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fr5xt\" (UID: \"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.018053 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-client-ca\") pod \"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.018073 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a88e6363-c095-4450-98d2-18808abd10a9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.018189 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.018281 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a88e6363-c095-4450-98d2-18808abd10a9-audit-dir\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.018352 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c0a5144-8147-439e-a8ab-38fa28c0d96c-config\") pod \"machine-approver-56656f9798-69lmn\" (UID: \"3c0a5144-8147-439e-a8ab-38fa28c0d96c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.018456 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/426ca9b3-5c58-4b68-a29e-207207bf6897-node-pullsecrets\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.018812 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88e6363-c095-4450-98d2-18808abd10a9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.019098 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a88e6363-c095-4450-98d2-18808abd10a9-audit-policies\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.019181 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-etcd-client\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.019653 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-config\") pod \"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.019735 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d787bca-06ac-4e8b-9797-6b25ebbcc706-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ct8gb\" (UID: \"4d787bca-06ac-4e8b-9797-6b25ebbcc706\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.019781 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a3adfac4-b370-4f06-a19e-640252120515-tmpfs\") pod \"packageserver-d55dfcdfc-gq5tc\" (UID: \"a3adfac4-b370-4f06-a19e-640252120515\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.019813 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgqzh\" (UniqueName: \"kubernetes.io/projected/392745ae-c47f-49a2-a1fc-33e160f86b8f-kube-api-access-rgqzh\") pod \"machine-config-server-zh6m9\" (UID: \"392745ae-c47f-49a2-a1fc-33e160f86b8f\") " pod="openshift-machine-config-operator/machine-config-server-zh6m9" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.019874 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.019902 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3c0a5144-8147-439e-a8ab-38fa28c0d96c-machine-approver-tls\") pod \"machine-approver-56656f9798-69lmn\" (UID: \"3c0a5144-8147-439e-a8ab-38fa28c0d96c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.019925 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-serving-cert\") pod \"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.020199 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-etcd-serving-ca\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " 
pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.020277 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c0a5144-8147-439e-a8ab-38fa28c0d96c-auth-proxy-config\") pod \"machine-approver-56656f9798-69lmn\" (UID: \"3c0a5144-8147-439e-a8ab-38fa28c0d96c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.020353 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-audit\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.020749 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.020924 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.021001 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4610d751-50bd-42d4-a947-1c494bcb4096-audit-dir\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: 
\"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.021031 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a88e6363-c095-4450-98d2-18808abd10a9-encryption-config\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.021071 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-serving-cert\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.021651 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e5a800d-f198-4dbe-8c7b-a84e6c130041-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fbq9z\" (UID: \"6e5a800d-f198-4dbe-8c7b-a84e6c130041\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.022085 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d787bca-06ac-4e8b-9797-6b25ebbcc706-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ct8gb\" (UID: \"4d787bca-06ac-4e8b-9797-6b25ebbcc706\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.022124 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53544b57-db90-4281-a2ee-ba4ceceb1605-config\") pod 
\"route-controller-manager-6576b87f9c-9t8th\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.023691 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.024153 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53544b57-db90-4281-a2ee-ba4ceceb1605-client-ca\") pod \"route-controller-manager-6576b87f9c-9t8th\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.024192 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53544b57-db90-4281-a2ee-ba4ceceb1605-serving-cert\") pod \"route-controller-manager-6576b87f9c-9t8th\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.024236 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.024683 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b4a6f2c5-27d9-42d3-8856-3517265316cf-metrics-tls\") pod \"dns-operator-744455d44c-rm8wl\" (UID: \"b4a6f2c5-27d9-42d3-8856-3517265316cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-rm8wl" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.025436 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3c0a5144-8147-439e-a8ab-38fa28c0d96c-machine-approver-tls\") pod \"machine-approver-56656f9798-69lmn\" (UID: \"3c0a5144-8147-439e-a8ab-38fa28c0d96c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.027145 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.027976 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a88e6363-c095-4450-98d2-18808abd10a9-serving-cert\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.028449 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 
11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.029248 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-serving-cert\") pod \"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.030059 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.031121 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.040801 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.061288 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.081274 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.100731 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120503 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/743439e2-53c8-4cda-b960-2448c1fb2941-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jfjdg\" (UID: \"743439e2-53c8-4cda-b960-2448c1fb2941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120549 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv8gh\" (UniqueName: \"kubernetes.io/projected/e42a375c-23c0-471a-8a17-20a03aabc3d4-kube-api-access-gv8gh\") pod \"collect-profiles-29591265-cs56k\" (UID: \"e42a375c-23c0-471a-8a17-20a03aabc3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120566 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/392745ae-c47f-49a2-a1fc-33e160f86b8f-certs\") pod \"machine-config-server-zh6m9\" (UID: \"392745ae-c47f-49a2-a1fc-33e160f86b8f\") " pod="openshift-machine-config-operator/machine-config-server-zh6m9" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120584 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfec50d0-fe3d-45f4-afdb-cba2fc415878-config\") pod \"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120601 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd7km\" (UniqueName: \"kubernetes.io/projected/47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70-kube-api-access-nd7km\") pod \"package-server-manager-789f6589d5-5d75s\" (UID: \"47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s" Apr 06 
11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120618 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e42a375c-23c0-471a-8a17-20a03aabc3d4-config-volume\") pod \"collect-profiles-29591265-cs56k\" (UID: \"e42a375c-23c0-471a-8a17-20a03aabc3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120642 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fec884c6-276b-4230-a41c-3375cbc2104b-config\") pod \"kube-controller-manager-operator-78b949d7b-rnq82\" (UID: \"fec884c6-276b-4230-a41c-3375cbc2104b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120679 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfdvb\" (UniqueName: \"kubernetes.io/projected/bfec50d0-fe3d-45f4-afdb-cba2fc415878-kube-api-access-kfdvb\") pod \"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120697 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d3bc18a-ca66-44f2-9667-48dd85b638fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jqzrz\" (UID: \"1d3bc18a-ca66-44f2-9667-48dd85b638fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120727 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-key\") 
pod \"service-ca-9c57cc56f-kp5kl\" (UID: \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120744 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvnfs\" (UniqueName: \"kubernetes.io/projected/35f420a5-ec2f-4d37-94ea-af000df33824-kube-api-access-lvnfs\") pod \"control-plane-machine-set-operator-78cbb6b69f-x74dz\" (UID: \"35f420a5-ec2f-4d37-94ea-af000df33824\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x74dz" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120762 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jlkp\" (UniqueName: \"kubernetes.io/projected/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-kube-api-access-2jlkp\") pod \"service-ca-9c57cc56f-kp5kl\" (UID: \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120783 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-cabundle\") pod \"service-ca-9c57cc56f-kp5kl\" (UID: \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120800 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfec50d0-fe3d-45f4-afdb-cba2fc415878-service-ca-bundle\") pod \"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120820 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/35f420a5-ec2f-4d37-94ea-af000df33824-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-x74dz\" (UID: \"35f420a5-ec2f-4d37-94ea-af000df33824\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x74dz" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120893 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5d75s\" (UID: \"47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120957 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d3bc18a-ca66-44f2-9667-48dd85b638fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jqzrz\" (UID: \"1d3bc18a-ca66-44f2-9667-48dd85b638fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.120975 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4029e155-0c45-49cb-a25b-ddb1f768a88f-serving-cert\") pod \"service-ca-operator-777779d784-hrvkd\" (UID: \"4029e155-0c45-49cb-a25b-ddb1f768a88f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121001 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e42a375c-23c0-471a-8a17-20a03aabc3d4-secret-volume\") pod \"collect-profiles-29591265-cs56k\" (UID: 
\"e42a375c-23c0-471a-8a17-20a03aabc3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121019 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb424d82-17ec-4515-903c-96156334ca08-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwmq6\" (UID: \"eb424d82-17ec-4515-903c-96156334ca08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121042 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnh67\" (UniqueName: \"kubernetes.io/projected/743439e2-53c8-4cda-b960-2448c1fb2941-kube-api-access-dnh67\") pod \"openshift-controller-manager-operator-756b6f6bc6-jfjdg\" (UID: \"743439e2-53c8-4cda-b960-2448c1fb2941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121063 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/392745ae-c47f-49a2-a1fc-33e160f86b8f-node-bootstrap-token\") pod \"machine-config-server-zh6m9\" (UID: \"392745ae-c47f-49a2-a1fc-33e160f86b8f\") " pod="openshift-machine-config-operator/machine-config-server-zh6m9" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121079 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfec50d0-fe3d-45f4-afdb-cba2fc415878-serving-cert\") pod \"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121095 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gzmp\" (UniqueName: \"kubernetes.io/projected/4029e155-0c45-49cb-a25b-ddb1f768a88f-kube-api-access-8gzmp\") pod \"service-ca-operator-777779d784-hrvkd\" (UID: \"4029e155-0c45-49cb-a25b-ddb1f768a88f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121126 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5ld5\" (UniqueName: \"kubernetes.io/projected/a3adfac4-b370-4f06-a19e-640252120515-kube-api-access-t5ld5\") pod \"packageserver-d55dfcdfc-gq5tc\" (UID: \"a3adfac4-b370-4f06-a19e-640252120515\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121144 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3adfac4-b370-4f06-a19e-640252120515-webhook-cert\") pod \"packageserver-d55dfcdfc-gq5tc\" (UID: \"a3adfac4-b370-4f06-a19e-640252120515\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121159 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5l7p\" (UniqueName: \"kubernetes.io/projected/eb424d82-17ec-4515-903c-96156334ca08-kube-api-access-r5l7p\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwmq6\" (UID: \"eb424d82-17ec-4515-903c-96156334ca08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121175 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgqzh\" (UniqueName: \"kubernetes.io/projected/392745ae-c47f-49a2-a1fc-33e160f86b8f-kube-api-access-rgqzh\") pod 
\"machine-config-server-zh6m9\" (UID: \"392745ae-c47f-49a2-a1fc-33e160f86b8f\") " pod="openshift-machine-config-operator/machine-config-server-zh6m9" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121191 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a3adfac4-b370-4f06-a19e-640252120515-tmpfs\") pod \"packageserver-d55dfcdfc-gq5tc\" (UID: \"a3adfac4-b370-4f06-a19e-640252120515\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121211 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a3adfac4-b370-4f06-a19e-640252120515-apiservice-cert\") pod \"packageserver-d55dfcdfc-gq5tc\" (UID: \"a3adfac4-b370-4f06-a19e-640252120515\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121225 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fec884c6-276b-4230-a41c-3375cbc2104b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rnq82\" (UID: \"fec884c6-276b-4230-a41c-3375cbc2104b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121337 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfec50d0-fe3d-45f4-afdb-cba2fc415878-config\") pod \"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121424 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfec50d0-fe3d-45f4-afdb-cba2fc415878-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121494 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3bc18a-ca66-44f2-9667-48dd85b638fe-config\") pod \"kube-apiserver-operator-766d6c64bb-jqzrz\" (UID: \"1d3bc18a-ca66-44f2-9667-48dd85b638fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121580 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e14e3a43-8228-4129-a69e-0ecfa3a7114c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ggnlt\" (UID: \"e14e3a43-8228-4129-a69e-0ecfa3a7114c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ggnlt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121582 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121605 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4029e155-0c45-49cb-a25b-ddb1f768a88f-config\") pod \"service-ca-operator-777779d784-hrvkd\" (UID: \"4029e155-0c45-49cb-a25b-ddb1f768a88f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121647 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb424d82-17ec-4515-903c-96156334ca08-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-pwmq6\" (UID: \"eb424d82-17ec-4515-903c-96156334ca08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121665 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl47j\" (UniqueName: \"kubernetes.io/projected/e14e3a43-8228-4129-a69e-0ecfa3a7114c-kube-api-access-nl47j\") pod \"multus-admission-controller-857f4d67dd-ggnlt\" (UID: \"e14e3a43-8228-4129-a69e-0ecfa3a7114c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ggnlt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121684 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/743439e2-53c8-4cda-b960-2448c1fb2941-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jfjdg\" (UID: \"743439e2-53c8-4cda-b960-2448c1fb2941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.121702 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fec884c6-276b-4230-a41c-3375cbc2104b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rnq82\" (UID: \"fec884c6-276b-4230-a41c-3375cbc2104b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.122115 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a3adfac4-b370-4f06-a19e-640252120515-tmpfs\") pod \"packageserver-d55dfcdfc-gq5tc\" (UID: \"a3adfac4-b370-4f06-a19e-640252120515\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:52 crc 
kubenswrapper[4790]: I0406 11:58:52.122438 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfec50d0-fe3d-45f4-afdb-cba2fc415878-service-ca-bundle\") pod \"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.122817 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/743439e2-53c8-4cda-b960-2448c1fb2941-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jfjdg\" (UID: \"743439e2-53c8-4cda-b960-2448c1fb2941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.122842 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfec50d0-fe3d-45f4-afdb-cba2fc415878-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.123341 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/743439e2-53c8-4cda-b960-2448c1fb2941-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jfjdg\" (UID: \"743439e2-53c8-4cda-b960-2448c1fb2941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.124862 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfec50d0-fe3d-45f4-afdb-cba2fc415878-serving-cert\") pod 
\"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.140458 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.161112 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.181467 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.201304 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.221158 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.240996 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.261141 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.281375 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.307725 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.321129 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"openshift-service-ca.crt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.342238 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.362405 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.382614 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.402505 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.421349 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.442337 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.463339 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.482195 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.501799 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.523659 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.542593 4790 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.555871 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e42a375c-23c0-471a-8a17-20a03aabc3d4-secret-volume\") pod \"collect-profiles-29591265-cs56k\" (UID: \"e42a375c-23c0-471a-8a17-20a03aabc3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.561621 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.582164 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.602051 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.630460 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.642185 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.662075 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.681931 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.703277 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Apr 06 
11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.721351 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.741549 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.760752 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.782333 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.801529 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.821508 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.826354 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fec884c6-276b-4230-a41c-3375cbc2104b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rnq82\" (UID: \"fec884c6-276b-4230-a41c-3375cbc2104b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.841568 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.843749 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fec884c6-276b-4230-a41c-3375cbc2104b-config\") pod \"kube-controller-manager-operator-78b949d7b-rnq82\" (UID: \"fec884c6-276b-4230-a41c-3375cbc2104b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.862016 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.862767 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb424d82-17ec-4515-903c-96156334ca08-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwmq6\" (UID: \"eb424d82-17ec-4515-903c-96156334ca08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.879219 4790 request.go:700] Waited for 1.016240538s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0 Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.881042 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.887785 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb424d82-17ec-4515-903c-96156334ca08-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwmq6\" (UID: \"eb424d82-17ec-4515-903c-96156334ca08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" Apr 06 
11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.901610 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.922385 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.941974 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Apr 06 11:58:52 crc kubenswrapper[4790]: I0406 11:58:52.961047 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.002019 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.018246 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e14e3a43-8228-4129-a69e-0ecfa3a7114c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ggnlt\" (UID: \"e14e3a43-8228-4129-a69e-0ecfa3a7114c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ggnlt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.022408 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.041390 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.046722 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/35f420a5-ec2f-4d37-94ea-af000df33824-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-x74dz\" (UID: \"35f420a5-ec2f-4d37-94ea-af000df33824\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x74dz" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.061454 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.081593 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.100805 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.121820 4790 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.121914 4790 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.121931 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e42a375c-23c0-471a-8a17-20a03aabc3d4-config-volume podName:e42a375c-23c0-471a-8a17-20a03aabc3d4 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:53.621904391 +0000 UTC m=+112.609647257 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/e42a375c-23c0-471a-8a17-20a03aabc3d4-config-volume") pod "collect-profiles-29591265-cs56k" (UID: "e42a375c-23c0-471a-8a17-20a03aabc3d4") : failed to sync configmap cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.121943 4790 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.121987 4790 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.121994 4790 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.121955 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/392745ae-c47f-49a2-a1fc-33e160f86b8f-node-bootstrap-token podName:392745ae-c47f-49a2-a1fc-33e160f86b8f nodeName:}" failed. No retries permitted until 2026-04-06 11:58:53.621942762 +0000 UTC m=+112.609685628 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/392745ae-c47f-49a2-a1fc-33e160f86b8f-node-bootstrap-token") pod "machine-config-server-zh6m9" (UID: "392745ae-c47f-49a2-a1fc-33e160f86b8f") : failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122116 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4029e155-0c45-49cb-a25b-ddb1f768a88f-config podName:4029e155-0c45-49cb-a25b-ddb1f768a88f nodeName:}" failed. 
No retries permitted until 2026-04-06 11:58:53.622068245 +0000 UTC m=+112.609811281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/4029e155-0c45-49cb-a25b-ddb1f768a88f-config") pod "service-ca-operator-777779d784-hrvkd" (UID: "4029e155-0c45-49cb-a25b-ddb1f768a88f") : failed to sync configmap cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122154 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4029e155-0c45-49cb-a25b-ddb1f768a88f-serving-cert podName:4029e155-0c45-49cb-a25b-ddb1f768a88f nodeName:}" failed. No retries permitted until 2026-04-06 11:58:53.622136817 +0000 UTC m=+112.609879893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4029e155-0c45-49cb-a25b-ddb1f768a88f-serving-cert") pod "service-ca-operator-777779d784-hrvkd" (UID: "4029e155-0c45-49cb-a25b-ddb1f768a88f") : failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122156 4790 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122184 4790 configmap.go:193] Couldn't get configMap openshift-kube-apiserver-operator/kube-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.122170 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122245 4790 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 
crc kubenswrapper[4790]: E0406 11:58:53.122169 4790 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122186 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/392745ae-c47f-49a2-a1fc-33e160f86b8f-certs podName:392745ae-c47f-49a2-a1fc-33e160f86b8f nodeName:}" failed. No retries permitted until 2026-04-06 11:58:53.622169548 +0000 UTC m=+112.609912654 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/392745ae-c47f-49a2-a1fc-33e160f86b8f-certs") pod "machine-config-server-zh6m9" (UID: "392745ae-c47f-49a2-a1fc-33e160f86b8f") : failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122441 4790 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122463 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1d3bc18a-ca66-44f2-9667-48dd85b638fe-config podName:1d3bc18a-ca66-44f2-9667-48dd85b638fe nodeName:}" failed. No retries permitted until 2026-04-06 11:58:53.622435434 +0000 UTC m=+112.610178450 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/1d3bc18a-ca66-44f2-9667-48dd85b638fe-config") pod "kube-apiserver-operator-766d6c64bb-jqzrz" (UID: "1d3bc18a-ca66-44f2-9667-48dd85b638fe") : failed to sync configmap cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122507 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-key podName:5ee57c19-af8c-4e40-ae89-9cfb07ad24d7 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:53.622487136 +0000 UTC m=+112.610230232 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-key") pod "service-ca-9c57cc56f-kp5kl" (UID: "5ee57c19-af8c-4e40-ae89-9cfb07ad24d7") : failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122544 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3adfac4-b370-4f06-a19e-640252120515-apiservice-cert podName:a3adfac4-b370-4f06-a19e-640252120515 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:53.622525326 +0000 UTC m=+112.610268422 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/a3adfac4-b370-4f06-a19e-640252120515-apiservice-cert") pod "packageserver-d55dfcdfc-gq5tc" (UID: "a3adfac4-b370-4f06-a19e-640252120515") : failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122546 4790 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122607 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70-package-server-manager-serving-cert podName:47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:53.622587728 +0000 UTC m=+112.610330794 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-5d75s" (UID: "47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70") : failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122618 4790 secret.go:188] Couldn't get secret openshift-kube-apiserver-operator/kube-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122648 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3adfac4-b370-4f06-a19e-640252120515-webhook-cert podName:a3adfac4-b370-4f06-a19e-640252120515 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:53.622630149 +0000 UTC m=+112.610373235 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/a3adfac4-b370-4f06-a19e-640252120515-webhook-cert") pod "packageserver-d55dfcdfc-gq5tc" (UID: "a3adfac4-b370-4f06-a19e-640252120515") : failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122681 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-cabundle podName:5ee57c19-af8c-4e40-ae89-9cfb07ad24d7 nodeName:}" failed. No retries permitted until 2026-04-06 11:58:53.62266533 +0000 UTC m=+112.610408436 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-cabundle") pod "service-ca-9c57cc56f-kp5kl" (UID: "5ee57c19-af8c-4e40-ae89-9cfb07ad24d7") : failed to sync configmap cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: E0406 11:58:53.122725 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d3bc18a-ca66-44f2-9667-48dd85b638fe-serving-cert podName:1d3bc18a-ca66-44f2-9667-48dd85b638fe nodeName:}" failed. No retries permitted until 2026-04-06 11:58:53.622710191 +0000 UTC m=+112.610453247 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/1d3bc18a-ca66-44f2-9667-48dd85b638fe-serving-cert") pod "kube-apiserver-operator-766d6c64bb-jqzrz" (UID: "1d3bc18a-ca66-44f2-9667-48dd85b638fe") : failed to sync secret cache: timed out waiting for the condition Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.140786 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.161301 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.181208 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.201077 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.221946 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.242234 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.262486 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.282799 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.301474 4790 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.321613 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.341642 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.363069 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.382476 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.403304 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.422246 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.442945 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.463124 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.483458 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.502293 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.522063 4790 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.541775 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.562735 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.581779 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.602197 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.622005 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.652348 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.654097 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/392745ae-c47f-49a2-a1fc-33e160f86b8f-certs\") pod \"machine-config-server-zh6m9\" (UID: \"392745ae-c47f-49a2-a1fc-33e160f86b8f\") " pod="openshift-machine-config-operator/machine-config-server-zh6m9" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.654137 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e42a375c-23c0-471a-8a17-20a03aabc3d4-config-volume\") pod \"collect-profiles-29591265-cs56k\" (UID: \"e42a375c-23c0-471a-8a17-20a03aabc3d4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.654197 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-key\") pod \"service-ca-9c57cc56f-kp5kl\" (UID: \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.654249 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5d75s\" (UID: \"47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.654269 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-cabundle\") pod \"service-ca-9c57cc56f-kp5kl\" (UID: \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.654286 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4029e155-0c45-49cb-a25b-ddb1f768a88f-serving-cert\") pod \"service-ca-operator-777779d784-hrvkd\" (UID: \"4029e155-0c45-49cb-a25b-ddb1f768a88f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.654309 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1d3bc18a-ca66-44f2-9667-48dd85b638fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jqzrz\" (UID: \"1d3bc18a-ca66-44f2-9667-48dd85b638fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.654359 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/392745ae-c47f-49a2-a1fc-33e160f86b8f-node-bootstrap-token\") pod \"machine-config-server-zh6m9\" (UID: \"392745ae-c47f-49a2-a1fc-33e160f86b8f\") " pod="openshift-machine-config-operator/machine-config-server-zh6m9" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.654411 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3adfac4-b370-4f06-a19e-640252120515-webhook-cert\") pod \"packageserver-d55dfcdfc-gq5tc\" (UID: \"a3adfac4-b370-4f06-a19e-640252120515\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.654453 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a3adfac4-b370-4f06-a19e-640252120515-apiservice-cert\") pod \"packageserver-d55dfcdfc-gq5tc\" (UID: \"a3adfac4-b370-4f06-a19e-640252120515\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.654470 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3bc18a-ca66-44f2-9667-48dd85b638fe-config\") pod \"kube-apiserver-operator-766d6c64bb-jqzrz\" (UID: \"1d3bc18a-ca66-44f2-9667-48dd85b638fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.654496 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4029e155-0c45-49cb-a25b-ddb1f768a88f-config\") pod \"service-ca-operator-777779d784-hrvkd\" (UID: \"4029e155-0c45-49cb-a25b-ddb1f768a88f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.655469 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e42a375c-23c0-471a-8a17-20a03aabc3d4-config-volume\") pod \"collect-profiles-29591265-cs56k\" (UID: \"e42a375c-23c0-471a-8a17-20a03aabc3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.655763 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-cabundle\") pod \"service-ca-9c57cc56f-kp5kl\" (UID: \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.655810 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3bc18a-ca66-44f2-9667-48dd85b638fe-config\") pod \"kube-apiserver-operator-766d6c64bb-jqzrz\" (UID: \"1d3bc18a-ca66-44f2-9667-48dd85b638fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.656278 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4029e155-0c45-49cb-a25b-ddb1f768a88f-config\") pod \"service-ca-operator-777779d784-hrvkd\" (UID: \"4029e155-0c45-49cb-a25b-ddb1f768a88f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" Apr 06 11:58:53 crc kubenswrapper[4790]: 
I0406 11:58:53.657590 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d3bc18a-ca66-44f2-9667-48dd85b638fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jqzrz\" (UID: \"1d3bc18a-ca66-44f2-9667-48dd85b638fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.658852 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a3adfac4-b370-4f06-a19e-640252120515-apiservice-cert\") pod \"packageserver-d55dfcdfc-gq5tc\" (UID: \"a3adfac4-b370-4f06-a19e-640252120515\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.659193 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-key\") pod \"service-ca-9c57cc56f-kp5kl\" (UID: \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.659727 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5d75s\" (UID: \"47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.660156 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3adfac4-b370-4f06-a19e-640252120515-webhook-cert\") pod \"packageserver-d55dfcdfc-gq5tc\" (UID: \"a3adfac4-b370-4f06-a19e-640252120515\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.660685 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4029e155-0c45-49cb-a25b-ddb1f768a88f-serving-cert\") pod \"service-ca-operator-777779d784-hrvkd\" (UID: \"4029e155-0c45-49cb-a25b-ddb1f768a88f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.663936 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.683435 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.688773 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/392745ae-c47f-49a2-a1fc-33e160f86b8f-node-bootstrap-token\") pod \"machine-config-server-zh6m9\" (UID: \"392745ae-c47f-49a2-a1fc-33e160f86b8f\") " pod="openshift-machine-config-operator/machine-config-server-zh6m9" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.701822 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.708208 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/392745ae-c47f-49a2-a1fc-33e160f86b8f-certs\") pod \"machine-config-server-zh6m9\" (UID: \"392745ae-c47f-49a2-a1fc-33e160f86b8f\") " pod="openshift-machine-config-operator/machine-config-server-zh6m9" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.741212 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"cni-sysctl-allowlist" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.761161 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.781873 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.801368 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.821622 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.840927 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.861580 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.879789 4790 request.go:700] Waited for 1.923868977s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/secrets?fieldSelector=metadata.name%3Dcsi-hostpath-provisioner-sa-dockercfg-qd74k&limit=500&resourceVersion=0 Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.881554 4790 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.902006 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.920681 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-dockercfg-jwfmh" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.940754 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Apr 06 11:58:53 crc kubenswrapper[4790]: I0406 11:58:53.976584 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h6ch\" (UniqueName: \"kubernetes.io/projected/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-kube-api-access-7h6ch\") pod \"controller-manager-879f6c89f-x9f69\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.001598 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqhlm\" (UniqueName: \"kubernetes.io/projected/5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e-kube-api-access-tqhlm\") pod \"machine-api-operator-5694c8668f-fr5xt\" (UID: \"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.002716 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.021647 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r477f\" (UniqueName: \"kubernetes.io/projected/426ca9b3-5c58-4b68-a29e-207207bf6897-kube-api-access-r477f\") pod \"apiserver-76f77b778f-sl7ql\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.039414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lpct\" (UniqueName: \"kubernetes.io/projected/4610d751-50bd-42d4-a947-1c494bcb4096-kube-api-access-5lpct\") pod \"oauth-openshift-558db77b4-75kvr\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.042220 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.056558 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w44zq\" (UniqueName: \"kubernetes.io/projected/b4a6f2c5-27d9-42d3-8856-3517265316cf-kube-api-access-w44zq\") pod \"dns-operator-744455d44c-rm8wl\" (UID: \"b4a6f2c5-27d9-42d3-8856-3517265316cf\") " pod="openshift-dns-operator/dns-operator-744455d44c-rm8wl" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.065661 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rm8wl" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.075970 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chng2\" (UniqueName: \"kubernetes.io/projected/53544b57-db90-4281-a2ee-ba4ceceb1605-kube-api-access-chng2\") pod \"route-controller-manager-6576b87f9c-9t8th\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.098335 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p7x5\" (UniqueName: \"kubernetes.io/projected/3c0a5144-8147-439e-a8ab-38fa28c0d96c-kube-api-access-6p7x5\") pod \"machine-approver-56656f9798-69lmn\" (UID: \"3c0a5144-8147-439e-a8ab-38fa28c0d96c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.115585 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv48j\" (UniqueName: \"kubernetes.io/projected/4d787bca-06ac-4e8b-9797-6b25ebbcc706-kube-api-access-qv48j\") pod \"openshift-apiserver-operator-796bbdcf4f-ct8gb\" (UID: \"4d787bca-06ac-4e8b-9797-6b25ebbcc706\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.137458 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhtgf\" (UniqueName: \"kubernetes.io/projected/6e5a800d-f198-4dbe-8c7b-a84e6c130041-kube-api-access-mhtgf\") pod \"cluster-samples-operator-665b6dd947-fbq9z\" (UID: \"6e5a800d-f198-4dbe-8c7b-a84e6c130041\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.158409 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5lk6h\" (UniqueName: \"kubernetes.io/projected/a88e6363-c095-4450-98d2-18808abd10a9-kube-api-access-5lk6h\") pod \"apiserver-7bbb656c7d-n8gzk\" (UID: \"a88e6363-c095-4450-98d2-18808abd10a9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.179125 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n44tx\" (UniqueName: \"kubernetes.io/projected/dc2f1a83-c329-47b9-98d3-08104ce2323c-kube-api-access-n44tx\") pod \"downloads-7954f5f757-ql4f9\" (UID: \"dc2f1a83-c329-47b9-98d3-08104ce2323c\") " pod="openshift-console/downloads-7954f5f757-ql4f9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.195158 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv8gh\" (UniqueName: \"kubernetes.io/projected/e42a375c-23c0-471a-8a17-20a03aabc3d4-kube-api-access-gv8gh\") pod \"collect-profiles-29591265-cs56k\" (UID: \"e42a375c-23c0-471a-8a17-20a03aabc3d4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.214532 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.224071 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd7km\" (UniqueName: \"kubernetes.io/projected/47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70-kube-api-access-nd7km\") pod \"package-server-manager-789f6589d5-5d75s\" (UID: \"47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.245400 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.253216 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfdvb\" (UniqueName: \"kubernetes.io/projected/bfec50d0-fe3d-45f4-afdb-cba2fc415878-kube-api-access-kfdvb\") pod \"authentication-operator-69f744f599-742t2\" (UID: \"bfec50d0-fe3d-45f4-afdb-cba2fc415878\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.266133 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.276478 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5l7p\" (UniqueName: \"kubernetes.io/projected/eb424d82-17ec-4515-903c-96156334ca08-kube-api-access-r5l7p\") pod \"kube-storage-version-migrator-operator-b67b599dd-pwmq6\" (UID: \"eb424d82-17ec-4515-903c-96156334ca08\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.282915 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.290345 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fr5xt"] Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.297179 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnh67\" (UniqueName: \"kubernetes.io/projected/743439e2-53c8-4cda-b960-2448c1fb2941-kube-api-access-dnh67\") pod \"openshift-controller-manager-operator-756b6f6bc6-jfjdg\" (UID: \"743439e2-53c8-4cda-b960-2448c1fb2941\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.297392 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ql4f9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.299678 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5ld5\" (UniqueName: \"kubernetes.io/projected/a3adfac4-b370-4f06-a19e-640252120515-kube-api-access-t5ld5\") pod \"packageserver-d55dfcdfc-gq5tc\" (UID: \"a3adfac4-b370-4f06-a19e-640252120515\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.309672 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.312092 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.316877 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.320247 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgqzh\" (UniqueName: \"kubernetes.io/projected/392745ae-c47f-49a2-a1fc-33e160f86b8f-kube-api-access-rgqzh\") pod \"machine-config-server-zh6m9\" (UID: \"392745ae-c47f-49a2-a1fc-33e160f86b8f\") " pod="openshift-machine-config-operator/machine-config-server-zh6m9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.322671 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.332093 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zh6m9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.335379 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d3bc18a-ca66-44f2-9667-48dd85b638fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jqzrz\" (UID: \"1d3bc18a-ca66-44f2-9667-48dd85b638fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.339353 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-75kvr"] Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.353380 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rm8wl"] Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.355476 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.359440 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvnfs\" (UniqueName: \"kubernetes.io/projected/35f420a5-ec2f-4d37-94ea-af000df33824-kube-api-access-lvnfs\") pod \"control-plane-machine-set-operator-78cbb6b69f-x74dz\" (UID: \"35f420a5-ec2f-4d37-94ea-af000df33824\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x74dz" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.381572 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl47j\" (UniqueName: \"kubernetes.io/projected/e14e3a43-8228-4129-a69e-0ecfa3a7114c-kube-api-access-nl47j\") pod \"multus-admission-controller-857f4d67dd-ggnlt\" (UID: \"e14e3a43-8228-4129-a69e-0ecfa3a7114c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ggnlt" Apr 06 11:58:54 crc kubenswrapper[4790]: W0406 11:58:54.382410 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4610d751_50bd_42d4_a947_1c494bcb4096.slice/crio-320302ab91ca339776f071b8b1d515565ff938e4f78e807de094324803d6b283 WatchSource:0}: Error finding container 320302ab91ca339776f071b8b1d515565ff938e4f78e807de094324803d6b283: Status 404 returned error can't find the container with id 320302ab91ca339776f071b8b1d515565ff938e4f78e807de094324803d6b283 Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.395215 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.395550 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fec884c6-276b-4230-a41c-3375cbc2104b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rnq82\" (UID: \"fec884c6-276b-4230-a41c-3375cbc2104b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.417024 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.422292 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jlkp\" (UniqueName: \"kubernetes.io/projected/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-kube-api-access-2jlkp\") pod \"service-ca-9c57cc56f-kp5kl\" (UID: \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.444419 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gzmp\" (UniqueName: \"kubernetes.io/projected/4029e155-0c45-49cb-a25b-ddb1f768a88f-kube-api-access-8gzmp\") pod \"service-ca-operator-777779d784-hrvkd\" (UID: \"4029e155-0c45-49cb-a25b-ddb1f768a88f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477182 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-registry-tls\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477227 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5njpf\" (UniqueName: \"kubernetes.io/projected/4ff4177e-05ad-4a06-bf17-fa05ab567c5d-kube-api-access-5njpf\") pod \"olm-operator-6b444d44fb-trfn8\" (UID: \"4ff4177e-05ad-4a06-bf17-fa05ab567c5d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477261 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48539430-665f-4976-8ab6-7f06a26b9bde-metrics-certs\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477285 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dhzn\" (UniqueName: \"kubernetes.io/projected/470e540b-4e4b-4064-a729-b5e608f1b394-kube-api-access-9dhzn\") pod \"machine-config-operator-74547568cd-sgt9w\" (UID: \"470e540b-4e4b-4064-a729-b5e608f1b394\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477309 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b6c6217-8764-459d-b2a8-c99272221fb1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hg5f8\" (UID: \"3b6c6217-8764-459d-b2a8-c99272221fb1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477328 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9dd58353-6603-48cf-9767-11235ba23164-ca-trust-extracted\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477381 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-console-config\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477414 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0566848-e645-46bd-8e5e-fddcde1248ba-config\") pod \"console-operator-58897d9998-vh9v9\" (UID: \"f0566848-e645-46bd-8e5e-fddcde1248ba\") " pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477473 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b20da9-8cdd-4614-9c0a-9db7287856cd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\" (UID: \"f7b20da9-8cdd-4614-9c0a-9db7287856cd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477496 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9dd58353-6603-48cf-9767-11235ba23164-trusted-ca\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477514 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ff4177e-05ad-4a06-bf17-fa05ab567c5d-srv-cert\") pod \"olm-operator-6b444d44fb-trfn8\" (UID: \"4ff4177e-05ad-4a06-bf17-fa05ab567c5d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477532 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-trusted-ca-bundle\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477553 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b20da9-8cdd-4614-9c0a-9db7287856cd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\" (UID: \"f7b20da9-8cdd-4614-9c0a-9db7287856cd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477574 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/48539430-665f-4976-8ab6-7f06a26b9bde-stats-auth\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477596 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3851471a-4968-40c6-9be1-9ba072ddf741-serving-cert\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477631 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ff4177e-05ad-4a06-bf17-fa05ab567c5d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-trfn8\" (UID: \"4ff4177e-05ad-4a06-bf17-fa05ab567c5d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477667 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64798e8b-0f1d-48f9-ab27-369e6953a5cb-trusted-ca\") pod \"ingress-operator-5b745b69d9-f78dq\" (UID: \"64798e8b-0f1d-48f9-ab27-369e6953a5cb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477687 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64798e8b-0f1d-48f9-ab27-369e6953a5cb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f78dq\" (UID: \"64798e8b-0f1d-48f9-ab27-369e6953a5cb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477723 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxxpq\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-kube-api-access-nxxpq\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc 
kubenswrapper[4790]: I0406 11:58:54.477765 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-service-ca\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477786 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0566848-e645-46bd-8e5e-fddcde1248ba-trusted-ca\") pod \"console-operator-58897d9998-vh9v9\" (UID: \"f0566848-e645-46bd-8e5e-fddcde1248ba\") " pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477807 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3851471a-4968-40c6-9be1-9ba072ddf741-etcd-service-ca\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477850 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89fs4\" (UniqueName: \"kubernetes.io/projected/f0566848-e645-46bd-8e5e-fddcde1248ba-kube-api-access-89fs4\") pod \"console-operator-58897d9998-vh9v9\" (UID: \"f0566848-e645-46bd-8e5e-fddcde1248ba\") " pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477898 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5v6k\" (UniqueName: \"kubernetes.io/projected/f44bc876-aaf0-4ef8-aec0-e7aed034f67f-kube-api-access-b5v6k\") pod 
\"catalog-operator-68c6474976-2mbnf\" (UID: \"f44bc876-aaf0-4ef8-aec0-e7aed034f67f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477951 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48539430-665f-4976-8ab6-7f06a26b9bde-service-ca-bundle\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477973 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18df9b4d-88b9-46d0-adf8-90072301374e-console-oauth-config\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.477997 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld88v\" (UniqueName: \"kubernetes.io/projected/0360c312-3ecf-42c9-9af9-470c231eefbd-kube-api-access-ld88v\") pod \"openshift-config-operator-7777fb866f-prb62\" (UID: \"0360c312-3ecf-42c9-9af9-470c231eefbd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.478036 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dptg\" (UniqueName: \"kubernetes.io/projected/84f1a57a-9a68-4f5a-a0df-3c2bda96d02f-kube-api-access-2dptg\") pod \"cluster-image-registry-operator-dc59b4c8b-r5lgx\" (UID: \"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 
11:58:54.478058 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs569\" (UniqueName: \"kubernetes.io/projected/3b6c6217-8764-459d-b2a8-c99272221fb1-kube-api-access-zs569\") pod \"machine-config-controller-84d6567774-hg5f8\" (UID: \"3b6c6217-8764-459d-b2a8-c99272221fb1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.478078 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84f1a57a-9a68-4f5a-a0df-3c2bda96d02f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-r5lgx\" (UID: \"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.478115 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqkr5\" (UniqueName: \"kubernetes.io/projected/48539430-665f-4976-8ab6-7f06a26b9bde-kube-api-access-jqkr5\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.478137 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3851471a-4968-40c6-9be1-9ba072ddf741-config\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.478185 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7z4x\" (UniqueName: 
\"kubernetes.io/projected/64798e8b-0f1d-48f9-ab27-369e6953a5cb-kube-api-access-v7z4x\") pod \"ingress-operator-5b745b69d9-f78dq\" (UID: \"64798e8b-0f1d-48f9-ab27-369e6953a5cb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.478209 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18df9b4d-88b9-46d0-adf8-90072301374e-console-serving-cert\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.480384 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9f69"] Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.480471 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwhch\" (UniqueName: \"kubernetes.io/projected/3851471a-4968-40c6-9be1-9ba072ddf741-kube-api-access-qwhch\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.480525 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9dd58353-6603-48cf-9767-11235ba23164-installation-pull-secrets\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.480567 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-oauth-serving-cert\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.481739 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/618ebfda-2b5c-4918-97d0-56a8b37dda29-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gzwvz\" (UID: \"618ebfda-2b5c-4918-97d0-56a8b37dda29\") " pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.481784 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84f1a57a-9a68-4f5a-a0df-3c2bda96d02f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-r5lgx\" (UID: \"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.481816 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f44bc876-aaf0-4ef8-aec0-e7aed034f67f-srv-cert\") pod \"catalog-operator-68c6474976-2mbnf\" (UID: \"f44bc876-aaf0-4ef8-aec0-e7aed034f67f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.481869 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f44bc876-aaf0-4ef8-aec0-e7aed034f67f-profile-collector-cert\") pod \"catalog-operator-68c6474976-2mbnf\" (UID: \"f44bc876-aaf0-4ef8-aec0-e7aed034f67f\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.481886 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/470e540b-4e4b-4064-a729-b5e608f1b394-proxy-tls\") pod \"machine-config-operator-74547568cd-sgt9w\" (UID: \"470e540b-4e4b-4064-a729-b5e608f1b394\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.483471 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/470e540b-4e4b-4064-a729-b5e608f1b394-images\") pod \"machine-config-operator-74547568cd-sgt9w\" (UID: \"470e540b-4e4b-4064-a729-b5e608f1b394\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.483515 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.483587 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/64798e8b-0f1d-48f9-ab27-369e6953a5cb-metrics-tls\") pod \"ingress-operator-5b745b69d9-f78dq\" (UID: \"64798e8b-0f1d-48f9-ab27-369e6953a5cb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.483630 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3851471a-4968-40c6-9be1-9ba072ddf741-etcd-client\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.483649 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7b20da9-8cdd-4614-9c0a-9db7287856cd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\" (UID: \"f7b20da9-8cdd-4614-9c0a-9db7287856cd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" Apr 06 11:58:54 crc kubenswrapper[4790]: E0406 11:58:54.483892 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:54.983881171 +0000 UTC m=+113.971624037 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.484758 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0360c312-3ecf-42c9-9af9-470c231eefbd-serving-cert\") pod \"openshift-config-operator-7777fb866f-prb62\" (UID: \"0360c312-3ecf-42c9-9af9-470c231eefbd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.485519 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0566848-e645-46bd-8e5e-fddcde1248ba-serving-cert\") pod \"console-operator-58897d9998-vh9v9\" (UID: \"f0566848-e645-46bd-8e5e-fddcde1248ba\") " pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.485584 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88lnc\" (UniqueName: \"kubernetes.io/projected/18df9b4d-88b9-46d0-adf8-90072301374e-kube-api-access-88lnc\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.485624 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/48539430-665f-4976-8ab6-7f06a26b9bde-default-certificate\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.485883 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0360c312-3ecf-42c9-9af9-470c231eefbd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-prb62\" (UID: \"0360c312-3ecf-42c9-9af9-470c231eefbd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.485914 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/470e540b-4e4b-4064-a729-b5e608f1b394-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sgt9w\" (UID: \"470e540b-4e4b-4064-a729-b5e608f1b394\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.487112 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75988\" (UniqueName: \"kubernetes.io/projected/618ebfda-2b5c-4918-97d0-56a8b37dda29-kube-api-access-75988\") pod \"marketplace-operator-79b997595-gzwvz\" (UID: \"618ebfda-2b5c-4918-97d0-56a8b37dda29\") " pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.487204 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/84f1a57a-9a68-4f5a-a0df-3c2bda96d02f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-r5lgx\" (UID: 
\"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.487334 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9dd58353-6603-48cf-9767-11235ba23164-registry-certificates\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.487401 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-bound-sa-token\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.487480 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/618ebfda-2b5c-4918-97d0-56a8b37dda29-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gzwvz\" (UID: \"618ebfda-2b5c-4918-97d0-56a8b37dda29\") " pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.487739 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3851471a-4968-40c6-9be1-9ba072ddf741-etcd-ca\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.487809 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drbtr\" (UniqueName: \"kubernetes.io/projected/dbfe58a1-c1ad-4719-9676-e6dbaca5c530-kube-api-access-drbtr\") pod \"migrator-59844c95c7-rglnr\" (UID: \"dbfe58a1-c1ad-4719-9676-e6dbaca5c530\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rglnr" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.487980 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b6c6217-8764-459d-b2a8-c99272221fb1-proxy-tls\") pod \"machine-config-controller-84d6567774-hg5f8\" (UID: \"3b6c6217-8764-459d-b2a8-c99272221fb1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.518148 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.525771 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" Apr 06 11:58:54 crc kubenswrapper[4790]: W0406 11:58:54.532991 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0967d5bd_4fa1_4e9c_a58a_1cd171f56b60.slice/crio-3b2c84778aecb70b5372eef1fd0a7e71e7c660c0d6c70e363d69aae11d82033b WatchSource:0}: Error finding container 3b2c84778aecb70b5372eef1fd0a7e71e7c660c0d6c70e363d69aae11d82033b: Status 404 returned error can't find the container with id 3b2c84778aecb70b5372eef1fd0a7e71e7c660c0d6c70e363d69aae11d82033b Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.545983 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ggnlt" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.554465 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x74dz" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.569654 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.576613 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.589136 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590027 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0566848-e645-46bd-8e5e-fddcde1248ba-serving-cert\") pod \"console-operator-58897d9998-vh9v9\" (UID: \"f0566848-e645-46bd-8e5e-fddcde1248ba\") " pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590071 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-csi-data-dir\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: 
I0406 11:58:54.590094 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88lnc\" (UniqueName: \"kubernetes.io/projected/18df9b4d-88b9-46d0-adf8-90072301374e-kube-api-access-88lnc\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590109 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/48539430-665f-4976-8ab6-7f06a26b9bde-default-certificate\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590125 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0360c312-3ecf-42c9-9af9-470c231eefbd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-prb62\" (UID: \"0360c312-3ecf-42c9-9af9-470c231eefbd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/470e540b-4e4b-4064-a729-b5e608f1b394-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sgt9w\" (UID: \"470e540b-4e4b-4064-a729-b5e608f1b394\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590177 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75988\" (UniqueName: \"kubernetes.io/projected/618ebfda-2b5c-4918-97d0-56a8b37dda29-kube-api-access-75988\") pod \"marketplace-operator-79b997595-gzwvz\" (UID: 
\"618ebfda-2b5c-4918-97d0-56a8b37dda29\") " pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590195 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/84f1a57a-9a68-4f5a-a0df-3c2bda96d02f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-r5lgx\" (UID: \"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590492 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9dd58353-6603-48cf-9767-11235ba23164-registry-certificates\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590513 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-bound-sa-token\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590528 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/618ebfda-2b5c-4918-97d0-56a8b37dda29-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gzwvz\" (UID: \"618ebfda-2b5c-4918-97d0-56a8b37dda29\") " pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590546 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3851471a-4968-40c6-9be1-9ba072ddf741-etcd-ca\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590574 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drbtr\" (UniqueName: \"kubernetes.io/projected/dbfe58a1-c1ad-4719-9676-e6dbaca5c530-kube-api-access-drbtr\") pod \"migrator-59844c95c7-rglnr\" (UID: \"dbfe58a1-c1ad-4719-9676-e6dbaca5c530\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rglnr" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590609 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b6c6217-8764-459d-b2a8-c99272221fb1-proxy-tls\") pod \"machine-config-controller-84d6567774-hg5f8\" (UID: \"3b6c6217-8764-459d-b2a8-c99272221fb1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590626 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5njpf\" (UniqueName: \"kubernetes.io/projected/4ff4177e-05ad-4a06-bf17-fa05ab567c5d-kube-api-access-5njpf\") pod \"olm-operator-6b444d44fb-trfn8\" (UID: \"4ff4177e-05ad-4a06-bf17-fa05ab567c5d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590652 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-registry-tls\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 
11:58:54.590683 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48539430-665f-4976-8ab6-7f06a26b9bde-metrics-certs\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590703 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dhzn\" (UniqueName: \"kubernetes.io/projected/470e540b-4e4b-4064-a729-b5e608f1b394-kube-api-access-9dhzn\") pod \"machine-config-operator-74547568cd-sgt9w\" (UID: \"470e540b-4e4b-4064-a729-b5e608f1b394\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590719 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9dd58353-6603-48cf-9767-11235ba23164-ca-trust-extracted\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590734 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b6c6217-8764-459d-b2a8-c99272221fb1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hg5f8\" (UID: \"3b6c6217-8764-459d-b2a8-c99272221fb1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590754 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dflnl\" (UniqueName: \"kubernetes.io/projected/418e1d4b-ebc4-48ff-89f1-e52375374d63-kube-api-access-dflnl\") pod \"ingress-canary-9ff6v\" (UID: 
\"418e1d4b-ebc4-48ff-89f1-e52375374d63\") " pod="openshift-ingress-canary/ingress-canary-9ff6v" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590809 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-console-config\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590845 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0566848-e645-46bd-8e5e-fddcde1248ba-config\") pod \"console-operator-58897d9998-vh9v9\" (UID: \"f0566848-e645-46bd-8e5e-fddcde1248ba\") " pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/418e1d4b-ebc4-48ff-89f1-e52375374d63-cert\") pod \"ingress-canary-9ff6v\" (UID: \"418e1d4b-ebc4-48ff-89f1-e52375374d63\") " pod="openshift-ingress-canary/ingress-canary-9ff6v" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590943 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e1aab57b-c8ed-492c-91d3-e190449713a0-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-lklpx\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.590964 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv7fh\" (UniqueName: \"kubernetes.io/projected/b8d6da06-7170-416d-8a93-b3f9fa890fb1-kube-api-access-fv7fh\") pod 
\"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591003 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b20da9-8cdd-4614-9c0a-9db7287856cd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\" (UID: \"f7b20da9-8cdd-4614-9c0a-9db7287856cd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591021 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9dd58353-6603-48cf-9767-11235ba23164-trusted-ca\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591042 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ff4177e-05ad-4a06-bf17-fa05ab567c5d-srv-cert\") pod \"olm-operator-6b444d44fb-trfn8\" (UID: \"4ff4177e-05ad-4a06-bf17-fa05ab567c5d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591056 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-trusted-ca-bundle\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591071 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f7b20da9-8cdd-4614-9c0a-9db7287856cd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\" (UID: \"f7b20da9-8cdd-4614-9c0a-9db7287856cd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591089 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e1aab57b-c8ed-492c-91d3-e190449713a0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lklpx\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591114 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/48539430-665f-4976-8ab6-7f06a26b9bde-stats-auth\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591131 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3851471a-4968-40c6-9be1-9ba072ddf741-serving-cert\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591147 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxtkm\" (UniqueName: \"kubernetes.io/projected/763471f0-0557-46ae-9eaf-e4bfd2b737ad-kube-api-access-dxtkm\") pod \"dns-default-fpjz5\" (UID: \"763471f0-0557-46ae-9eaf-e4bfd2b737ad\") " pod="openshift-dns/dns-default-fpjz5" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591200 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ff4177e-05ad-4a06-bf17-fa05ab567c5d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-trfn8\" (UID: \"4ff4177e-05ad-4a06-bf17-fa05ab567c5d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591229 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64798e8b-0f1d-48f9-ab27-369e6953a5cb-trusted-ca\") pod \"ingress-operator-5b745b69d9-f78dq\" (UID: \"64798e8b-0f1d-48f9-ab27-369e6953a5cb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591244 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64798e8b-0f1d-48f9-ab27-369e6953a5cb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f78dq\" (UID: \"64798e8b-0f1d-48f9-ab27-369e6953a5cb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591281 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxxpq\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-kube-api-access-nxxpq\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0360c312-3ecf-42c9-9af9-470c231eefbd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-prb62\" (UID: \"0360c312-3ecf-42c9-9af9-470c231eefbd\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.591495 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-service-ca\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.592009 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0566848-e645-46bd-8e5e-fddcde1248ba-config\") pod \"console-operator-58897d9998-vh9v9\" (UID: \"f0566848-e645-46bd-8e5e-fddcde1248ba\") " pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.592995 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-service-ca\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.593500 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0566848-e645-46bd-8e5e-fddcde1248ba-trusted-ca\") pod \"console-operator-58897d9998-vh9v9\" (UID: \"f0566848-e645-46bd-8e5e-fddcde1248ba\") " pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.593538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89fs4\" (UniqueName: \"kubernetes.io/projected/f0566848-e645-46bd-8e5e-fddcde1248ba-kube-api-access-89fs4\") pod \"console-operator-58897d9998-vh9v9\" (UID: \"f0566848-e645-46bd-8e5e-fddcde1248ba\") " 
pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.593588 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3851471a-4968-40c6-9be1-9ba072ddf741-etcd-service-ca\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.593614 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-mountpoint-dir\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.593635 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5v6k\" (UniqueName: \"kubernetes.io/projected/f44bc876-aaf0-4ef8-aec0-e7aed034f67f-kube-api-access-b5v6k\") pod \"catalog-operator-68c6474976-2mbnf\" (UID: \"f44bc876-aaf0-4ef8-aec0-e7aed034f67f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.593878 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.593903 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/48539430-665f-4976-8ab6-7f06a26b9bde-default-certificate\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.593611 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/470e540b-4e4b-4064-a729-b5e608f1b394-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sgt9w\" (UID: \"470e540b-4e4b-4064-a729-b5e608f1b394\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" Apr 06 11:58:54 crc kubenswrapper[4790]: E0406 11:58:54.594294 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:55.094279437 +0000 UTC m=+114.082022303 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.594797 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3851471a-4968-40c6-9be1-9ba072ddf741-etcd-service-ca\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.595519 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b20da9-8cdd-4614-9c0a-9db7287856cd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\" (UID: \"f7b20da9-8cdd-4614-9c0a-9db7287856cd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.595576 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48539430-665f-4976-8ab6-7f06a26b9bde-service-ca-bundle\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.595603 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18df9b4d-88b9-46d0-adf8-90072301374e-console-oauth-config\") pod \"console-f9d7485db-dkg9c\" (UID: 
\"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.595623 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-registration-dir\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.595698 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dptg\" (UniqueName: \"kubernetes.io/projected/84f1a57a-9a68-4f5a-a0df-3c2bda96d02f-kube-api-access-2dptg\") pod \"cluster-image-registry-operator-dc59b4c8b-r5lgx\" (UID: \"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.595738 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs569\" (UniqueName: \"kubernetes.io/projected/3b6c6217-8764-459d-b2a8-c99272221fb1-kube-api-access-zs569\") pod \"machine-config-controller-84d6567774-hg5f8\" (UID: \"3b6c6217-8764-459d-b2a8-c99272221fb1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.595758 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld88v\" (UniqueName: \"kubernetes.io/projected/0360c312-3ecf-42c9-9af9-470c231eefbd-kube-api-access-ld88v\") pod \"openshift-config-operator-7777fb866f-prb62\" (UID: \"0360c312-3ecf-42c9-9af9-470c231eefbd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.595888 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-console-config\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.607330 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3851471a-4968-40c6-9be1-9ba072ddf741-serving-cert\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.607366 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9dd58353-6603-48cf-9767-11235ba23164-ca-trust-extracted\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.607882 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-trusted-ca-bundle\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.608537 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b6c6217-8764-459d-b2a8-c99272221fb1-proxy-tls\") pod \"machine-config-controller-84d6567774-hg5f8\" (UID: \"3b6c6217-8764-459d-b2a8-c99272221fb1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.608548 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b6c6217-8764-459d-b2a8-c99272221fb1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hg5f8\" (UID: \"3b6c6217-8764-459d-b2a8-c99272221fb1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.609097 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84f1a57a-9a68-4f5a-a0df-3c2bda96d02f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-r5lgx\" (UID: \"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.609142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqkr5\" (UniqueName: \"kubernetes.io/projected/48539430-665f-4976-8ab6-7f06a26b9bde-kube-api-access-jqkr5\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.609162 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ff4177e-05ad-4a06-bf17-fa05ab567c5d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-trfn8\" (UID: \"4ff4177e-05ad-4a06-bf17-fa05ab567c5d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.609178 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3851471a-4968-40c6-9be1-9ba072ddf741-config\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 
crc kubenswrapper[4790]: I0406 11:58:54.609215 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9cpp\" (UniqueName: \"kubernetes.io/projected/e1aab57b-c8ed-492c-91d3-e190449713a0-kube-api-access-b9cpp\") pod \"cni-sysctl-allowlist-ds-lklpx\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.609494 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9dd58353-6603-48cf-9767-11235ba23164-trusted-ca\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.610055 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7z4x\" (UniqueName: \"kubernetes.io/projected/64798e8b-0f1d-48f9-ab27-369e6953a5cb-kube-api-access-v7z4x\") pod \"ingress-operator-5b745b69d9-f78dq\" (UID: \"64798e8b-0f1d-48f9-ab27-369e6953a5cb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.610106 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-registry-tls\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.610162 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b20da9-8cdd-4614-9c0a-9db7287856cd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\" (UID: \"f7b20da9-8cdd-4614-9c0a-9db7287856cd\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.610807 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18df9b4d-88b9-46d0-adf8-90072301374e-console-serving-cert\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.610864 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/48539430-665f-4976-8ab6-7f06a26b9bde-stats-auth\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.610873 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-socket-dir\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.611015 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwhch\" (UniqueName: \"kubernetes.io/projected/3851471a-4968-40c6-9be1-9ba072ddf741-kube-api-access-qwhch\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.611085 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/763471f0-0557-46ae-9eaf-e4bfd2b737ad-metrics-tls\") pod \"dns-default-fpjz5\" (UID: 
\"763471f0-0557-46ae-9eaf-e4bfd2b737ad\") " pod="openshift-dns/dns-default-fpjz5" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.611350 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3851471a-4968-40c6-9be1-9ba072ddf741-config\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.611360 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9dd58353-6603-48cf-9767-11235ba23164-installation-pull-secrets\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.611429 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-oauth-serving-cert\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.611486 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/e1aab57b-c8ed-492c-91d3-e190449713a0-ready\") pod \"cni-sysctl-allowlist-ds-lklpx\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.611585 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84f1a57a-9a68-4f5a-a0df-3c2bda96d02f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-r5lgx\" (UID: 
\"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.612055 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/618ebfda-2b5c-4918-97d0-56a8b37dda29-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gzwvz\" (UID: \"618ebfda-2b5c-4918-97d0-56a8b37dda29\") " pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.612136 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f44bc876-aaf0-4ef8-aec0-e7aed034f67f-srv-cert\") pod \"catalog-operator-68c6474976-2mbnf\" (UID: \"f44bc876-aaf0-4ef8-aec0-e7aed034f67f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.612189 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f44bc876-aaf0-4ef8-aec0-e7aed034f67f-profile-collector-cert\") pod \"catalog-operator-68c6474976-2mbnf\" (UID: \"f44bc876-aaf0-4ef8-aec0-e7aed034f67f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.612242 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/470e540b-4e4b-4064-a729-b5e608f1b394-proxy-tls\") pod \"machine-config-operator-74547568cd-sgt9w\" (UID: \"470e540b-4e4b-4064-a729-b5e608f1b394\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.612282 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-plugins-dir\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.612350 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/470e540b-4e4b-4064-a729-b5e608f1b394-images\") pod \"machine-config-operator-74547568cd-sgt9w\" (UID: \"470e540b-4e4b-4064-a729-b5e608f1b394\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.612413 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.612450 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/64798e8b-0f1d-48f9-ab27-369e6953a5cb-metrics-tls\") pod \"ingress-operator-5b745b69d9-f78dq\" (UID: \"64798e8b-0f1d-48f9-ab27-369e6953a5cb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.612500 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3851471a-4968-40c6-9be1-9ba072ddf741-etcd-client\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.612527 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7b20da9-8cdd-4614-9c0a-9db7287856cd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\" (UID: \"f7b20da9-8cdd-4614-9c0a-9db7287856cd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.612642 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64798e8b-0f1d-48f9-ab27-369e6953a5cb-trusted-ca\") pod \"ingress-operator-5b745b69d9-f78dq\" (UID: \"64798e8b-0f1d-48f9-ab27-369e6953a5cb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.612977 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48539430-665f-4976-8ab6-7f06a26b9bde-service-ca-bundle\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.613400 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/84f1a57a-9a68-4f5a-a0df-3c2bda96d02f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-r5lgx\" (UID: \"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.613898 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3851471a-4968-40c6-9be1-9ba072ddf741-etcd-ca\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.615147 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9dd58353-6603-48cf-9767-11235ba23164-registry-certificates\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.615275 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0566848-e645-46bd-8e5e-fddcde1248ba-trusted-ca\") pod \"console-operator-58897d9998-vh9v9\" (UID: \"f0566848-e645-46bd-8e5e-fddcde1248ba\") " pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.616577 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0566848-e645-46bd-8e5e-fddcde1248ba-serving-cert\") pod \"console-operator-58897d9998-vh9v9\" (UID: \"f0566848-e645-46bd-8e5e-fddcde1248ba\") " pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.619104 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-oauth-serving-cert\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.620634 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18df9b4d-88b9-46d0-adf8-90072301374e-console-serving-cert\") pod \"console-f9d7485db-dkg9c\" (UID: 
\"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.623430 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0360c312-3ecf-42c9-9af9-470c231eefbd-serving-cert\") pod \"openshift-config-operator-7777fb866f-prb62\" (UID: \"0360c312-3ecf-42c9-9af9-470c231eefbd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.623443 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84f1a57a-9a68-4f5a-a0df-3c2bda96d02f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-r5lgx\" (UID: \"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.623683 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ff4177e-05ad-4a06-bf17-fa05ab567c5d-srv-cert\") pod \"olm-operator-6b444d44fb-trfn8\" (UID: \"4ff4177e-05ad-4a06-bf17-fa05ab567c5d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.625119 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9dd58353-6603-48cf-9767-11235ba23164-installation-pull-secrets\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.625612 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/618ebfda-2b5c-4918-97d0-56a8b37dda29-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gzwvz\" (UID: \"618ebfda-2b5c-4918-97d0-56a8b37dda29\") " pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.625672 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/763471f0-0557-46ae-9eaf-e4bfd2b737ad-config-volume\") pod \"dns-default-fpjz5\" (UID: \"763471f0-0557-46ae-9eaf-e4bfd2b737ad\") " pod="openshift-dns/dns-default-fpjz5" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.625866 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" Apr 06 11:58:54 crc kubenswrapper[4790]: E0406 11:58:54.626160 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:55.126132103 +0000 UTC m=+114.113874969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.627028 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/470e540b-4e4b-4064-a729-b5e608f1b394-images\") pod \"machine-config-operator-74547568cd-sgt9w\" (UID: \"470e540b-4e4b-4064-a729-b5e608f1b394\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.630265 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48539430-665f-4976-8ab6-7f06a26b9bde-metrics-certs\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.630579 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18df9b4d-88b9-46d0-adf8-90072301374e-console-oauth-config\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.634848 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/618ebfda-2b5c-4918-97d0-56a8b37dda29-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gzwvz\" (UID: 
\"618ebfda-2b5c-4918-97d0-56a8b37dda29\") " pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.644524 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0360c312-3ecf-42c9-9af9-470c231eefbd-serving-cert\") pod \"openshift-config-operator-7777fb866f-prb62\" (UID: \"0360c312-3ecf-42c9-9af9-470c231eefbd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.645261 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/470e540b-4e4b-4064-a729-b5e608f1b394-proxy-tls\") pod \"machine-config-operator-74547568cd-sgt9w\" (UID: \"470e540b-4e4b-4064-a729-b5e608f1b394\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.645707 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f44bc876-aaf0-4ef8-aec0-e7aed034f67f-srv-cert\") pod \"catalog-operator-68c6474976-2mbnf\" (UID: \"f44bc876-aaf0-4ef8-aec0-e7aed034f67f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.646256 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3851471a-4968-40c6-9be1-9ba072ddf741-etcd-client\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.652546 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/64798e8b-0f1d-48f9-ab27-369e6953a5cb-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-f78dq\" (UID: \"64798e8b-0f1d-48f9-ab27-369e6953a5cb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.653005 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5njpf\" (UniqueName: \"kubernetes.io/projected/4ff4177e-05ad-4a06-bf17-fa05ab567c5d-kube-api-access-5njpf\") pod \"olm-operator-6b444d44fb-trfn8\" (UID: \"4ff4177e-05ad-4a06-bf17-fa05ab567c5d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.653909 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f44bc876-aaf0-4ef8-aec0-e7aed034f67f-profile-collector-cert\") pod \"catalog-operator-68c6474976-2mbnf\" (UID: \"f44bc876-aaf0-4ef8-aec0-e7aed034f67f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.659722 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88lnc\" (UniqueName: \"kubernetes.io/projected/18df9b4d-88b9-46d0-adf8-90072301374e-kube-api-access-88lnc\") pod \"console-f9d7485db-dkg9c\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.679962 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.680663 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89fs4\" (UniqueName: \"kubernetes.io/projected/f0566848-e645-46bd-8e5e-fddcde1248ba-kube-api-access-89fs4\") pod \"console-operator-58897d9998-vh9v9\" (UID: \"f0566848-e645-46bd-8e5e-fddcde1248ba\") " pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.700457 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5v6k\" (UniqueName: \"kubernetes.io/projected/f44bc876-aaf0-4ef8-aec0-e7aed034f67f-kube-api-access-b5v6k\") pod \"catalog-operator-68c6474976-2mbnf\" (UID: \"f44bc876-aaf0-4ef8-aec0-e7aed034f67f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.728307 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.728585 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-mountpoint-dir\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.728616 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-registration-dir\") pod 
\"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.728676 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9cpp\" (UniqueName: \"kubernetes.io/projected/e1aab57b-c8ed-492c-91d3-e190449713a0-kube-api-access-b9cpp\") pod \"cni-sysctl-allowlist-ds-lklpx\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.729033 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-socket-dir\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.729065 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/763471f0-0557-46ae-9eaf-e4bfd2b737ad-metrics-tls\") pod \"dns-default-fpjz5\" (UID: \"763471f0-0557-46ae-9eaf-e4bfd2b737ad\") " pod="openshift-dns/dns-default-fpjz5" Apr 06 11:58:54 crc kubenswrapper[4790]: E0406 11:58:54.729155 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:55.229130414 +0000 UTC m=+114.216873280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.729095 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/e1aab57b-c8ed-492c-91d3-e190449713a0-ready\") pod \"cni-sysctl-allowlist-ds-lklpx\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.729520 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-plugins-dir\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.729559 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.729580 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/763471f0-0557-46ae-9eaf-e4bfd2b737ad-config-volume\") pod \"dns-default-fpjz5\" (UID: \"763471f0-0557-46ae-9eaf-e4bfd2b737ad\") " 
pod="openshift-dns/dns-default-fpjz5" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.729603 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-csi-data-dir\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.729681 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dflnl\" (UniqueName: \"kubernetes.io/projected/418e1d4b-ebc4-48ff-89f1-e52375374d63-kube-api-access-dflnl\") pod \"ingress-canary-9ff6v\" (UID: \"418e1d4b-ebc4-48ff-89f1-e52375374d63\") " pod="openshift-ingress-canary/ingress-canary-9ff6v" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.729706 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/418e1d4b-ebc4-48ff-89f1-e52375374d63-cert\") pod \"ingress-canary-9ff6v\" (UID: \"418e1d4b-ebc4-48ff-89f1-e52375374d63\") " pod="openshift-ingress-canary/ingress-canary-9ff6v" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.730022 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e1aab57b-c8ed-492c-91d3-e190449713a0-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-lklpx\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.730049 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv7fh\" (UniqueName: \"kubernetes.io/projected/b8d6da06-7170-416d-8a93-b3f9fa890fb1-kube-api-access-fv7fh\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" 
Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.730066 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e1aab57b-c8ed-492c-91d3-e190449713a0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lklpx\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.730096 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxtkm\" (UniqueName: \"kubernetes.io/projected/763471f0-0557-46ae-9eaf-e4bfd2b737ad-kube-api-access-dxtkm\") pod \"dns-default-fpjz5\" (UID: \"763471f0-0557-46ae-9eaf-e4bfd2b737ad\") " pod="openshift-dns/dns-default-fpjz5" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.730221 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sl7ql"] Apr 06 11:58:54 crc kubenswrapper[4790]: E0406 11:58:54.730517 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:55.230502398 +0000 UTC m=+114.218245264 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.730578 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-mountpoint-dir\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.730756 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-registration-dir\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.731026 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-socket-dir\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.731075 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/763471f0-0557-46ae-9eaf-e4bfd2b737ad-config-volume\") pod \"dns-default-fpjz5\" (UID: \"763471f0-0557-46ae-9eaf-e4bfd2b737ad\") " pod="openshift-dns/dns-default-fpjz5" Apr 06 11:58:54 crc 
kubenswrapper[4790]: I0406 11:58:54.731399 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-plugins-dir\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.731630 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e1aab57b-c8ed-492c-91d3-e190449713a0-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-lklpx\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.732454 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64798e8b-0f1d-48f9-ab27-369e6953a5cb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f78dq\" (UID: \"64798e8b-0f1d-48f9-ab27-369e6953a5cb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.732512 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b8d6da06-7170-416d-8a93-b3f9fa890fb1-csi-data-dir\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.731817 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e1aab57b-c8ed-492c-91d3-e190449713a0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lklpx\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 
11:58:54.734950 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/e1aab57b-c8ed-492c-91d3-e190449713a0-ready\") pod \"cni-sysctl-allowlist-ds-lklpx\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.735369 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ql4f9"] Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.737892 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/763471f0-0557-46ae-9eaf-e4bfd2b737ad-metrics-tls\") pod \"dns-default-fpjz5\" (UID: \"763471f0-0557-46ae-9eaf-e4bfd2b737ad\") " pod="openshift-dns/dns-default-fpjz5" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.738646 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/418e1d4b-ebc4-48ff-89f1-e52375374d63-cert\") pod \"ingress-canary-9ff6v\" (UID: \"418e1d4b-ebc4-48ff-89f1-e52375374d63\") " pod="openshift-ingress-canary/ingress-canary-9ff6v" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.757694 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75988\" (UniqueName: \"kubernetes.io/projected/618ebfda-2b5c-4918-97d0-56a8b37dda29-kube-api-access-75988\") pod \"marketplace-operator-79b997595-gzwvz\" (UID: \"618ebfda-2b5c-4918-97d0-56a8b37dda29\") " pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.758575 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxxpq\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-kube-api-access-nxxpq\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.765509 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk"] Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.782537 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dhzn\" (UniqueName: \"kubernetes.io/projected/470e540b-4e4b-4064-a729-b5e608f1b394-kube-api-access-9dhzn\") pod \"machine-config-operator-74547568cd-sgt9w\" (UID: \"470e540b-4e4b-4064-a729-b5e608f1b394\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.788924 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.795757 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg"] Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.796107 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.804533 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.807437 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k"] Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.815846 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dptg\" (UniqueName: \"kubernetes.io/projected/84f1a57a-9a68-4f5a-a0df-3c2bda96d02f-kube-api-access-2dptg\") pod \"cluster-image-registry-operator-dc59b4c8b-r5lgx\" (UID: \"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.819597 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th"] Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.821334 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs569\" (UniqueName: \"kubernetes.io/projected/3b6c6217-8764-459d-b2a8-c99272221fb1-kube-api-access-zs569\") pod \"machine-config-controller-84d6567774-hg5f8\" (UID: \"3b6c6217-8764-459d-b2a8-c99272221fb1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.831878 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:54 crc kubenswrapper[4790]: E0406 11:58:54.832051 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:55.332026332 +0000 UTC m=+114.319769198 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.832104 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: E0406 11:58:54.832433 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:55.332426202 +0000 UTC m=+114.320169068 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.844740 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld88v\" (UniqueName: \"kubernetes.io/projected/0360c312-3ecf-42c9-9af9-470c231eefbd-kube-api-access-ld88v\") pod \"openshift-config-operator-7777fb866f-prb62\" (UID: \"0360c312-3ecf-42c9-9af9-470c231eefbd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.885112 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-742t2"] Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.885366 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.894660 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb"] Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.903419 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84f1a57a-9a68-4f5a-a0df-3c2bda96d02f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-r5lgx\" (UID: \"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.904662 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqkr5\" (UniqueName: \"kubernetes.io/projected/48539430-665f-4976-8ab6-7f06a26b9bde-kube-api-access-jqkr5\") pod \"router-default-5444994796-895qp\" (UID: \"48539430-665f-4976-8ab6-7f06a26b9bde\") " pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.904778 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z"] Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.914180 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s"] Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.914743 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7z4x\" (UniqueName: \"kubernetes.io/projected/64798e8b-0f1d-48f9-ab27-369e6953a5cb-kube-api-access-v7z4x\") pod \"ingress-operator-5b745b69d9-f78dq\" (UID: \"64798e8b-0f1d-48f9-ab27-369e6953a5cb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" Apr 06 11:58:54 crc 
kubenswrapper[4790]: I0406 11:58:54.922790 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-bound-sa-token\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.924482 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.932794 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:54 crc kubenswrapper[4790]: E0406 11:58:54.933371 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:55.433355152 +0000 UTC m=+114.421098018 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.934188 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwhch\" (UniqueName: \"kubernetes.io/projected/3851471a-4968-40c6-9be1-9ba072ddf741-kube-api-access-qwhch\") pod \"etcd-operator-b45778765-499ss\" (UID: \"3851471a-4968-40c6-9be1-9ba072ddf741\") " pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.957338 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7b20da9-8cdd-4614-9c0a-9db7287856cd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\" (UID: \"f7b20da9-8cdd-4614-9c0a-9db7287856cd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" Apr 06 11:58:54 crc kubenswrapper[4790]: W0406 11:58:54.972301 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfec50d0_fe3d_45f4_afdb_cba2fc415878.slice/crio-62d42f462c53b266747f2919600c70c48599c742eb193dbf5e5e4d60e6431358 WatchSource:0}: Error finding container 62d42f462c53b266747f2919600c70c48599c742eb193dbf5e5e4d60e6431358: Status 404 returned error can't find the container with id 62d42f462c53b266747f2919600c70c48599c742eb193dbf5e5e4d60e6431358 Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.972717 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" Apr 06 11:58:54 crc kubenswrapper[4790]: W0406 11:58:54.973191 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d787bca_06ac_4e8b_9797_6b25ebbcc706.slice/crio-c9d79470a31d0d57f623954b554b8a0221bbd507376cc5f6e893dd897421b79b WatchSource:0}: Error finding container c9d79470a31d0d57f623954b554b8a0221bbd507376cc5f6e893dd897421b79b: Status 404 returned error can't find the container with id c9d79470a31d0d57f623954b554b8a0221bbd507376cc5f6e893dd897421b79b Apr 06 11:58:54 crc kubenswrapper[4790]: I0406 11:58:54.976733 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drbtr\" (UniqueName: \"kubernetes.io/projected/dbfe58a1-c1ad-4719-9676-e6dbaca5c530-kube-api-access-drbtr\") pod \"migrator-59844c95c7-rglnr\" (UID: \"dbfe58a1-c1ad-4719-9676-e6dbaca5c530\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rglnr" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.002979 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.022943 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxtkm\" (UniqueName: \"kubernetes.io/projected/763471f0-0557-46ae-9eaf-e4bfd2b737ad-kube-api-access-dxtkm\") pod \"dns-default-fpjz5\" (UID: \"763471f0-0557-46ae-9eaf-e4bfd2b737ad\") " pod="openshift-dns/dns-default-fpjz5" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.034121 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:55 crc kubenswrapper[4790]: E0406 11:58:55.034492 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:55.534473846 +0000 UTC m=+114.522216782 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.038422 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9cpp\" (UniqueName: \"kubernetes.io/projected/e1aab57b-c8ed-492c-91d3-e190449713a0-kube-api-access-b9cpp\") pod \"cni-sysctl-allowlist-ds-lklpx\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.056099 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rglnr" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.061192 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv7fh\" (UniqueName: \"kubernetes.io/projected/b8d6da06-7170-416d-8a93-b3f9fa890fb1-kube-api-access-fv7fh\") pod \"csi-hostpathplugin-gz7ps\" (UID: \"b8d6da06-7170-416d-8a93-b3f9fa890fb1\") " pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.065602 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.071217 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.075489 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dflnl\" (UniqueName: \"kubernetes.io/projected/418e1d4b-ebc4-48ff-89f1-e52375374d63-kube-api-access-dflnl\") pod \"ingress-canary-9ff6v\" (UID: \"418e1d4b-ebc4-48ff-89f1-e52375374d63\") " pod="openshift-ingress-canary/ingress-canary-9ff6v" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.087999 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.117133 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.134869 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:55 crc kubenswrapper[4790]: E0406 11:58:55.135391 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:55.635370045 +0000 UTC m=+114.623112911 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.166636 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.229948 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" event={"ID":"bfec50d0-fe3d-45f4-afdb-cba2fc415878","Type":"ContainerStarted","Data":"62d42f462c53b266747f2919600c70c48599c742eb193dbf5e5e4d60e6431358"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.231683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" event={"ID":"a88e6363-c095-4450-98d2-18808abd10a9","Type":"ContainerStarted","Data":"49d98eb19647a73bb92fec830285eabd6df15d4d467087dc1542a8f7527c9541"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.236735 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.239055 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" 
event={"ID":"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60","Type":"ContainerStarted","Data":"47451d81d2dc2a20b88391452e297a8580ac5fa9e63b1aabf93562d4c91c653c"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.239100 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" event={"ID":"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60","Type":"ContainerStarted","Data":"3b2c84778aecb70b5372eef1fd0a7e71e7c660c0d6c70e363d69aae11d82033b"} Apr 06 11:58:55 crc kubenswrapper[4790]: E0406 11:58:55.239070 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:55.739055534 +0000 UTC m=+114.726798390 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.239569 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.239883 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.247783 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9ff6v" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.248235 4790 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-x9f69 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.248268 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" podUID="0967d5bd-4fa1-4e9c-a58a-1cd171f56b60" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.256811 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" event={"ID":"4d787bca-06ac-4e8b-9797-6b25ebbcc706","Type":"ContainerStarted","Data":"c9d79470a31d0d57f623954b554b8a0221bbd507376cc5f6e893dd897421b79b"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.261403 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" event={"ID":"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e","Type":"ContainerStarted","Data":"f25542d9f2b7daf889f86ce61072bc40c3f8dcfc58f4fbd8cc4c3e47615fa74e"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.261457 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" event={"ID":"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e","Type":"ContainerStarted","Data":"41aa78a304860e20bf279274b4124ebda9f3583b3b200928f612777c124bffd4"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.261467 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" event={"ID":"5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e","Type":"ContainerStarted","Data":"85c395ab5aa565fe40a351f08021c8b3c5d9d955368fd6d088c33eed61f8c2b7"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.267898 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rm8wl" event={"ID":"b4a6f2c5-27d9-42d3-8856-3517265316cf","Type":"ContainerStarted","Data":"6cf5629cf2c31843e5b16c668bc367dde679eda9a497c67ecbb689e77cf83df8"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.267939 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rm8wl" event={"ID":"b4a6f2c5-27d9-42d3-8856-3517265316cf","Type":"ContainerStarted","Data":"6efa925bca0687d5c6ebf5960eec83ae68eb9de6a28e3e2acbf53d31f14b9c68"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.269570 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.273066 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" event={"ID":"743439e2-53c8-4cda-b960-2448c1fb2941","Type":"ContainerStarted","Data":"d6ac100f727ad1d086c152129f22fe1e624d1d83773d76a2ad9aefe952bb9776"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.278281 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fpjz5" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.294093 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s" event={"ID":"47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70","Type":"ContainerStarted","Data":"6447225af5760236c1d3ed37816250c8d21c109f8ba75369c5e8cb85544aa72d"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.317213 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zh6m9" event={"ID":"392745ae-c47f-49a2-a1fc-33e160f86b8f","Type":"ContainerStarted","Data":"2aced024e3f58424c1b4a89e26e1ec85009f0858d06c737414f5e7f5f83e22ad"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.317545 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zh6m9" event={"ID":"392745ae-c47f-49a2-a1fc-33e160f86b8f","Type":"ContainerStarted","Data":"fcd7f111835f8730d33362976120b1f963fc8fb9f3f69703325487bba6eed792"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.321476 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ql4f9" event={"ID":"dc2f1a83-c329-47b9-98d3-08104ce2323c","Type":"ContainerStarted","Data":"45b9d7c9cc45e604bd2351af713621a3b58068230d36de448d0c51fa05dd3fb0"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.339357 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:55 crc kubenswrapper[4790]: E0406 11:58:55.339939 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:55.839894781 +0000 UTC m=+114.827637647 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.340245 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:55 crc kubenswrapper[4790]: E0406 11:58:55.341200 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:55.841192353 +0000 UTC m=+114.828935219 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.343565 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" event={"ID":"4610d751-50bd-42d4-a947-1c494bcb4096","Type":"ContainerStarted","Data":"331d5142c829ac03b3d4ad11b8d556154b95035abe55739872c38fc22dba0c53"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.343625 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" event={"ID":"4610d751-50bd-42d4-a947-1c494bcb4096","Type":"ContainerStarted","Data":"320302ab91ca339776f071b8b1d515565ff938e4f78e807de094324803d6b283"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.345134 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.347046 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" event={"ID":"426ca9b3-5c58-4b68-a29e-207207bf6897","Type":"ContainerStarted","Data":"a75482bd7d041ff41ef3239e83873f0fa38bb9693146d23461f1092c8b29a530"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.350637 4790 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-75kvr container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Apr 06 
11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.350691 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" podUID="4610d751-50bd-42d4-a947-1c494bcb4096" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.354388 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" event={"ID":"e42a375c-23c0-471a-8a17-20a03aabc3d4","Type":"ContainerStarted","Data":"9a8fc9c5deb21a6a25d87a8216ea9c1683200c00e3a703891b57a07f449357bc"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.365036 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" event={"ID":"3c0a5144-8147-439e-a8ab-38fa28c0d96c","Type":"ContainerStarted","Data":"6eb95479f96c69c36af33fd7a3545666bc439e05068b6738a61f299545090687"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.365079 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" event={"ID":"3c0a5144-8147-439e-a8ab-38fa28c0d96c","Type":"ContainerStarted","Data":"78fb9e82d522c6c27cee556c934b6c9602eaec2703412f052b0596e893352c6b"} Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.373234 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6"] Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.375308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" event={"ID":"53544b57-db90-4281-a2ee-ba4ceceb1605","Type":"ContainerStarted","Data":"4481bf919e5353bb48bbcd4bd929dfec2012c92a49c6af2c5ebad00cf5177874"} 
Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.398072 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc"] Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.406935 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82"] Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.443495 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:55 crc kubenswrapper[4790]: E0406 11:58:55.446699 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:55.946682617 +0000 UTC m=+114.934425483 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.446806 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd"] Apr 06 11:58:55 crc kubenswrapper[4790]: W0406 11:58:55.459994 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb424d82_17ec_4515_903c_96156334ca08.slice/crio-a1c968b9599cb42680bd83cd894d4fb2eda912d99cf19ac66d29031e15646e40 WatchSource:0}: Error finding container a1c968b9599cb42680bd83cd894d4fb2eda912d99cf19ac66d29031e15646e40: Status 404 returned error can't find the container with id a1c968b9599cb42680bd83cd894d4fb2eda912d99cf19ac66d29031e15646e40 Apr 06 11:58:55 crc kubenswrapper[4790]: W0406 11:58:55.467845 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3adfac4_b370_4f06_a19e_640252120515.slice/crio-d02a66ab5bbec812e6f2694890152d7144704231d140953efc16e440461b36b9 WatchSource:0}: Error finding container d02a66ab5bbec812e6f2694890152d7144704231d140953efc16e440461b36b9: Status 404 returned error can't find the container with id d02a66ab5bbec812e6f2694890152d7144704231d140953efc16e440461b36b9 Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.546374 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.546810 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs\") pod \"network-metrics-daemon-qkf8s\" (UID: \"934f4d5f-3670-40da-b496-8b9f9f25fc0b\") " pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:55 crc kubenswrapper[4790]: E0406 11:58:55.551872 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:56.050404586 +0000 UTC m=+115.038147452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.570356 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/934f4d5f-3670-40da-b496-8b9f9f25fc0b-metrics-certs\") pod \"network-metrics-daemon-qkf8s\" (UID: \"934f4d5f-3670-40da-b496-8b9f9f25fc0b\") " pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.570922 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dkg9c"] Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.571843 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x74dz"] Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.592103 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz"] Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.656798 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qkf8s" Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.657226 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:55 crc kubenswrapper[4790]: E0406 11:58:55.657572 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:56.15755753 +0000 UTC m=+115.145300396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.672767 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kp5kl"] Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.741680 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ggnlt"] Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.746220 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w"] Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.760782 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8"] Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.762419 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:55 crc kubenswrapper[4790]: E0406 11:58:55.762863 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:56.262850339 +0000 UTC m=+115.250593205 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.773239 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf"] Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.779936 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-prb62"] Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.785141 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gzwvz"] Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.841603 4790 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vh9v9"] Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.863848 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:55 crc kubenswrapper[4790]: E0406 11:58:55.864613 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:56.364586558 +0000 UTC m=+115.352329424 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:55 crc kubenswrapper[4790]: I0406 11:58:55.969389 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:55 crc kubenswrapper[4790]: E0406 11:58:55.969808 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:56.469788715 +0000 UTC m=+115.457531581 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.072269 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:56 crc kubenswrapper[4790]: E0406 11:58:56.073656 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:56.573635137 +0000 UTC m=+115.561378003 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.139110 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gz7ps"] Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.176441 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:56 crc kubenswrapper[4790]: E0406 11:58:56.177184 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:56.677157841 +0000 UTC m=+115.664900707 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.178999 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8"] Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.278740 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:56 crc kubenswrapper[4790]: E0406 11:58:56.279254 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:56.779183118 +0000 UTC m=+115.766925994 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.280534 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:56 crc kubenswrapper[4790]: E0406 11:58:56.281428 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:56.781415624 +0000 UTC m=+115.769158480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.366975 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9ff6v"] Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.381347 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:56 crc kubenswrapper[4790]: E0406 11:58:56.381944 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:56.881923733 +0000 UTC m=+115.869666599 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.430121 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fpjz5"] Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.454931 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-499ss"] Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.455817 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq"] Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.505240 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq"] Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.506787 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:56 crc kubenswrapper[4790]: E0406 11:58:56.507395 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:57.007383445 +0000 UTC m=+115.995126311 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.550217 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" event={"ID":"1d3bc18a-ca66-44f2-9667-48dd85b638fe","Type":"ContainerStarted","Data":"a6a9499a01d0367ed38e9e16a47cf009c6e22e11dc422d589c784b2b0e0bcfe5"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.554207 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" event={"ID":"0360c312-3ecf-42c9-9af9-470c231eefbd","Type":"ContainerStarted","Data":"cff352a57e5aa108d7a0e68c64f2f14347b9adb3f81515a1daeec96ed205e9ef"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.588322 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rglnr"] Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.588376 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx"] Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.596586 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" event={"ID":"bfec50d0-fe3d-45f4-afdb-cba2fc415878","Type":"ContainerStarted","Data":"1a02312babffbb9df18a9b40a384fba8d3e07f09e1e96ba9026d32410ce19dc8"} Apr 06 11:58:56 crc kubenswrapper[4790]: W0406 11:58:56.604103 4790 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod763471f0_0557_46ae_9eaf_e4bfd2b737ad.slice/crio-6520c7e2551ccbc3e6534f0ab0d05b6beccaf6edc36376a272f7e8e2fd4a0b43 WatchSource:0}: Error finding container 6520c7e2551ccbc3e6534f0ab0d05b6beccaf6edc36376a272f7e8e2fd4a0b43: Status 404 returned error can't find the container with id 6520c7e2551ccbc3e6534f0ab0d05b6beccaf6edc36376a272f7e8e2fd4a0b43 Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.606039 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" event={"ID":"4d787bca-06ac-4e8b-9797-6b25ebbcc706","Type":"ContainerStarted","Data":"f6b6519b3d576b511eb396de9952044242ecd524164b0455623a7bc0b6f3a576"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.607243 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qkf8s"] Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.607273 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-zh6m9" podStartSLOduration=5.607262758 podStartE2EDuration="5.607262758s" podCreationTimestamp="2026-04-06 11:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:56.542249945 +0000 UTC m=+115.529992811" watchObservedRunningTime="2026-04-06 11:58:56.607262758 +0000 UTC m=+115.595005624" Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.607771 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:56 crc kubenswrapper[4790]: E0406 
11:58:56.608142 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:57.1081291 +0000 UTC m=+116.095871966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.612344 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" podStartSLOduration=80.612326005 podStartE2EDuration="1m20.612326005s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:56.609016802 +0000 UTC m=+115.596759678" watchObservedRunningTime="2026-04-06 11:58:56.612326005 +0000 UTC m=+115.600068871" Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.615109 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" event={"ID":"fec884c6-276b-4230-a41c-3375cbc2104b","Type":"ContainerStarted","Data":"8f882f6e5adb97e3cbbbe83b4591652f98ebe1446d7621c5f10e6961b0596470"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.626638 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" 
event={"ID":"f44bc876-aaf0-4ef8-aec0-e7aed034f67f","Type":"ContainerStarted","Data":"bd4fe5665779b3f5562419b144ef62183fdb7ff781225cb7b5fc1a48ab855306"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.635583 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ggnlt" event={"ID":"e14e3a43-8228-4129-a69e-0ecfa3a7114c","Type":"ContainerStarted","Data":"5c017250a30c3fc447745c55985591efae0a48f69cf5f919cae30714c5700040"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.648473 4790 generic.go:334] "Generic (PLEG): container finished" podID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerID="8ea86e1d02f1d364821661babc9bb578e429bc2a7872eed0551e5e48ee01c6fa" exitCode=0 Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.648564 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" event={"ID":"426ca9b3-5c58-4b68-a29e-207207bf6897","Type":"ContainerDied","Data":"8ea86e1d02f1d364821661babc9bb578e429bc2a7872eed0551e5e48ee01c6fa"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.658952 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-fr5xt" podStartSLOduration=80.658932428 podStartE2EDuration="1m20.658932428s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:56.658475997 +0000 UTC m=+115.646218863" watchObservedRunningTime="2026-04-06 11:58:56.658932428 +0000 UTC m=+115.646675294" Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.659601 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" podStartSLOduration=80.659595535 podStartE2EDuration="1m20.659595535s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:56.63255833 +0000 UTC m=+115.620301196" watchObservedRunningTime="2026-04-06 11:58:56.659595535 +0000 UTC m=+115.647338401" Apr 06 11:58:56 crc kubenswrapper[4790]: W0406 11:58:56.680161 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84f1a57a_9a68_4f5a_a0df_3c2bda96d02f.slice/crio-1b3a31a47eee24c65f278e620196ca9469f11bee0b0858c4e49deff69c8a28b3 WatchSource:0}: Error finding container 1b3a31a47eee24c65f278e620196ca9469f11bee0b0858c4e49deff69c8a28b3: Status 404 returned error can't find the container with id 1b3a31a47eee24c65f278e620196ca9469f11bee0b0858c4e49deff69c8a28b3 Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.709680 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:56 crc kubenswrapper[4790]: E0406 11:58:56.711524 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:57.211505261 +0000 UTC m=+116.199248117 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.734656 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" event={"ID":"6e5a800d-f198-4dbe-8c7b-a84e6c130041","Type":"ContainerStarted","Data":"021ccd9cef3460f403934bc81099fd3d5f5ddd2d4a754e3ae24e44c78818559c"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.734704 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" event={"ID":"6e5a800d-f198-4dbe-8c7b-a84e6c130041","Type":"ContainerStarted","Data":"22af0b9f199e1dffbdf80b9a34a5059c5c55578551447c69449e8ccf8e3651b8"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.759588 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" podStartSLOduration=80.758571156 podStartE2EDuration="1m20.758571156s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:56.75234688 +0000 UTC m=+115.740089746" watchObservedRunningTime="2026-04-06 11:58:56.758571156 +0000 UTC m=+115.746314022" Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.765521 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" 
event={"ID":"4ff4177e-05ad-4a06-bf17-fa05ab567c5d","Type":"ContainerStarted","Data":"b769f04266b6b0de680f9e7fd3c91a9f55fa35eb4f259e6062b7172276268934"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.787892 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" podStartSLOduration=80.787767855 podStartE2EDuration="1m20.787767855s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:56.786324859 +0000 UTC m=+115.774067725" watchObservedRunningTime="2026-04-06 11:58:56.787767855 +0000 UTC m=+115.775510721" Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.797507 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x74dz" event={"ID":"35f420a5-ec2f-4d37-94ea-af000df33824","Type":"ContainerStarted","Data":"ca2a0e087ece1b0b231800471247b3202b19abda1af5b1228ea9abc06f37fbe1"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.811629 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:56 crc kubenswrapper[4790]: E0406 11:58:56.812187 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:57.312162243 +0000 UTC m=+116.299905109 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.812884 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:56 crc kubenswrapper[4790]: E0406 11:58:56.816060 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:57.316050071 +0000 UTC m=+116.303792937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.855750 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" event={"ID":"3c0a5144-8147-439e-a8ab-38fa28c0d96c","Type":"ContainerStarted","Data":"91878da35d9c0591b584f5365cf348f4d4bd1df1e3932b74574847cd083756db"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.901051 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s" event={"ID":"47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70","Type":"ContainerStarted","Data":"ae8925ba11840832d515298fa8d975f3cf33331ab138a58284343bb2890e501f"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.902126 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s" Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.913795 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" event={"ID":"743439e2-53c8-4cda-b960-2448c1fb2941","Type":"ContainerStarted","Data":"592ebd557120949c2f5a37455001497e7b579a32a7bfcce6df3f88469210b4c7"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.915129 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:56 crc kubenswrapper[4790]: E0406 11:58:56.915761 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:57.415727299 +0000 UTC m=+116.403470165 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.924027 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dkg9c" event={"ID":"18df9b4d-88b9-46d0-adf8-90072301374e","Type":"ContainerStarted","Data":"11b5938ae906f39a81f798060020ac636b9cb07526c4328d7f11f2eff7143e30"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.924143 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dkg9c" event={"ID":"18df9b4d-88b9-46d0-adf8-90072301374e","Type":"ContainerStarted","Data":"8ed42ba749ae266af6dfa9c8f417d83551d52c2746fe68c07a9503f6d99caa91"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.927389 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" event={"ID":"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7","Type":"ContainerStarted","Data":"e7b1b8d395f94018f9b8fc8c41b587923e6d8fbbe150c8dc432128012d3c0793"} 
Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.937581 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" event={"ID":"470e540b-4e4b-4064-a729-b5e608f1b394","Type":"ContainerStarted","Data":"a1c191839ca5f3424e1da45b9671e7bfcef6ece541b3704c40619e5ffe04e8ae"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.938492 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s" podStartSLOduration=79.938475707 podStartE2EDuration="1m19.938475707s" podCreationTimestamp="2026-04-06 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:56.936384105 +0000 UTC m=+115.924126971" watchObservedRunningTime="2026-04-06 11:58:56.938475707 +0000 UTC m=+115.926218573" Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.938636 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-69lmn" podStartSLOduration=80.938630441 podStartE2EDuration="1m20.938630441s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:56.887590677 +0000 UTC m=+115.875333563" watchObservedRunningTime="2026-04-06 11:58:56.938630441 +0000 UTC m=+115.926373307" Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.939037 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" event={"ID":"53544b57-db90-4281-a2ee-ba4ceceb1605","Type":"ContainerStarted","Data":"56fb40338a42246d5e5f1afdd4d6f390cd9face848bc7a3f1b47fd6b231dc778"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.940178 4790 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.956210 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.964425 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" event={"ID":"e1aab57b-c8ed-492c-91d3-e190449713a0","Type":"ContainerStarted","Data":"3d2db21fe60acc0e96bf589a8acccac243748b164ea844ea52f96c04f04d7191"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.966130 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.979317 4790 generic.go:334] "Generic (PLEG): container finished" podID="a88e6363-c095-4450-98d2-18808abd10a9" containerID="a1ab489738c5a7edef40b31259c01fa72d28831bd41f53ce1250b421d6e44cb9" exitCode=0 Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.979420 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" event={"ID":"a88e6363-c095-4450-98d2-18808abd10a9","Type":"ContainerDied","Data":"a1ab489738c5a7edef40b31259c01fa72d28831bd41f53ce1250b421d6e44cb9"} Apr 06 11:58:56 crc kubenswrapper[4790]: I0406 11:58:56.988543 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" event={"ID":"f0566848-e645-46bd-8e5e-fddcde1248ba","Type":"ContainerStarted","Data":"841a2ad76497a97c141baf4840b01f491b4479a22b7020d4685f5da3d640f0d2"} Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.009463 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" 
event={"ID":"4029e155-0c45-49cb-a25b-ddb1f768a88f","Type":"ContainerStarted","Data":"9f093c2a85a04f521c2a877046fcf50e1f5c8e8206e82076325a05516fba14aa"} Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.018655 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:57 crc kubenswrapper[4790]: E0406 11:58:57.018958 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:57.518947826 +0000 UTC m=+116.506690692 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.019642 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" podStartSLOduration=81.019618012 podStartE2EDuration="1m21.019618012s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:57.017463309 +0000 UTC m=+116.005206175" watchObservedRunningTime="2026-04-06 
11:58:57.019618012 +0000 UTC m=+116.007360878" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.050731 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ql4f9" event={"ID":"dc2f1a83-c329-47b9-98d3-08104ce2323c","Type":"ContainerStarted","Data":"326abde093580810f1307db8fd74c6a803cde3424f0a9d4e8da1e92b3c4990d2"} Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.053868 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ql4f9" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.064098 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" event={"ID":"e42a375c-23c0-471a-8a17-20a03aabc3d4","Type":"ContainerStarted","Data":"0ee97cc20094c81566863c1573165cf9e5c2c8f28993300c1653c2761c87dd9a"} Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.076328 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-ql4f9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.076371 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ql4f9" podUID="dc2f1a83-c329-47b9-98d3-08104ce2323c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.115662 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" event={"ID":"618ebfda-2b5c-4918-97d0-56a8b37dda29","Type":"ContainerStarted","Data":"1d4f74726b195718c1b77cd759f41d317a574d1d1c2da5f81921301f2e991ecd"} Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.119652 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-dkg9c" podStartSLOduration=81.119634699 podStartE2EDuration="1m21.119634699s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:57.117974678 +0000 UTC m=+116.105717544" watchObservedRunningTime="2026-04-06 11:58:57.119634699 +0000 UTC m=+116.107377565" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.120048 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:57 crc kubenswrapper[4790]: E0406 11:58:57.121307 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:57.62128809 +0000 UTC m=+116.609030966 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.209448 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" event={"ID":"eb424d82-17ec-4515-903c-96156334ca08","Type":"ContainerStarted","Data":"a6590d7613bda71a20ff4999cb28a7b04b38d65e19773b49346f5c4b457fccc8"} Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.209498 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" event={"ID":"eb424d82-17ec-4515-903c-96156334ca08","Type":"ContainerStarted","Data":"a1c968b9599cb42680bd83cd894d4fb2eda912d99cf19ac66d29031e15646e40"} Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.215690 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" event={"ID":"a3adfac4-b370-4f06-a19e-640252120515","Type":"ContainerStarted","Data":"d02a66ab5bbec812e6f2694890152d7144704231d140953efc16e440461b36b9"} Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.216571 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.221910 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:57 crc kubenswrapper[4790]: E0406 11:58:57.225231 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:57.725212835 +0000 UTC m=+116.712955701 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.229921 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-895qp" event={"ID":"48539430-665f-4976-8ab6-7f06a26b9bde","Type":"ContainerStarted","Data":"3c61cdeb83c2d12a4745160a71fa25e18c874ff6f6fbad77347c6ba47bd07297"} Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.242765 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" podStartSLOduration=80.242734092 podStartE2EDuration="1m20.242734092s" podCreationTimestamp="2026-04-06 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:57.240690551 +0000 UTC m=+116.228433417" watchObservedRunningTime="2026-04-06 11:58:57.242734092 +0000 UTC m=+116.230476958" Apr 06 
11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.244456 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" podStartSLOduration=80.244447385 podStartE2EDuration="1m20.244447385s" podCreationTimestamp="2026-04-06 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:57.172710114 +0000 UTC m=+116.160452980" watchObservedRunningTime="2026-04-06 11:58:57.244447385 +0000 UTC m=+116.232190251" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.250141 4790 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-gq5tc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body= Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.250189 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" podUID="a3adfac4-b370-4f06-a19e-640252120515" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.253066 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.254840 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.320416 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" podStartSLOduration=80.320399611 
podStartE2EDuration="1m20.320399611s" podCreationTimestamp="2026-04-06 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:57.278116656 +0000 UTC m=+116.265859522" watchObservedRunningTime="2026-04-06 11:58:57.320399611 +0000 UTC m=+116.308142477" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.322793 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:57 crc kubenswrapper[4790]: E0406 11:58:57.324959 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:57.824938884 +0000 UTC m=+116.812681750 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.386265 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" podStartSLOduration=81.386245155 podStartE2EDuration="1m21.386245155s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:57.3247559 +0000 UTC m=+116.312498766" watchObservedRunningTime="2026-04-06 11:58:57.386245155 +0000 UTC m=+116.373988021" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.386863 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" podStartSLOduration=6.38685818 podStartE2EDuration="6.38685818s" podCreationTimestamp="2026-04-06 11:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:57.384911842 +0000 UTC m=+116.372654718" watchObservedRunningTime="2026-04-06 11:58:57.38685818 +0000 UTC m=+116.374601046" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.427348 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: 
\"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:57 crc kubenswrapper[4790]: E0406 11:58:57.427721 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:57.92770686 +0000 UTC m=+116.915449726 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.497980 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ql4f9" podStartSLOduration=81.497959774 podStartE2EDuration="1m21.497959774s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:57.458793996 +0000 UTC m=+116.446536862" watchObservedRunningTime="2026-04-06 11:58:57.497959774 +0000 UTC m=+116.485702640" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.528357 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:57 crc kubenswrapper[4790]: E0406 11:58:57.528763 4790 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:58.028747472 +0000 UTC m=+117.016490328 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.579049 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" podStartSLOduration=80.579028887 podStartE2EDuration="1m20.579028887s" podCreationTimestamp="2026-04-06 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:57.542154427 +0000 UTC m=+116.529897293" watchObservedRunningTime="2026-04-06 11:58:57.579028887 +0000 UTC m=+116.566771753" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.630710 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:57 crc kubenswrapper[4790]: E0406 11:58:57.631131 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:58.131115698 +0000 UTC m=+117.118858564 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.650456 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" podStartSLOduration=81.65043558 podStartE2EDuration="1m21.65043558s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:57.609384685 +0000 UTC m=+116.597127551" watchObservedRunningTime="2026-04-06 11:58:57.65043558 +0000 UTC m=+116.638178446" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.652469 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-895qp" podStartSLOduration=81.652460251 podStartE2EDuration="1m21.652460251s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:57.651988219 +0000 UTC m=+116.639731095" watchObservedRunningTime="2026-04-06 11:58:57.652460251 +0000 UTC m=+116.640203117" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.731661 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:57 crc kubenswrapper[4790]: E0406 11:58:57.732014 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:58.231997906 +0000 UTC m=+117.219740772 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.801266 4790 ???:1] "http: TLS handshake error from 192.168.126.11:38510: no serving certificate available for the kubelet" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.834691 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:57 crc kubenswrapper[4790]: E0406 11:58:57.835153 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:58.335138441 +0000 UTC m=+117.322881307 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.878281 4790 ???:1] "http: TLS handshake error from 192.168.126.11:38522: no serving certificate available for the kubelet" Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.938548 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:57 crc kubenswrapper[4790]: E0406 11:58:57.939010 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:58.438989644 +0000 UTC m=+117.426732500 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:57 crc kubenswrapper[4790]: I0406 11:58:57.970360 4790 ???:1] "http: TLS handshake error from 192.168.126.11:38530: no serving certificate available for the kubelet" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.042031 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:58 crc kubenswrapper[4790]: E0406 11:58:58.042549 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:58.542531598 +0000 UTC m=+117.530274464 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.090960 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.091868 4790 ???:1] "http: TLS handshake error from 192.168.126.11:38546: no serving certificate available for the kubelet" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.096148 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.096219 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.143737 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:58 crc kubenswrapper[4790]: E0406 11:58:58.144240 4790 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:58.644219017 +0000 UTC m=+117.631961883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.179655 4790 ???:1] "http: TLS handshake error from 192.168.126.11:38562: no serving certificate available for the kubelet" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.245668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" event={"ID":"f7b20da9-8cdd-4614-9c0a-9db7287856cd","Type":"ContainerStarted","Data":"1725ced72ea689dcc32f7f6df6cd63fd6ee20d43ab503dbe4f7f9a3e4ca02325"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.245716 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" event={"ID":"f7b20da9-8cdd-4614-9c0a-9db7287856cd","Type":"ContainerStarted","Data":"e6f2e4ed9bb5c0a3126085f5f6bb301b8e6db0997e1fc2b91088399a77dbbbd7"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.251530 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:58 crc kubenswrapper[4790]: E0406 11:58:58.252152 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:58.752139251 +0000 UTC m=+117.739882117 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.264659 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" event={"ID":"f44bc876-aaf0-4ef8-aec0-e7aed034f67f","Type":"ContainerStarted","Data":"a9f1c27785cbaef53d261be5911289181c616ccc5dd55f2d1c1527c8e5f618a4"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.265750 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.266973 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qkf8s" event={"ID":"934f4d5f-3670-40da-b496-8b9f9f25fc0b","Type":"ContainerStarted","Data":"8c2a715c6eefc4a995ae0fb8dfad5f8dd0275eb261f15ead74d329ba6ea0b5d8"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.267016 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-qkf8s" event={"ID":"934f4d5f-3670-40da-b496-8b9f9f25fc0b","Type":"ContainerStarted","Data":"c3051514c2259ceb66fbe98c39c081e4ae19f060f8934cca8d5326d1af327eb1"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.284111 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" event={"ID":"4ff4177e-05ad-4a06-bf17-fa05ab567c5d","Type":"ContainerStarted","Data":"ce242f36661abfcaa4a9d60fa2c3096d3a6c5c462ffa935d9643f62322c2ccc6"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.284715 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.285090 4790 ???:1] "http: TLS handshake error from 192.168.126.11:38568: no serving certificate available for the kubelet" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.285364 4790 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2mbnf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.285422 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" podUID="f44bc876-aaf0-4ef8-aec0-e7aed034f67f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.301175 4790 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-trfn8 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 
10.217.0.33:8443: connect: connection refused" start-of-body= Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.301244 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" podUID="4ff4177e-05ad-4a06-bf17-fa05ab567c5d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.302540 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" podStartSLOduration=81.302531079 podStartE2EDuration="1m21.302531079s" podCreationTimestamp="2026-04-06 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:58.302473858 +0000 UTC m=+117.290216734" watchObservedRunningTime="2026-04-06 11:58:58.302531079 +0000 UTC m=+117.290273945" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.304615 4790 generic.go:334] "Generic (PLEG): container finished" podID="0360c312-3ecf-42c9-9af9-470c231eefbd" containerID="945b743b8738b6c7464883963675e8c16d2e14f57a697eb6dd604430b7fed28c" exitCode=0 Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.305443 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" event={"ID":"0360c312-3ecf-42c9-9af9-470c231eefbd","Type":"ContainerDied","Data":"945b743b8738b6c7464883963675e8c16d2e14f57a697eb6dd604430b7fed28c"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.308223 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" 
event={"ID":"fec884c6-276b-4230-a41c-3375cbc2104b","Type":"ContainerStarted","Data":"8dcb032fb28181c105a1909b65757b322371120de591936c9a842adaae4cf508"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.312812 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9ff6v" event={"ID":"418e1d4b-ebc4-48ff-89f1-e52375374d63","Type":"ContainerStarted","Data":"4f3ca42ed4d3f53633130a952108e90a1c848fdb11444c19b3d98b96a15cf750"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.312894 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9ff6v" event={"ID":"418e1d4b-ebc4-48ff-89f1-e52375374d63","Type":"ContainerStarted","Data":"94fbf248b88c5f0384323eeb31a6020eb6ac8da33605ad1b7f450b22fa18e87f"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.321407 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" event={"ID":"618ebfda-2b5c-4918-97d0-56a8b37dda29","Type":"ContainerStarted","Data":"9be9af31af26552cc7ce1fb89500ce58525327d0a854d30edfed0d4bab1371ba"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.322417 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.324392 4790 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gzwvz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.324440 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" podUID="618ebfda-2b5c-4918-97d0-56a8b37dda29" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.338332 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" event={"ID":"6e5a800d-f198-4dbe-8c7b-a84e6c130041","Type":"ContainerStarted","Data":"670dc635f9b2cd9fedf51106a1b49b2e586a42feef4b281db3ee50e2c95f8844"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.351920 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" podStartSLOduration=81.351905972 podStartE2EDuration="1m21.351905972s" podCreationTimestamp="2026-04-06 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:58.349392199 +0000 UTC m=+117.337135075" watchObservedRunningTime="2026-04-06 11:58:58.351905972 +0000 UTC m=+117.339648838" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.352778 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:58 crc kubenswrapper[4790]: E0406 11:58:58.353944 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:58.853928522 +0000 UTC m=+117.841671388 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.367186 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rglnr" event={"ID":"dbfe58a1-c1ad-4719-9676-e6dbaca5c530","Type":"ContainerStarted","Data":"57ed78c7d6faae904444d200d5771c17fea902d42eee1d7156666dd0951bf8de"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.367237 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rglnr" event={"ID":"dbfe58a1-c1ad-4719-9676-e6dbaca5c530","Type":"ContainerStarted","Data":"889313c178e9cc340336e7d21a25fbb019b7ae4db3fd1abae4087d4229cfa815"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.385233 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" podStartSLOduration=82.385210683 podStartE2EDuration="1m22.385210683s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:58.380851174 +0000 UTC m=+117.368594040" watchObservedRunningTime="2026-04-06 11:58:58.385210683 +0000 UTC m=+117.372953549" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.387870 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9f69"] Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.388102 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" event={"ID":"4029e155-0c45-49cb-a25b-ddb1f768a88f","Type":"ContainerStarted","Data":"574bee7f89a434a796c3dc9fc8f439107867110b7ceb45c4918ca3fd11bee774"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.415622 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.416802 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" event={"ID":"e1aab57b-c8ed-492c-91d3-e190449713a0","Type":"ContainerStarted","Data":"93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.431502 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fpjz5" event={"ID":"763471f0-0557-46ae-9eaf-e4bfd2b737ad","Type":"ContainerStarted","Data":"6520c7e2551ccbc3e6534f0ab0d05b6beccaf6edc36376a272f7e8e2fd4a0b43"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.433807 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x74dz" event={"ID":"35f420a5-ec2f-4d37-94ea-af000df33824","Type":"ContainerStarted","Data":"59e933fe5c0dba46fc8bd1d9258ca453dd7527e43fd6f71d3310cae685c38b99"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.454081 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:58 crc kubenswrapper[4790]: E0406 11:58:58.457152 4790 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:58.957140109 +0000 UTC m=+117.944882975 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.466807 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th"] Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.475033 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" event={"ID":"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f","Type":"ContainerStarted","Data":"977375ef20f324452de6a467a015308bc2d8832c3550b837bf2b5b8afff55644"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.475088 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" event={"ID":"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f","Type":"ContainerStarted","Data":"1b3a31a47eee24c65f278e620196ca9469f11bee0b0858c4e49deff69c8a28b3"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.494031 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" event={"ID":"f0566848-e645-46bd-8e5e-fddcde1248ba","Type":"ContainerStarted","Data":"9bed722ea06b6b94cd6e52b7cc3fad9cba84bbca1ac99e9be694c0475736ca82"} Apr 06 11:58:58 crc kubenswrapper[4790]: 
I0406 11:58:58.494457 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.500504 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-vh9v9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/readyz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.500561 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/readyz\": dial tcp 10.217.0.41:8443: connect: connection refused" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.501810 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.505632 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s" event={"ID":"47d8d34a-e8af-4dcc-8e0f-bfc5fb139e70","Type":"ContainerStarted","Data":"c62de567605f006fd72d6d2632b3b6844b4454f10357f632faabb9ce2a1a2754"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.557143 4790 ???:1] "http: TLS handshake error from 192.168.126.11:58292: no serving certificate available for the kubelet" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.557512 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" event={"ID":"b8d6da06-7170-416d-8a93-b3f9fa890fb1","Type":"ContainerStarted","Data":"2cf679d471651fa71ef0e8c32307bf296df7ab92ef22394348cc5aa8acbea6a7"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 
11:58:58.558280 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:58 crc kubenswrapper[4790]: E0406 11:58:58.558441 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:59.058424797 +0000 UTC m=+118.046167663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.558658 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:58 crc kubenswrapper[4790]: E0406 11:58:58.559861 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-04-06 11:58:59.059847263 +0000 UTC m=+118.047590129 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.621338 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" event={"ID":"1d3bc18a-ca66-44f2-9667-48dd85b638fe","Type":"ContainerStarted","Data":"e9d3bbf46daddb764961894bf1bef4406a86ed3ae92c0b61a6505ec1cf2dbd96"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.637707 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-895qp" event={"ID":"48539430-665f-4976-8ab6-7f06a26b9bde","Type":"ContainerStarted","Data":"67c8e60bd4b0332c76808f7656f1b962928e1d17b7dfed9d369ee09bd2089f68"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.652640 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" event={"ID":"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7","Type":"ContainerStarted","Data":"187b5f046e3965145526abb279cc7307d17a5da8eff0be04cca2fd0e7af0df95"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.654843 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9ff6v" podStartSLOduration=7.654798503 podStartE2EDuration="7.654798503s" podCreationTimestamp="2026-04-06 11:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-06 11:58:58.500907961 +0000 UTC m=+117.488650837" watchObservedRunningTime="2026-04-06 11:58:58.654798503 +0000 UTC m=+117.642541369" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.655176 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" podStartSLOduration=82.655170832 podStartE2EDuration="1m22.655170832s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:58.65229814 +0000 UTC m=+117.640041016" watchObservedRunningTime="2026-04-06 11:58:58.655170832 +0000 UTC m=+117.642913698" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.671297 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" event={"ID":"470e540b-4e4b-4064-a729-b5e608f1b394","Type":"ContainerStarted","Data":"f1ad35c2b9fa5390adf92be0a3c47460bd68dbc458b79dcebd391eab7023a8ce"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.671344 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" event={"ID":"470e540b-4e4b-4064-a729-b5e608f1b394","Type":"ContainerStarted","Data":"e153944f447882bb5d421e094eb0a38e979a56e0b334bea867f79a25cb55bc7b"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.672443 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:58 crc kubenswrapper[4790]: E0406 11:58:58.672628 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:59.172607898 +0000 UTC m=+118.160350774 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.672735 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:58 crc kubenswrapper[4790]: E0406 11:58:58.673064 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:59.173055369 +0000 UTC m=+118.160798235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.685230 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" event={"ID":"426ca9b3-5c58-4b68-a29e-207207bf6897","Type":"ContainerStarted","Data":"4409923626e612ffc7b380fc6de4bd255a919209617bed342959488e801a3b61"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.708561 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8" event={"ID":"3b6c6217-8764-459d-b2a8-c99272221fb1","Type":"ContainerStarted","Data":"6eed76b227fefebcdc51c0a933f0ea3aa094ee7e5ae0270f53f76251d07b8b5c"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.708636 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8" event={"ID":"3b6c6217-8764-459d-b2a8-c99272221fb1","Type":"ContainerStarted","Data":"6222fdc0a2e8273cd716d3c47a6a6218612406d744fd432a50521540579c875b"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.728279 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" event={"ID":"a3adfac4-b370-4f06-a19e-640252120515","Type":"ContainerStarted","Data":"f36c19fc9802c49bd9d92ed7ce3b3104e9b393dc6f82aac281b31c247549a5a9"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.729096 4790 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-gq5tc container/packageserver 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body= Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.729192 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" podUID="a3adfac4-b370-4f06-a19e-640252120515" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.735653 4790 ???:1] "http: TLS handshake error from 192.168.126.11:58298: no serving certificate available for the kubelet" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.743053 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" event={"ID":"64798e8b-0f1d-48f9-ab27-369e6953a5cb","Type":"ContainerStarted","Data":"3c3093d318fb64c9955fa8bb2f5997cc3ebc944cbeb58c227669822b41bd3504"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.743105 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" event={"ID":"64798e8b-0f1d-48f9-ab27-369e6953a5cb","Type":"ContainerStarted","Data":"86eb31eb1a0bf2e8f3e9422176732fe246573f78e0538a1865e7f60c83cff85c"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.777673 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:58 crc kubenswrapper[4790]: E0406 11:58:58.777792 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:59.277770283 +0000 UTC m=+118.265513149 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.778247 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:58 crc kubenswrapper[4790]: E0406 11:58:58.780450 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:59.280434979 +0000 UTC m=+118.268177845 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.784101 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rm8wl" event={"ID":"b4a6f2c5-27d9-42d3-8856-3517265316cf","Type":"ContainerStarted","Data":"073fc86f6ea769f4e479c6ae06363f24b36d4f37072b4cd0ca364233a770d971"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.798538 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" event={"ID":"3851471a-4968-40c6-9be1-9ba072ddf741","Type":"ContainerStarted","Data":"3d7b8c4ff25646700d78ac43f3a8ddeb965e16df9c7ff06db3c54b2afc53b930"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.798582 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" event={"ID":"3851471a-4968-40c6-9be1-9ba072ddf741","Type":"ContainerStarted","Data":"564f7cde9f32ad898fb8551c2d36ab77d4bdfbfd37ed003d2e2421f4bce3a1b1"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.833937 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ggnlt" event={"ID":"e14e3a43-8228-4129-a69e-0ecfa3a7114c","Type":"ContainerStarted","Data":"372e59d5ece6fffffff69c410d6d5a7ac85b9dd571d47b51698c2f72c1b3397d"} Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.836037 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-ql4f9 container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.836078 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ql4f9" podUID="dc2f1a83-c329-47b9-98d3-08104ce2323c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.859765 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x74dz" podStartSLOduration=82.859748039 podStartE2EDuration="1m22.859748039s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:58.858595721 +0000 UTC m=+117.846338587" watchObservedRunningTime="2026-04-06 11:58:58.859748039 +0000 UTC m=+117.847490905" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.860846 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" podStartSLOduration=82.860822616 podStartE2EDuration="1m22.860822616s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:58.815241868 +0000 UTC m=+117.802984754" watchObservedRunningTime="2026-04-06 11:58:58.860822616 +0000 UTC m=+117.848565472" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.879183 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:58 crc kubenswrapper[4790]: E0406 11:58:58.879856 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:59.379842841 +0000 UTC m=+118.367585707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.898352 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sgt9w" podStartSLOduration=82.898337013 podStartE2EDuration="1m22.898337013s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:58.895372889 +0000 UTC m=+117.883115745" watchObservedRunningTime="2026-04-06 11:58:58.898337013 +0000 UTC m=+117.886079879" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.945095 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" podStartSLOduration=82.945062009 podStartE2EDuration="1m22.945062009s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-06 11:58:58.943565812 +0000 UTC m=+117.931308688" watchObservedRunningTime="2026-04-06 11:58:58.945062009 +0000 UTC m=+117.932804875" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.945465 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podStartSLOduration=82.945458299 podStartE2EDuration="1m22.945458299s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:58.921676115 +0000 UTC m=+117.909418981" watchObservedRunningTime="2026-04-06 11:58:58.945458299 +0000 UTC m=+117.933201175" Apr 06 11:58:58 crc kubenswrapper[4790]: I0406 11:58:58.980775 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.010544 4790 ???:1] "http: TLS handshake error from 192.168.126.11:58302: no serving certificate available for the kubelet" Apr 06 11:58:59 crc kubenswrapper[4790]: E0406 11:58:59.017030 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:59.517002515 +0000 UTC m=+118.504745381 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.081700 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:59 crc kubenswrapper[4790]: E0406 11:58:59.082018 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:59.582004288 +0000 UTC m=+118.569747154 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.091815 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.092108 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.143863 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" podStartSLOduration=83.143845491 podStartE2EDuration="1m23.143845491s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:59.115772161 +0000 UTC m=+118.103515027" watchObservedRunningTime="2026-04-06 11:58:59.143845491 +0000 UTC m=+118.131588357" Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.144608 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-rm8wl" podStartSLOduration=83.144602 podStartE2EDuration="1m23.144602s" 
podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:59.143211526 +0000 UTC m=+118.130954392" watchObservedRunningTime="2026-04-06 11:58:59.144602 +0000 UTC m=+118.132344866" Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.184704 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:59 crc kubenswrapper[4790]: E0406 11:58:59.185069 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:59.685056809 +0000 UTC m=+118.672799675 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.285585 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:59 crc kubenswrapper[4790]: E0406 11:58:59.286166 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:59.786151193 +0000 UTC m=+118.773894059 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.386842 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:59 crc kubenswrapper[4790]: E0406 11:58:59.387116 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:58:59.887106163 +0000 UTC m=+118.874849019 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.414359 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lklpx"] Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.487405 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:59 crc kubenswrapper[4790]: E0406 11:58:59.487700 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:58:59.987684404 +0000 UTC m=+118.975427270 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.588415 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:59 crc kubenswrapper[4790]: E0406 11:58:59.588699 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:00.088686865 +0000 UTC m=+119.076429731 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.690302 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:59 crc kubenswrapper[4790]: E0406 11:58:59.690527 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:00.190479287 +0000 UTC m=+119.178222163 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.690982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:59 crc kubenswrapper[4790]: E0406 11:58:59.691550 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:00.191444451 +0000 UTC m=+119.179187307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.792258 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:59 crc kubenswrapper[4790]: E0406 11:58:59.792863 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:00.292808151 +0000 UTC m=+119.280551017 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.843718 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" event={"ID":"0360c312-3ecf-42c9-9af9-470c231eefbd","Type":"ContainerStarted","Data":"9677322186c8c5cb15b094b9ba08a52db11ff46b1b90896f9e5ae01e60443c62"} Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.844990 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.847678 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" event={"ID":"a88e6363-c095-4450-98d2-18808abd10a9","Type":"ContainerStarted","Data":"6c88b464261093f50b768d9fa8b58a06884c26fe35f0d19b4eb42c8fedc6411d"} Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.849928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" event={"ID":"426ca9b3-5c58-4b68-a29e-207207bf6897","Type":"ContainerStarted","Data":"13ef529b123263722b72b6c938a9f72c16705c3288ab8cbafa95e6fac2a31a24"} Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.853856 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8" 
event={"ID":"3b6c6217-8764-459d-b2a8-c99272221fb1","Type":"ContainerStarted","Data":"866259ebedd6c2ee8c928cf7e3091bea3da404afaddc867e926c572f774689a9"} Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.857780 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qkf8s" event={"ID":"934f4d5f-3670-40da-b496-8b9f9f25fc0b","Type":"ContainerStarted","Data":"96eeb29176f8b3515267d041a9427111a36faab6c2bb4a8cce2c07eacd8f4dd5"} Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.863511 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rglnr" event={"ID":"dbfe58a1-c1ad-4719-9676-e6dbaca5c530","Type":"ContainerStarted","Data":"e0ea7e805f57738b3b28dfe6416376e7c2a52e524373df82ba11f92cee8c70fe"} Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.865178 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" event={"ID":"64798e8b-0f1d-48f9-ab27-369e6953a5cb","Type":"ContainerStarted","Data":"455d470cd2dcea3d3e379252ec47730541510e9af83a192c1bf2f686f33d39f2"} Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.867005 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ggnlt" event={"ID":"e14e3a43-8228-4129-a69e-0ecfa3a7114c","Type":"ContainerStarted","Data":"1b1a7a8829e19fa3189b079c74f62dc2f36d75ebe9f64c921f3ef161a5075e81"} Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.868812 4790 generic.go:334] "Generic (PLEG): container finished" podID="e42a375c-23c0-471a-8a17-20a03aabc3d4" containerID="0ee97cc20094c81566863c1573165cf9e5c2c8f28993300c1653c2761c87dd9a" exitCode=0 Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.868889 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" 
event={"ID":"e42a375c-23c0-471a-8a17-20a03aabc3d4","Type":"ContainerDied","Data":"0ee97cc20094c81566863c1573165cf9e5c2c8f28993300c1653c2761c87dd9a"} Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.871688 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fpjz5" event={"ID":"763471f0-0557-46ae-9eaf-e4bfd2b737ad","Type":"ContainerStarted","Data":"3e514fb025d8aa2249b9d049a6f546e82d5239ec684b8f5489bfa63f7d983757"} Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.871716 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fpjz5" event={"ID":"763471f0-0557-46ae-9eaf-e4bfd2b737ad","Type":"ContainerStarted","Data":"3ef5d3233d0ed32209e0455d7c5fe6a4cd199425a9cff0897896836906be563c"} Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.874008 4790 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2mbnf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.874056 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" podUID="f44bc876-aaf0-4ef8-aec0-e7aed034f67f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.875568 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" podUID="53544b57-db90-4281-a2ee-ba4ceceb1605" containerName="route-controller-manager" containerID="cri-o://56fb40338a42246d5e5f1afdd4d6f390cd9face848bc7a3f1b47fd6b231dc778" gracePeriod=30 Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 
11:58:59.875954 4790 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-trfn8 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.875980 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" podUID="4ff4177e-05ad-4a06-bf17-fa05ab567c5d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.876108 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" podUID="0967d5bd-4fa1-4e9c-a58a-1cd171f56b60" containerName="controller-manager" containerID="cri-o://47451d81d2dc2a20b88391452e297a8580ac5fa9e63b1aabf93562d4c91c653c" gracePeriod=30 Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.876116 4790 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gzwvz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.876174 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" podUID="618ebfda-2b5c-4918-97d0-56a8b37dda29" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.876768 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-ql4f9 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.877011 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ql4f9" podUID="dc2f1a83-c329-47b9-98d3-08104ce2323c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.877278 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-vh9v9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/readyz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.877295 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/readyz\": dial tcp 10.217.0.41:8443: connect: connection refused" Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.894124 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:58:59 crc kubenswrapper[4790]: E0406 11:58:59.894765 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-04-06 11:59:00.394750756 +0000 UTC m=+119.382493622 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.896875 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" podStartSLOduration=83.896843818 podStartE2EDuration="1m23.896843818s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:59.892010458 +0000 UTC m=+118.879753334" watchObservedRunningTime="2026-04-06 11:58:59.896843818 +0000 UTC m=+118.884586684" Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.964927 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" podStartSLOduration=82.964888297 podStartE2EDuration="1m22.964888297s" podCreationTimestamp="2026-04-06 11:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:59.962226611 +0000 UTC m=+118.949969497" watchObservedRunningTime="2026-04-06 11:58:59.964888297 +0000 UTC m=+118.952631163" Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.995640 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:58:59 crc kubenswrapper[4790]: E0406 11:58:59.996080 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:00.496027184 +0000 UTC m=+119.483770050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:58:59 crc kubenswrapper[4790]: I0406 11:58:59.996657 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.015297 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rglnr" podStartSLOduration=84.015280225 podStartE2EDuration="1m24.015280225s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:58:59.990602639 +0000 UTC 
m=+118.978345505" watchObservedRunningTime="2026-04-06 11:59:00.015280225 +0000 UTC m=+119.003023091" Apr 06 11:59:00 crc kubenswrapper[4790]: E0406 11:59:00.016522 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:00.502874575 +0000 UTC m=+119.490617631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.016708 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg5f8" podStartSLOduration=84.01670016 podStartE2EDuration="1m24.01670016s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:00.014017274 +0000 UTC m=+119.001760140" watchObservedRunningTime="2026-04-06 11:59:00.01670016 +0000 UTC m=+119.004443026" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.051401 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podStartSLOduration=84.051371286 podStartE2EDuration="1m24.051371286s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:00.04953132 +0000 
UTC m=+119.037274206" watchObservedRunningTime="2026-04-06 11:59:00.051371286 +0000 UTC m=+119.039114162" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.093391 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" podStartSLOduration=84.093356424 podStartE2EDuration="1m24.093356424s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:00.088733009 +0000 UTC m=+119.076475875" watchObservedRunningTime="2026-04-06 11:59:00.093356424 +0000 UTC m=+119.081099290" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.107979 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:00 crc kubenswrapper[4790]: E0406 11:59:00.108596 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:00.608573334 +0000 UTC m=+119.596316200 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.093787 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 06 11:59:00 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Apr 06 11:59:00 crc kubenswrapper[4790]: [+]process-running ok Apr 06 11:59:00 crc kubenswrapper[4790]: healthz check failed Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.111298 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.173276 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f78dq" podStartSLOduration=84.173242888 podStartE2EDuration="1m24.173242888s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:00.17171915 +0000 UTC m=+119.159462046" watchObservedRunningTime="2026-04-06 11:59:00.173242888 +0000 UTC m=+119.160985754" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.217544 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:00 crc kubenswrapper[4790]: E0406 11:59:00.218139 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:00.718119359 +0000 UTC m=+119.705862225 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.276671 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-ggnlt" podStartSLOduration=84.27663823 podStartE2EDuration="1m24.27663823s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:00.203885103 +0000 UTC m=+119.191627969" watchObservedRunningTime="2026-04-06 11:59:00.27663823 +0000 UTC m=+119.264381096" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.318925 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:00 crc kubenswrapper[4790]: E0406 11:59:00.320068 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:00.820043863 +0000 UTC m=+119.807786719 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.322551 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" podStartSLOduration=84.322527925 podStartE2EDuration="1m24.322527925s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:00.280917026 +0000 UTC m=+119.268659902" watchObservedRunningTime="2026-04-06 11:59:00.322527925 +0000 UTC m=+119.310270791" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.385890 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gq5tc" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.409360 4790 ???:1] "http: TLS handshake error from 192.168.126.11:58306: no serving 
certificate available for the kubelet" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.424036 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:00 crc kubenswrapper[4790]: E0406 11:59:00.425432 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:00.925403223 +0000 UTC m=+119.913146079 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.491676 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qkf8s" podStartSLOduration=84.491657947 podStartE2EDuration="1m24.491657947s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:00.322981476 +0000 UTC m=+119.310724362" watchObservedRunningTime="2026-04-06 11:59:00.491657947 +0000 UTC m=+119.479400813" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.527726 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:00 crc kubenswrapper[4790]: E0406 11:59:00.527880 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.027852921 +0000 UTC m=+120.015595787 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.528086 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:00 crc kubenswrapper[4790]: E0406 11:59:00.528735 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.028714362 +0000 UTC m=+120.016457228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.530617 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.629463 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53544b57-db90-4281-a2ee-ba4ceceb1605-config\") pod \"53544b57-db90-4281-a2ee-ba4ceceb1605\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.629855 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53544b57-db90-4281-a2ee-ba4ceceb1605-client-ca\") pod \"53544b57-db90-4281-a2ee-ba4ceceb1605\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.629971 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.630009 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/53544b57-db90-4281-a2ee-ba4ceceb1605-serving-cert\") pod \"53544b57-db90-4281-a2ee-ba4ceceb1605\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.630036 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chng2\" (UniqueName: \"kubernetes.io/projected/53544b57-db90-4281-a2ee-ba4ceceb1605-kube-api-access-chng2\") pod \"53544b57-db90-4281-a2ee-ba4ceceb1605\" (UID: \"53544b57-db90-4281-a2ee-ba4ceceb1605\") " Apr 06 11:59:00 crc kubenswrapper[4790]: E0406 11:59:00.631088 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.131038067 +0000 UTC m=+120.118780943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.636764 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53544b57-db90-4281-a2ee-ba4ceceb1605-config" (OuterVolumeSpecName: "config") pod "53544b57-db90-4281-a2ee-ba4ceceb1605" (UID: "53544b57-db90-4281-a2ee-ba4ceceb1605"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.637477 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53544b57-db90-4281-a2ee-ba4ceceb1605-client-ca" (OuterVolumeSpecName: "client-ca") pod "53544b57-db90-4281-a2ee-ba4ceceb1605" (UID: "53544b57-db90-4281-a2ee-ba4ceceb1605"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.644138 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53544b57-db90-4281-a2ee-ba4ceceb1605-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "53544b57-db90-4281-a2ee-ba4ceceb1605" (UID: "53544b57-db90-4281-a2ee-ba4ceceb1605"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.674302 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53544b57-db90-4281-a2ee-ba4ceceb1605-kube-api-access-chng2" (OuterVolumeSpecName: "kube-api-access-chng2") pod "53544b57-db90-4281-a2ee-ba4ceceb1605" (UID: "53544b57-db90-4281-a2ee-ba4ceceb1605"). InnerVolumeSpecName "kube-api-access-chng2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.732427 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.732568 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53544b57-db90-4281-a2ee-ba4ceceb1605-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.732587 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53544b57-db90-4281-a2ee-ba4ceceb1605-client-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.732603 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53544b57-db90-4281-a2ee-ba4ceceb1605-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.732616 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chng2\" (UniqueName: \"kubernetes.io/projected/53544b57-db90-4281-a2ee-ba4ceceb1605-kube-api-access-chng2\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:00 crc kubenswrapper[4790]: E0406 11:59:00.732848 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.232813697 +0000 UTC m=+120.220556563 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.745128 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.833544 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.833598 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-client-ca\") pod \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.833641 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-proxy-ca-bundles\") pod \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.833669 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h6ch\" (UniqueName: 
\"kubernetes.io/projected/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-kube-api-access-7h6ch\") pod \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.833726 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-serving-cert\") pod \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.833751 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-config\") pod \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\" (UID: \"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60\") " Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.834698 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0967d5bd-4fa1-4e9c-a58a-1cd171f56b60" (UID: "0967d5bd-4fa1-4e9c-a58a-1cd171f56b60"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.834791 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-config" (OuterVolumeSpecName: "config") pod "0967d5bd-4fa1-4e9c-a58a-1cd171f56b60" (UID: "0967d5bd-4fa1-4e9c-a58a-1cd171f56b60"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:00 crc kubenswrapper[4790]: E0406 11:59:00.834805 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.334788053 +0000 UTC m=+120.322530919 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.835042 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-client-ca" (OuterVolumeSpecName: "client-ca") pod "0967d5bd-4fa1-4e9c-a58a-1cd171f56b60" (UID: "0967d5bd-4fa1-4e9c-a58a-1cd171f56b60"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.841666 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0967d5bd-4fa1-4e9c-a58a-1cd171f56b60" (UID: "0967d5bd-4fa1-4e9c-a58a-1cd171f56b60"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.843383 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-kube-api-access-7h6ch" (OuterVolumeSpecName: "kube-api-access-7h6ch") pod "0967d5bd-4fa1-4e9c-a58a-1cd171f56b60" (UID: "0967d5bd-4fa1-4e9c-a58a-1cd171f56b60"). InnerVolumeSpecName "kube-api-access-7h6ch". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.876649 4790 generic.go:334] "Generic (PLEG): container finished" podID="53544b57-db90-4281-a2ee-ba4ceceb1605" containerID="56fb40338a42246d5e5f1afdd4d6f390cd9face848bc7a3f1b47fd6b231dc778" exitCode=0 Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.876745 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.877282 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" event={"ID":"53544b57-db90-4281-a2ee-ba4ceceb1605","Type":"ContainerDied","Data":"56fb40338a42246d5e5f1afdd4d6f390cd9face848bc7a3f1b47fd6b231dc778"} Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.877337 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th" event={"ID":"53544b57-db90-4281-a2ee-ba4ceceb1605","Type":"ContainerDied","Data":"4481bf919e5353bb48bbcd4bd929dfec2012c92a49c6af2c5ebad00cf5177874"} Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.877358 4790 scope.go:117] "RemoveContainer" containerID="56fb40338a42246d5e5f1afdd4d6f390cd9face848bc7a3f1b47fd6b231dc778" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.879238 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="0967d5bd-4fa1-4e9c-a58a-1cd171f56b60" containerID="47451d81d2dc2a20b88391452e297a8580ac5fa9e63b1aabf93562d4c91c653c" exitCode=0 Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.879447 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.882201 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" event={"ID":"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60","Type":"ContainerDied","Data":"47451d81d2dc2a20b88391452e297a8580ac5fa9e63b1aabf93562d4c91c653c"} Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.882229 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x9f69" event={"ID":"0967d5bd-4fa1-4e9c-a58a-1cd171f56b60","Type":"ContainerDied","Data":"3b2c84778aecb70b5372eef1fd0a7e71e7c660c0d6c70e363d69aae11d82033b"} Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.882801 4790 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gzwvz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.882857 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" podUID="618ebfda-2b5c-4918-97d0-56a8b37dda29" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.885581 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" 
podUID="e1aab57b-c8ed-492c-91d3-e190449713a0" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" gracePeriod=30 Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.919615 4790 scope.go:117] "RemoveContainer" containerID="56fb40338a42246d5e5f1afdd4d6f390cd9face848bc7a3f1b47fd6b231dc778" Apr 06 11:59:00 crc kubenswrapper[4790]: E0406 11:59:00.922661 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56fb40338a42246d5e5f1afdd4d6f390cd9face848bc7a3f1b47fd6b231dc778\": container with ID starting with 56fb40338a42246d5e5f1afdd4d6f390cd9face848bc7a3f1b47fd6b231dc778 not found: ID does not exist" containerID="56fb40338a42246d5e5f1afdd4d6f390cd9face848bc7a3f1b47fd6b231dc778" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.922721 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56fb40338a42246d5e5f1afdd4d6f390cd9face848bc7a3f1b47fd6b231dc778"} err="failed to get container status \"56fb40338a42246d5e5f1afdd4d6f390cd9face848bc7a3f1b47fd6b231dc778\": rpc error: code = NotFound desc = could not find container \"56fb40338a42246d5e5f1afdd4d6f390cd9face848bc7a3f1b47fd6b231dc778\": container with ID starting with 56fb40338a42246d5e5f1afdd4d6f390cd9face848bc7a3f1b47fd6b231dc778 not found: ID does not exist" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.922770 4790 scope.go:117] "RemoveContainer" containerID="47451d81d2dc2a20b88391452e297a8580ac5fa9e63b1aabf93562d4c91c653c" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.939239 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: 
\"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.939531 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-client-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.939558 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.939573 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h6ch\" (UniqueName: \"kubernetes.io/projected/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-kube-api-access-7h6ch\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.939588 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.939613 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:00 crc kubenswrapper[4790]: E0406 11:59:00.941437 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.441421575 +0000 UTC m=+120.429164431 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.953237 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fpjz5" podStartSLOduration=9.95321403 podStartE2EDuration="9.95321403s" podCreationTimestamp="2026-04-06 11:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:00.924874612 +0000 UTC m=+119.912617488" watchObservedRunningTime="2026-04-06 11:59:00.95321403 +0000 UTC m=+119.940956896" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.957604 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9f69"] Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.961021 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x9f69"] Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.969640 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th"] Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.973137 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9t8th"] Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.975749 4790 scope.go:117] "RemoveContainer" containerID="47451d81d2dc2a20b88391452e297a8580ac5fa9e63b1aabf93562d4c91c653c" Apr 06 11:59:00 crc 
kubenswrapper[4790]: E0406 11:59:00.979406 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47451d81d2dc2a20b88391452e297a8580ac5fa9e63b1aabf93562d4c91c653c\": container with ID starting with 47451d81d2dc2a20b88391452e297a8580ac5fa9e63b1aabf93562d4c91c653c not found: ID does not exist" containerID="47451d81d2dc2a20b88391452e297a8580ac5fa9e63b1aabf93562d4c91c653c" Apr 06 11:59:00 crc kubenswrapper[4790]: I0406 11:59:00.979441 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47451d81d2dc2a20b88391452e297a8580ac5fa9e63b1aabf93562d4c91c653c"} err="failed to get container status \"47451d81d2dc2a20b88391452e297a8580ac5fa9e63b1aabf93562d4c91c653c\": rpc error: code = NotFound desc = could not find container \"47451d81d2dc2a20b88391452e297a8580ac5fa9e63b1aabf93562d4c91c653c\": container with ID starting with 47451d81d2dc2a20b88391452e297a8580ac5fa9e63b1aabf93562d4c91c653c not found: ID does not exist" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.040803 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.041091 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.541077933 +0000 UTC m=+120.528820799 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.103924 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 06 11:59:01 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Apr 06 11:59:01 crc kubenswrapper[4790]: [+]process-running ok Apr 06 11:59:01 crc kubenswrapper[4790]: healthz check failed Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.104021 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.143216 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.143507 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-04-06 11:59:01.64349704 +0000 UTC m=+120.631239906 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.202069 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.244373 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.244590 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.744563303 +0000 UTC m=+120.732306169 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.244728 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.245059 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.745045435 +0000 UTC m=+120.732788301 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.279318 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-fpjz5" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.346074 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.346174 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv8gh\" (UniqueName: \"kubernetes.io/projected/e42a375c-23c0-471a-8a17-20a03aabc3d4-kube-api-access-gv8gh\") pod \"e42a375c-23c0-471a-8a17-20a03aabc3d4\" (UID: \"e42a375c-23c0-471a-8a17-20a03aabc3d4\") " Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.346234 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.84620326 +0000 UTC m=+120.833946116 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.346310 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e42a375c-23c0-471a-8a17-20a03aabc3d4-config-volume\") pod \"e42a375c-23c0-471a-8a17-20a03aabc3d4\" (UID: \"e42a375c-23c0-471a-8a17-20a03aabc3d4\") " Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.346416 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e42a375c-23c0-471a-8a17-20a03aabc3d4-secret-volume\") pod \"e42a375c-23c0-471a-8a17-20a03aabc3d4\" (UID: \"e42a375c-23c0-471a-8a17-20a03aabc3d4\") " Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.346772 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.347203 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e42a375c-23c0-471a-8a17-20a03aabc3d4-config-volume" (OuterVolumeSpecName: "config-volume") pod "e42a375c-23c0-471a-8a17-20a03aabc3d4" (UID: "e42a375c-23c0-471a-8a17-20a03aabc3d4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.347265 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.847247856 +0000 UTC m=+120.834990722 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.355071 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42a375c-23c0-471a-8a17-20a03aabc3d4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e42a375c-23c0-471a-8a17-20a03aabc3d4" (UID: "e42a375c-23c0-471a-8a17-20a03aabc3d4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.356198 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e42a375c-23c0-471a-8a17-20a03aabc3d4-kube-api-access-gv8gh" (OuterVolumeSpecName: "kube-api-access-gv8gh") pod "e42a375c-23c0-471a-8a17-20a03aabc3d4" (UID: "e42a375c-23c0-471a-8a17-20a03aabc3d4"). InnerVolumeSpecName "kube-api-access-gv8gh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.393313 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn"] Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.393819 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42a375c-23c0-471a-8a17-20a03aabc3d4" containerName="collect-profiles" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.393884 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42a375c-23c0-471a-8a17-20a03aabc3d4" containerName="collect-profiles" Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.393897 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0967d5bd-4fa1-4e9c-a58a-1cd171f56b60" containerName="controller-manager" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.393906 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0967d5bd-4fa1-4e9c-a58a-1cd171f56b60" containerName="controller-manager" Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.393916 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53544b57-db90-4281-a2ee-ba4ceceb1605" containerName="route-controller-manager" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.393924 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="53544b57-db90-4281-a2ee-ba4ceceb1605" containerName="route-controller-manager" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.394062 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42a375c-23c0-471a-8a17-20a03aabc3d4" containerName="collect-profiles" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.394082 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0967d5bd-4fa1-4e9c-a58a-1cd171f56b60" containerName="controller-manager" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.394091 4790 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="53544b57-db90-4281-a2ee-ba4ceceb1605" containerName="route-controller-manager" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.394520 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.407341 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.407503 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.407744 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.407856 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.407982 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.415314 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.420950 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn"] Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.449390 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.449639 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv8gh\" (UniqueName: \"kubernetes.io/projected/e42a375c-23c0-471a-8a17-20a03aabc3d4-kube-api-access-gv8gh\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.449656 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e42a375c-23c0-471a-8a17-20a03aabc3d4-config-volume\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.449665 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e42a375c-23c0-471a-8a17-20a03aabc3d4-secret-volume\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.449744 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:01.949725354 +0000 UTC m=+120.937468210 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.551154 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.551206 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.551245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-serving-cert\") pod \"route-controller-manager-648487d594-ktnqn\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.551269 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.551287 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-client-ca\") pod \"route-controller-manager-648487d594-ktnqn\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.551305 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcvq4\" (UniqueName: \"kubernetes.io/projected/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-kube-api-access-wcvq4\") pod \"route-controller-manager-648487d594-ktnqn\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.551332 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-config\") pod \"route-controller-manager-648487d594-ktnqn\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.551356 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.551382 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.551722 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:02.05169756 +0000 UTC m=+121.039440626 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.552321 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.558910 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.566506 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.573414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.617357 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.629123 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.660994 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.669523 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.669679 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-config\") pod \"route-controller-manager-648487d594-ktnqn\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.669766 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-serving-cert\") pod \"route-controller-manager-648487d594-ktnqn\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.669791 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-client-ca\") pod \"route-controller-manager-648487d594-ktnqn\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.669806 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcvq4\" (UniqueName: 
\"kubernetes.io/projected/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-kube-api-access-wcvq4\") pod \"route-controller-manager-648487d594-ktnqn\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.670127 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:02.170113176 +0000 UTC m=+121.157856032 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.674142 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.674368 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.674489 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.686148 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-config\") pod \"route-controller-manager-648487d594-ktnqn\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " 
pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.691360 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-client-ca\") pod \"route-controller-manager-648487d594-ktnqn\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.692483 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-serving-cert\") pod \"route-controller-manager-648487d594-ktnqn\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.704615 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.712516 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0967d5bd-4fa1-4e9c-a58a-1cd171f56b60" path="/var/lib/kubelet/pods/0967d5bd-4fa1-4e9c-a58a-1cd171f56b60/volumes" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.713532 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53544b57-db90-4281-a2ee-ba4ceceb1605" path="/var/lib/kubelet/pods/53544b57-db90-4281-a2ee-ba4ceceb1605/volumes" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.720747 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.730715 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcvq4\" (UniqueName: 
\"kubernetes.io/projected/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-kube-api-access-wcvq4\") pod \"route-controller-manager-648487d594-ktnqn\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.776498 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.777049 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:02.277024515 +0000 UTC m=+121.264767381 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.881243 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.881819 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:02.38177299 +0000 UTC m=+121.369515856 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.881915 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.882702 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:02.382694873 +0000 UTC m=+121.370437739 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.904329 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.905043 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k" event={"ID":"e42a375c-23c0-471a-8a17-20a03aabc3d4","Type":"ContainerDied","Data":"9a8fc9c5deb21a6a25d87a8216ea9c1683200c00e3a703891b57a07f449357bc"} Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.905130 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a8fc9c5deb21a6a25d87a8216ea9c1683200c00e3a703891b57a07f449357bc" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.928864 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" event={"ID":"b8d6da06-7170-416d-8a93-b3f9fa890fb1","Type":"ContainerStarted","Data":"d73eef0dd6a3450552d9062f70c97efe6ce398f7ce95883c0aef5873cb256763"} Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.943154 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-96trx"] Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.950642 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-96trx" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.954609 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.957164 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-96trx"] Apr 06 11:59:01 crc kubenswrapper[4790]: I0406 11:59:01.984262 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:01 crc kubenswrapper[4790]: E0406 11:59:01.984917 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:02.484897554 +0000 UTC m=+121.472640420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.020119 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.025034 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.085569 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcwbn\" (UniqueName: \"kubernetes.io/projected/311eb251-79b6-4e1e-a3a7-456322ca133e-kube-api-access-pcwbn\") pod \"community-operators-96trx\" (UID: \"311eb251-79b6-4e1e-a3a7-456322ca133e\") " pod="openshift-marketplace/community-operators-96trx" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.085610 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.085661 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/311eb251-79b6-4e1e-a3a7-456322ca133e-utilities\") pod \"community-operators-96trx\" (UID: \"311eb251-79b6-4e1e-a3a7-456322ca133e\") " pod="openshift-marketplace/community-operators-96trx" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.085715 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/311eb251-79b6-4e1e-a3a7-456322ca133e-catalog-content\") pod \"community-operators-96trx\" (UID: \"311eb251-79b6-4e1e-a3a7-456322ca133e\") " pod="openshift-marketplace/community-operators-96trx" Apr 06 11:59:02 crc kubenswrapper[4790]: E0406 11:59:02.087420 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:02.587404383 +0000 UTC m=+121.575147249 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.140604 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 06 11:59:02 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Apr 06 11:59:02 crc kubenswrapper[4790]: [+]process-running ok Apr 06 11:59:02 crc kubenswrapper[4790]: healthz check failed Apr 06 11:59:02 crc 
kubenswrapper[4790]: I0406 11:59:02.140650 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.143267 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-msclk"] Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.144500 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-msclk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.150208 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.174493 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-msclk"] Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.187333 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:02 crc kubenswrapper[4790]: E0406 11:59:02.188043 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:02.688010655 +0000 UTC m=+121.675753521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.201429 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcwbn\" (UniqueName: \"kubernetes.io/projected/311eb251-79b6-4e1e-a3a7-456322ca133e-kube-api-access-pcwbn\") pod \"community-operators-96trx\" (UID: \"311eb251-79b6-4e1e-a3a7-456322ca133e\") " pod="openshift-marketplace/community-operators-96trx" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.201496 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.201566 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/311eb251-79b6-4e1e-a3a7-456322ca133e-utilities\") pod \"community-operators-96trx\" (UID: \"311eb251-79b6-4e1e-a3a7-456322ca133e\") " pod="openshift-marketplace/community-operators-96trx" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.201682 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/311eb251-79b6-4e1e-a3a7-456322ca133e-catalog-content\") pod \"community-operators-96trx\" (UID: 
\"311eb251-79b6-4e1e-a3a7-456322ca133e\") " pod="openshift-marketplace/community-operators-96trx" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.202203 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/311eb251-79b6-4e1e-a3a7-456322ca133e-catalog-content\") pod \"community-operators-96trx\" (UID: \"311eb251-79b6-4e1e-a3a7-456322ca133e\") " pod="openshift-marketplace/community-operators-96trx" Apr 06 11:59:02 crc kubenswrapper[4790]: E0406 11:59:02.202861 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:02.702847545 +0000 UTC m=+121.690590411 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.203240 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/311eb251-79b6-4e1e-a3a7-456322ca133e-utilities\") pod \"community-operators-96trx\" (UID: \"311eb251-79b6-4e1e-a3a7-456322ca133e\") " pod="openshift-marketplace/community-operators-96trx" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.290882 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcwbn\" (UniqueName: \"kubernetes.io/projected/311eb251-79b6-4e1e-a3a7-456322ca133e-kube-api-access-pcwbn\") pod \"community-operators-96trx\" (UID: 
\"311eb251-79b6-4e1e-a3a7-456322ca133e\") " pod="openshift-marketplace/community-operators-96trx" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.291380 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-96trx" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.304761 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:02 crc kubenswrapper[4790]: E0406 11:59:02.304967 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:02.804931114 +0000 UTC m=+121.792673980 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.305464 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9685a39-63cf-47d3-b5fe-9113d55676d4-catalog-content\") pod \"certified-operators-msclk\" (UID: \"b9685a39-63cf-47d3-b5fe-9113d55676d4\") " pod="openshift-marketplace/certified-operators-msclk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.305699 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.305792 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdm8v\" (UniqueName: \"kubernetes.io/projected/b9685a39-63cf-47d3-b5fe-9113d55676d4-kube-api-access-zdm8v\") pod \"certified-operators-msclk\" (UID: \"b9685a39-63cf-47d3-b5fe-9113d55676d4\") " pod="openshift-marketplace/certified-operators-msclk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.305892 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b9685a39-63cf-47d3-b5fe-9113d55676d4-utilities\") pod \"certified-operators-msclk\" (UID: \"b9685a39-63cf-47d3-b5fe-9113d55676d4\") " pod="openshift-marketplace/certified-operators-msclk" Apr 06 11:59:02 crc kubenswrapper[4790]: E0406 11:59:02.306235 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:02.806221036 +0000 UTC m=+121.793963902 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.354860 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4mfnk"] Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.365057 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4mfnk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.393220 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4mfnk"] Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.407382 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.407719 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdm8v\" (UniqueName: \"kubernetes.io/projected/b9685a39-63cf-47d3-b5fe-9113d55676d4-kube-api-access-zdm8v\") pod \"certified-operators-msclk\" (UID: \"b9685a39-63cf-47d3-b5fe-9113d55676d4\") " pod="openshift-marketplace/certified-operators-msclk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.407807 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9685a39-63cf-47d3-b5fe-9113d55676d4-utilities\") pod \"certified-operators-msclk\" (UID: \"b9685a39-63cf-47d3-b5fe-9113d55676d4\") " pod="openshift-marketplace/certified-operators-msclk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.407846 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9685a39-63cf-47d3-b5fe-9113d55676d4-catalog-content\") pod \"certified-operators-msclk\" (UID: \"b9685a39-63cf-47d3-b5fe-9113d55676d4\") " pod="openshift-marketplace/certified-operators-msclk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.408244 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/b9685a39-63cf-47d3-b5fe-9113d55676d4-catalog-content\") pod \"certified-operators-msclk\" (UID: \"b9685a39-63cf-47d3-b5fe-9113d55676d4\") " pod="openshift-marketplace/certified-operators-msclk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.408474 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9685a39-63cf-47d3-b5fe-9113d55676d4-utilities\") pod \"certified-operators-msclk\" (UID: \"b9685a39-63cf-47d3-b5fe-9113d55676d4\") " pod="openshift-marketplace/certified-operators-msclk" Apr 06 11:59:02 crc kubenswrapper[4790]: E0406 11:59:02.408489 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:02.908471118 +0000 UTC m=+121.896213984 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.438612 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdm8v\" (UniqueName: \"kubernetes.io/projected/b9685a39-63cf-47d3-b5fe-9113d55676d4-kube-api-access-zdm8v\") pod \"certified-operators-msclk\" (UID: \"b9685a39-63cf-47d3-b5fe-9113d55676d4\") " pod="openshift-marketplace/certified-operators-msclk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.510621 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25067a9b-e553-4a7a-abdb-226567079c15-catalog-content\") pod \"community-operators-4mfnk\" (UID: \"25067a9b-e553-4a7a-abdb-226567079c15\") " pod="openshift-marketplace/community-operators-4mfnk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.511153 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25067a9b-e553-4a7a-abdb-226567079c15-utilities\") pod \"community-operators-4mfnk\" (UID: \"25067a9b-e553-4a7a-abdb-226567079c15\") " pod="openshift-marketplace/community-operators-4mfnk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.511200 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.511235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j57r8\" (UniqueName: \"kubernetes.io/projected/25067a9b-e553-4a7a-abdb-226567079c15-kube-api-access-j57r8\") pod \"community-operators-4mfnk\" (UID: \"25067a9b-e553-4a7a-abdb-226567079c15\") " pod="openshift-marketplace/community-operators-4mfnk" Apr 06 11:59:02 crc kubenswrapper[4790]: E0406 11:59:02.511700 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:03.011684585 +0000 UTC m=+121.999427451 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.535020 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-msclk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.556774 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2hfrm"] Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.578333 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.583252 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hfrm"] Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.587017 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kp5kl"] Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.587236 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" podUID="5ee57c19-af8c-4e40-ae89-9cfb07ad24d7" containerName="service-ca-controller" containerID="cri-o://187b5f046e3965145526abb279cc7307d17a5da8eff0be04cca2fd0e7af0df95" gracePeriod=30 Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.612020 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.612286 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25067a9b-e553-4a7a-abdb-226567079c15-catalog-content\") pod \"community-operators-4mfnk\" (UID: \"25067a9b-e553-4a7a-abdb-226567079c15\") " pod="openshift-marketplace/community-operators-4mfnk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.612321 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25067a9b-e553-4a7a-abdb-226567079c15-utilities\") pod \"community-operators-4mfnk\" (UID: \"25067a9b-e553-4a7a-abdb-226567079c15\") " pod="openshift-marketplace/community-operators-4mfnk" Apr 06 11:59:02 
crc kubenswrapper[4790]: I0406 11:59:02.612356 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j57r8\" (UniqueName: \"kubernetes.io/projected/25067a9b-e553-4a7a-abdb-226567079c15-kube-api-access-j57r8\") pod \"community-operators-4mfnk\" (UID: \"25067a9b-e553-4a7a-abdb-226567079c15\") " pod="openshift-marketplace/community-operators-4mfnk" Apr 06 11:59:02 crc kubenswrapper[4790]: E0406 11:59:02.612677 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:03.112657576 +0000 UTC m=+122.100400442 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.613126 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25067a9b-e553-4a7a-abdb-226567079c15-catalog-content\") pod \"community-operators-4mfnk\" (UID: \"25067a9b-e553-4a7a-abdb-226567079c15\") " pod="openshift-marketplace/community-operators-4mfnk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.613395 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25067a9b-e553-4a7a-abdb-226567079c15-utilities\") pod \"community-operators-4mfnk\" (UID: \"25067a9b-e553-4a7a-abdb-226567079c15\") " 
pod="openshift-marketplace/community-operators-4mfnk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.617007 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn"] Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.649979 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j57r8\" (UniqueName: \"kubernetes.io/projected/25067a9b-e553-4a7a-abdb-226567079c15-kube-api-access-j57r8\") pod \"community-operators-4mfnk\" (UID: \"25067a9b-e553-4a7a-abdb-226567079c15\") " pod="openshift-marketplace/community-operators-4mfnk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.713965 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c176f8e8-2902-4da6-b779-d0426b68e715-utilities\") pod \"certified-operators-2hfrm\" (UID: \"c176f8e8-2902-4da6-b779-d0426b68e715\") " pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.714103 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.714140 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c176f8e8-2902-4da6-b779-d0426b68e715-catalog-content\") pod \"certified-operators-2hfrm\" (UID: \"c176f8e8-2902-4da6-b779-d0426b68e715\") " pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.714189 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn94d\" (UniqueName: \"kubernetes.io/projected/c176f8e8-2902-4da6-b779-d0426b68e715-kube-api-access-tn94d\") pod \"certified-operators-2hfrm\" (UID: \"c176f8e8-2902-4da6-b779-d0426b68e715\") " pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 11:59:02 crc kubenswrapper[4790]: E0406 11:59:02.714489 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:03.214470327 +0000 UTC m=+122.202213193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.755531 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4mfnk" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.799681 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-96trx"] Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.819699 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.820783 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c176f8e8-2902-4da6-b779-d0426b68e715-catalog-content\") pod \"certified-operators-2hfrm\" (UID: \"c176f8e8-2902-4da6-b779-d0426b68e715\") " pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.820862 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn94d\" (UniqueName: \"kubernetes.io/projected/c176f8e8-2902-4da6-b779-d0426b68e715-kube-api-access-tn94d\") pod \"certified-operators-2hfrm\" (UID: \"c176f8e8-2902-4da6-b779-d0426b68e715\") " pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.820919 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c176f8e8-2902-4da6-b779-d0426b68e715-utilities\") pod \"certified-operators-2hfrm\" (UID: \"c176f8e8-2902-4da6-b779-d0426b68e715\") " pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 11:59:02 crc kubenswrapper[4790]: E0406 11:59:02.822950 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:03.322921314 +0000 UTC m=+122.310664180 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.823308 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c176f8e8-2902-4da6-b779-d0426b68e715-catalog-content\") pod \"certified-operators-2hfrm\" (UID: \"c176f8e8-2902-4da6-b779-d0426b68e715\") " pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.826058 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c176f8e8-2902-4da6-b779-d0426b68e715-utilities\") pod \"certified-operators-2hfrm\" (UID: \"c176f8e8-2902-4da6-b779-d0426b68e715\") " pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.866225 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn94d\" (UniqueName: \"kubernetes.io/projected/c176f8e8-2902-4da6-b779-d0426b68e715-kube-api-access-tn94d\") pod \"certified-operators-2hfrm\" (UID: \"c176f8e8-2902-4da6-b779-d0426b68e715\") " pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.889453 4790 plugin_watcher.go:194] "Adding 
socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.922870 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:02 crc kubenswrapper[4790]: E0406 11:59:02.923335 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:03.42331774 +0000 UTC m=+122.411060606 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.945105 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" event={"ID":"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d","Type":"ContainerStarted","Data":"3527da482926eda6bf451666c3e4d03e56309626afb7d262ff32acc3d421427a"} Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.952654 4790 generic.go:334] "Generic (PLEG): container finished" podID="5ee57c19-af8c-4e40-ae89-9cfb07ad24d7" 
containerID="187b5f046e3965145526abb279cc7307d17a5da8eff0be04cca2fd0e7af0df95" exitCode=0 Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.952738 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" event={"ID":"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7","Type":"ContainerDied","Data":"187b5f046e3965145526abb279cc7307d17a5da8eff0be04cca2fd0e7af0df95"} Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.961174 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-msclk"] Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.970845 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e1c3fb0b69ce29fafe755704f1fec58a48040dab12d8ebbb6ca2d7fc64599f66"} Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.970908 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"91f3522589ab0d859f77f3180b0d218387d213b4ccea0cc7b31149fd58003c8b"} Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.971954 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.978748 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4a0bd5639635562a8e874e0c8e43bb7c9d84da7e2f40de1dbec78056aa6063a9"} Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.982092 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" 
event={"ID":"b8d6da06-7170-416d-8a93-b3f9fa890fb1","Type":"ContainerStarted","Data":"f5196e78044c25943c5e335e47d091a56232edbbd3c5db4caac7a6cc7092f262"} Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.982405 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" event={"ID":"b8d6da06-7170-416d-8a93-b3f9fa890fb1","Type":"ContainerStarted","Data":"9183154b5bfc8011471ffae6d648af103730d3c2a8e239b4d5ad5419e1ead602"} Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.983800 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e891b0b14165951588d09e065833da429d6b2eab1c1f43f4b260020d065970ed"} Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.984017 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8430c99a6e24492a7f400316bd041481962392be45e6e2dc28f91196e6cd2241"} Apr 06 11:59:02 crc kubenswrapper[4790]: W0406 11:59:02.990097 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9685a39_63cf_47d3_b5fe_9113d55676d4.slice/crio-fcf639a4b4c5e6664b7789ea1435541473ab7464b764b33cb3764a8c60b6cbec WatchSource:0}: Error finding container fcf639a4b4c5e6664b7789ea1435541473ab7464b764b33cb3764a8c60b6cbec: Status 404 returned error can't find the container with id fcf639a4b4c5e6664b7789ea1435541473ab7464b764b33cb3764a8c60b6cbec Apr 06 11:59:02 crc kubenswrapper[4790]: I0406 11:59:02.993025 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96trx" 
event={"ID":"311eb251-79b6-4e1e-a3a7-456322ca133e","Type":"ContainerStarted","Data":"f5a8de717da6620c134ae7d40ab3a6e67db14123ae2d8961ebd3a69fb8465293"} Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.023712 4790 ???:1] "http: TLS handshake error from 192.168.126.11:58322: no serving certificate available for the kubelet" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.024024 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:03 crc kubenswrapper[4790]: E0406 11:59:03.024125 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:03.524104726 +0000 UTC m=+122.511847592 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.024263 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:03 crc kubenswrapper[4790]: E0406 11:59:03.024548 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:03.524538047 +0000 UTC m=+122.512280913 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.057591 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.072032 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.072890 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.075665 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.081485 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.081691 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.097647 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 06 11:59:03 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Apr 06 11:59:03 crc kubenswrapper[4790]: [+]process-running ok Apr 06 11:59:03 crc kubenswrapper[4790]: healthz check failed Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.097714 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.126164 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:03 crc kubenswrapper[4790]: E0406 11:59:03.131526 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:03.631498837 +0000 UTC m=+122.619241703 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.189763 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.218440 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4mfnk"] Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.230133 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbf7adf1-0126-47d9-8ddf-e4d329eda260-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dbf7adf1-0126-47d9-8ddf-e4d329eda260\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.230249 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbf7adf1-0126-47d9-8ddf-e4d329eda260-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dbf7adf1-0126-47d9-8ddf-e4d329eda260\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.230306 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:03 crc kubenswrapper[4790]: E0406 11:59:03.230811 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:03.730793746 +0000 UTC m=+122.718536612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.332203 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-cabundle\") pod \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\" (UID: \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\") " Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.332297 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-key\") pod \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\" (UID: \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\") " Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.332447 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.332483 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jlkp\" (UniqueName: \"kubernetes.io/projected/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-kube-api-access-2jlkp\") pod \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\" (UID: \"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7\") " Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.332628 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbf7adf1-0126-47d9-8ddf-e4d329eda260-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dbf7adf1-0126-47d9-8ddf-e4d329eda260\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.332660 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbf7adf1-0126-47d9-8ddf-e4d329eda260-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dbf7adf1-0126-47d9-8ddf-e4d329eda260\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.333790 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "5ee57c19-af8c-4e40-ae89-9cfb07ad24d7" (UID: "5ee57c19-af8c-4e40-ae89-9cfb07ad24d7"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:03 crc kubenswrapper[4790]: E0406 11:59:03.333997 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:03.833957171 +0000 UTC m=+122.821700037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.334551 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbf7adf1-0126-47d9-8ddf-e4d329eda260-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dbf7adf1-0126-47d9-8ddf-e4d329eda260\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.340349 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-kube-api-access-2jlkp" (OuterVolumeSpecName: "kube-api-access-2jlkp") pod "5ee57c19-af8c-4e40-ae89-9cfb07ad24d7" (UID: "5ee57c19-af8c-4e40-ae89-9cfb07ad24d7"). InnerVolumeSpecName "kube-api-access-2jlkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.350220 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-key" (OuterVolumeSpecName: "signing-key") pod "5ee57c19-af8c-4e40-ae89-9cfb07ad24d7" (UID: "5ee57c19-af8c-4e40-ae89-9cfb07ad24d7"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.352954 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbf7adf1-0126-47d9-8ddf-e4d329eda260-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dbf7adf1-0126-47d9-8ddf-e4d329eda260\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.383919 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67ff4c58b7-wktt2"] Apr 06 11:59:03 crc kubenswrapper[4790]: E0406 11:59:03.384269 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee57c19-af8c-4e40-ae89-9cfb07ad24d7" containerName="service-ca-controller" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.384295 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee57c19-af8c-4e40-ae89-9cfb07ad24d7" containerName="service-ca-controller" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.384431 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ee57c19-af8c-4e40-ae89-9cfb07ad24d7" containerName="service-ca-controller" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.384970 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.387238 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.387636 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.387806 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.388240 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.388400 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.393207 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.393370 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hfrm"] Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.395967 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67ff4c58b7-wktt2"] Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.396548 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Apr 06 11:59:03 crc kubenswrapper[4790]: W0406 11:59:03.416083 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc176f8e8_2902_4da6_b779_d0426b68e715.slice/crio-e89d54719360f164ec11aa844316310401b8f2f9b3e0b2151a4d19ea448cc585 WatchSource:0}: Error finding container e89d54719360f164ec11aa844316310401b8f2f9b3e0b2151a4d19ea448cc585: Status 404 returned error can't find the container with id e89d54719360f164ec11aa844316310401b8f2f9b3e0b2151a4d19ea448cc585 Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.434647 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.434722 4790 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-key\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.434739 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jlkp\" (UniqueName: \"kubernetes.io/projected/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-kube-api-access-2jlkp\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.434753 4790 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7-signing-cabundle\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:03 crc kubenswrapper[4790]: E0406 11:59:03.435091 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-04-06 11:59:03.935077836 +0000 UTC m=+122.922820702 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.436766 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.535668 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:03 crc kubenswrapper[4790]: E0406 11:59:03.535911 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:04.03581587 +0000 UTC m=+123.023558736 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.535975 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a77943-529d-499f-bf5b-3c2dd7451089-serving-cert\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.536107 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-proxy-ca-bundles\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.536180 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-config\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.536250 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-client-ca\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.536324 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfg5c\" (UniqueName: \"kubernetes.io/projected/78a77943-529d-499f-bf5b-3c2dd7451089-kube-api-access-wfg5c\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.536416 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:03 crc kubenswrapper[4790]: E0406 11:59:03.536680 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:04.036667942 +0000 UTC m=+123.024410808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.637479 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.638006 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-proxy-ca-bundles\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.638040 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-config\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.638055 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-client-ca\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: 
\"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.638077 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfg5c\" (UniqueName: \"kubernetes.io/projected/78a77943-529d-499f-bf5b-3c2dd7451089-kube-api-access-wfg5c\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.638137 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a77943-529d-499f-bf5b-3c2dd7451089-serving-cert\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.640107 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-config\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: E0406 11:59:03.640207 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:04.140160615 +0000 UTC m=+123.127903481 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.640374 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-client-ca\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.642578 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-proxy-ca-bundles\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.652138 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a77943-529d-499f-bf5b-3c2dd7451089-serving-cert\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.666382 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfg5c\" (UniqueName: \"kubernetes.io/projected/78a77943-529d-499f-bf5b-3c2dd7451089-kube-api-access-wfg5c\") pod \"controller-manager-67ff4c58b7-wktt2\" (UID: 
\"78a77943-529d-499f-bf5b-3c2dd7451089\") " pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.737109 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.739467 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:03 crc kubenswrapper[4790]: E0406 11:59:03.739794 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:04.239779262 +0000 UTC m=+123.227522138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.823928 4790 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-04-06T11:59:02.889477195Z","Handler":null,"Name":""} Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.841492 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:03 crc kubenswrapper[4790]: E0406 11:59:03.841649 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-04-06 11:59:04.341624335 +0000 UTC m=+123.329367201 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.841788 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:03 crc kubenswrapper[4790]: E0406 11:59:03.842235 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-04-06 11:59:04.34222538 +0000 UTC m=+123.329968246 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-l7zk6" (UID: "9dd58353-6603-48cf-9767-11235ba23164") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.845667 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.859155 4790 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.859207 4790 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.928329 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-42rgx"] Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.929343 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42rgx" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.933955 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.949647 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.956063 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-42rgx"] Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.969972 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Apr 06 11:59:03 crc kubenswrapper[4790]: I0406 11:59:03.980348 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.004461 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67ff4c58b7-wktt2"] Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.020446 4790 generic.go:334] "Generic (PLEG): container finished" podID="c176f8e8-2902-4da6-b779-d0426b68e715" containerID="504ee680158d73e888a8ee328def1f97e4a6eae392fdbf5184f050b6c825c79c" exitCode=0 Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.020580 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hfrm" event={"ID":"c176f8e8-2902-4da6-b779-d0426b68e715","Type":"ContainerDied","Data":"504ee680158d73e888a8ee328def1f97e4a6eae392fdbf5184f050b6c825c79c"} Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.020648 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hfrm" event={"ID":"c176f8e8-2902-4da6-b779-d0426b68e715","Type":"ContainerStarted","Data":"e89d54719360f164ec11aa844316310401b8f2f9b3e0b2151a4d19ea448cc585"} Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.024341 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.026943 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.028063 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kp5kl" event={"ID":"5ee57c19-af8c-4e40-ae89-9cfb07ad24d7","Type":"ContainerDied","Data":"e7b1b8d395f94018f9b8fc8c41b587923e6d8fbbe150c8dc432128012d3c0793"} Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.028148 4790 scope.go:117] "RemoveContainer" containerID="187b5f046e3965145526abb279cc7307d17a5da8eff0be04cca2fd0e7af0df95" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.032387 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" event={"ID":"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d","Type":"ContainerStarted","Data":"aefb2d3634473472176473549965b9b8eff15ec9ba7adc8b00e32224cc1995f1"} Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.032816 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.039098 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dbf7adf1-0126-47d9-8ddf-e4d329eda260","Type":"ContainerStarted","Data":"563bc31d634579ac83d8eb1ed14e07c43faf0ed5b49f804e6dd5c7c5c613e3b0"} Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.048995 4790 generic.go:334] "Generic (PLEG): container finished" podID="b9685a39-63cf-47d3-b5fe-9113d55676d4" containerID="c37428c3574426fa6b0491a4889235fd1cac525bf01b619543194d5ac9179d1b" exitCode=0 Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.049119 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-msclk" 
event={"ID":"b9685a39-63cf-47d3-b5fe-9113d55676d4","Type":"ContainerDied","Data":"c37428c3574426fa6b0491a4889235fd1cac525bf01b619543194d5ac9179d1b"} Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.049170 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-msclk" event={"ID":"b9685a39-63cf-47d3-b5fe-9113d55676d4","Type":"ContainerStarted","Data":"fcf639a4b4c5e6664b7789ea1435541473ab7464b764b33cb3764a8c60b6cbec"} Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.051179 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.051271 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a30678d8-35eb-4863-a856-096864c2a9b1-utilities\") pod \"redhat-marketplace-42rgx\" (UID: \"a30678d8-35eb-4863-a856-096864c2a9b1\") " pod="openshift-marketplace/redhat-marketplace-42rgx" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.051307 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5ndv\" (UniqueName: \"kubernetes.io/projected/a30678d8-35eb-4863-a856-096864c2a9b1-kube-api-access-h5ndv\") pod \"redhat-marketplace-42rgx\" (UID: \"a30678d8-35eb-4863-a856-096864c2a9b1\") " pod="openshift-marketplace/redhat-marketplace-42rgx" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.051355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a30678d8-35eb-4863-a856-096864c2a9b1-catalog-content\") pod \"redhat-marketplace-42rgx\" (UID: \"a30678d8-35eb-4863-a856-096864c2a9b1\") " pod="openshift-marketplace/redhat-marketplace-42rgx" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.053760 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a09689be27c5106504a86effad7f0a6a45face79ffd60954d4fd70fea044d40d"} Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.054488 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.054532 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.055134 4790 generic.go:334] "Generic (PLEG): container finished" podID="25067a9b-e553-4a7a-abdb-226567079c15" containerID="a22e5cc6265eb54ed0f5a8def9fbf79c270ba00e8ce3097625ab07446a601ea8" exitCode=0 Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.055188 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mfnk" event={"ID":"25067a9b-e553-4a7a-abdb-226567079c15","Type":"ContainerDied","Data":"a22e5cc6265eb54ed0f5a8def9fbf79c270ba00e8ce3097625ab07446a601ea8"} Apr 06 11:59:04 crc kubenswrapper[4790]: 
I0406 11:59:04.055204 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mfnk" event={"ID":"25067a9b-e553-4a7a-abdb-226567079c15","Type":"ContainerStarted","Data":"cfd05a1e3b46d3495738ac36fab286aa110ae634691e8425424daf167cbabed3"} Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.059988 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" event={"ID":"b8d6da06-7170-416d-8a93-b3f9fa890fb1","Type":"ContainerStarted","Data":"625e0c63d9e1f01ede75fabfd637e4ca12852ffd9264f549429d75b8fc9507d4"} Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.060515 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" podStartSLOduration=5.060504869 podStartE2EDuration="5.060504869s" podCreationTimestamp="2026-04-06 11:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:04.05895844 +0000 UTC m=+123.046701326" watchObservedRunningTime="2026-04-06 11:59:04.060504869 +0000 UTC m=+123.048247735" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.067969 4790 generic.go:334] "Generic (PLEG): container finished" podID="311eb251-79b6-4e1e-a3a7-456322ca133e" containerID="f51dd7c3048ce8808968f63a200bec353b5bcf8f91bb149d8d5301374d26ef8e" exitCode=0 Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.069158 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96trx" event={"ID":"311eb251-79b6-4e1e-a3a7-456322ca133e","Type":"ContainerDied","Data":"f51dd7c3048ce8808968f63a200bec353b5bcf8f91bb149d8d5301374d26ef8e"} Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.073590 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kp5kl"] Apr 06 11:59:04 crc 
kubenswrapper[4790]: I0406 11:59:04.077528 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kp5kl"] Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.095814 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 06 11:59:04 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Apr 06 11:59:04 crc kubenswrapper[4790]: [+]process-running ok Apr 06 11:59:04 crc kubenswrapper[4790]: healthz check failed Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.095873 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.104641 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-7c84bc48c-99p72"] Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.113059 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-7c84bc48c-99p72" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.110569 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-l7zk6\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.120216 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-7c84bc48c-99p72"] Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.125682 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.125846 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.125980 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.126051 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.126169 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.157161 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a30678d8-35eb-4863-a856-096864c2a9b1-utilities\") pod \"redhat-marketplace-42rgx\" (UID: \"a30678d8-35eb-4863-a856-096864c2a9b1\") " pod="openshift-marketplace/redhat-marketplace-42rgx" Apr 06 11:59:04 crc 
kubenswrapper[4790]: I0406 11:59:04.157214 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5ndv\" (UniqueName: \"kubernetes.io/projected/a30678d8-35eb-4863-a856-096864c2a9b1-kube-api-access-h5ndv\") pod \"redhat-marketplace-42rgx\" (UID: \"a30678d8-35eb-4863-a856-096864c2a9b1\") " pod="openshift-marketplace/redhat-marketplace-42rgx" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.157256 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a30678d8-35eb-4863-a856-096864c2a9b1-catalog-content\") pod \"redhat-marketplace-42rgx\" (UID: \"a30678d8-35eb-4863-a856-096864c2a9b1\") " pod="openshift-marketplace/redhat-marketplace-42rgx" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.159940 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a30678d8-35eb-4863-a856-096864c2a9b1-utilities\") pod \"redhat-marketplace-42rgx\" (UID: \"a30678d8-35eb-4863-a856-096864c2a9b1\") " pod="openshift-marketplace/redhat-marketplace-42rgx" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.161290 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a30678d8-35eb-4863-a856-096864c2a9b1-catalog-content\") pod \"redhat-marketplace-42rgx\" (UID: \"a30678d8-35eb-4863-a856-096864c2a9b1\") " pod="openshift-marketplace/redhat-marketplace-42rgx" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.182324 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.189091 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5ndv\" (UniqueName: 
\"kubernetes.io/projected/a30678d8-35eb-4863-a856-096864c2a9b1-kube-api-access-h5ndv\") pod \"redhat-marketplace-42rgx\" (UID: \"a30678d8-35eb-4863-a856-096864c2a9b1\") " pod="openshift-marketplace/redhat-marketplace-42rgx" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.214893 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-gz7ps" podStartSLOduration=13.214864722 podStartE2EDuration="13.214864722s" podCreationTimestamp="2026-04-06 11:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:04.212053112 +0000 UTC m=+123.199795978" watchObservedRunningTime="2026-04-06 11:59:04.214864722 +0000 UTC m=+123.202607588" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.261161 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/58da2fd1-160f-42ef-b4a0-18dbe05a2cf3-signing-key\") pod \"service-ca-7c84bc48c-99p72\" (UID: \"58da2fd1-160f-42ef-b4a0-18dbe05a2cf3\") " pod="openshift-service-ca/service-ca-7c84bc48c-99p72" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.261480 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/58da2fd1-160f-42ef-b4a0-18dbe05a2cf3-signing-cabundle\") pod \"service-ca-7c84bc48c-99p72\" (UID: \"58da2fd1-160f-42ef-b4a0-18dbe05a2cf3\") " pod="openshift-service-ca/service-ca-7c84bc48c-99p72" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.261605 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7bs9\" (UniqueName: \"kubernetes.io/projected/58da2fd1-160f-42ef-b4a0-18dbe05a2cf3-kube-api-access-x7bs9\") pod \"service-ca-7c84bc48c-99p72\" (UID: \"58da2fd1-160f-42ef-b4a0-18dbe05a2cf3\") 
" pod="openshift-service-ca/service-ca-7c84bc48c-99p72" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.262177 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42rgx" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.267080 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.267521 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.285117 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Apr 06 11:59:04 crc kubenswrapper[4790]: [+]log ok Apr 06 11:59:04 crc kubenswrapper[4790]: [+]etcd ok Apr 06 11:59:04 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Apr 06 11:59:04 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Apr 06 11:59:04 crc kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Apr 06 11:59:04 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Apr 06 11:59:04 crc kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Apr 06 11:59:04 crc kubenswrapper[4790]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Apr 06 11:59:04 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Apr 06 11:59:04 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok Apr 06 11:59:04 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Apr 06 11:59:04 crc kubenswrapper[4790]: 
[+]poststarthook/openshift.io-startinformers ok Apr 06 11:59:04 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Apr 06 11:59:04 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Apr 06 11:59:04 crc kubenswrapper[4790]: livez check failed Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.285811 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.292987 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.299082 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.300232 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-ql4f9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.300390 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ql4f9" podUID="dc2f1a83-c329-47b9-98d3-08104ce2323c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.300237 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-ql4f9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: 
connection refused" start-of-body= Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.300623 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ql4f9" podUID="dc2f1a83-c329-47b9-98d3-08104ce2323c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.316194 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.316250 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.334644 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-klccw"] Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.338406 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.344883 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.372920 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/58da2fd1-160f-42ef-b4a0-18dbe05a2cf3-signing-key\") pod \"service-ca-7c84bc48c-99p72\" (UID: \"58da2fd1-160f-42ef-b4a0-18dbe05a2cf3\") " pod="openshift-service-ca/service-ca-7c84bc48c-99p72" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.376613 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/58da2fd1-160f-42ef-b4a0-18dbe05a2cf3-signing-cabundle\") pod \"service-ca-7c84bc48c-99p72\" (UID: \"58da2fd1-160f-42ef-b4a0-18dbe05a2cf3\") " pod="openshift-service-ca/service-ca-7c84bc48c-99p72" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.376680 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7bs9\" (UniqueName: \"kubernetes.io/projected/58da2fd1-160f-42ef-b4a0-18dbe05a2cf3-kube-api-access-x7bs9\") pod \"service-ca-7c84bc48c-99p72\" (UID: \"58da2fd1-160f-42ef-b4a0-18dbe05a2cf3\") " pod="openshift-service-ca/service-ca-7c84bc48c-99p72" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.373562 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-klccw"] Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.390333 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/58da2fd1-160f-42ef-b4a0-18dbe05a2cf3-signing-key\") pod \"service-ca-7c84bc48c-99p72\" (UID: \"58da2fd1-160f-42ef-b4a0-18dbe05a2cf3\") " 
pod="openshift-service-ca/service-ca-7c84bc48c-99p72" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.391682 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/58da2fd1-160f-42ef-b4a0-18dbe05a2cf3-signing-cabundle\") pod \"service-ca-7c84bc48c-99p72\" (UID: \"58da2fd1-160f-42ef-b4a0-18dbe05a2cf3\") " pod="openshift-service-ca/service-ca-7c84bc48c-99p72" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.415911 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7bs9\" (UniqueName: \"kubernetes.io/projected/58da2fd1-160f-42ef-b4a0-18dbe05a2cf3-kube-api-access-x7bs9\") pod \"service-ca-7c84bc48c-99p72\" (UID: \"58da2fd1-160f-42ef-b4a0-18dbe05a2cf3\") " pod="openshift-service-ca/service-ca-7c84bc48c-99p72" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.473268 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-7c84bc48c-99p72" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.492589 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spzc9\" (UniqueName: \"kubernetes.io/projected/cc03077c-04a5-4562-b371-78270cf891ac-kube-api-access-spzc9\") pod \"redhat-marketplace-klccw\" (UID: \"cc03077c-04a5-4562-b371-78270cf891ac\") " pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.492688 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03077c-04a5-4562-b371-78270cf891ac-catalog-content\") pod \"redhat-marketplace-klccw\" (UID: \"cc03077c-04a5-4562-b371-78270cf891ac\") " pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.492811 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03077c-04a5-4562-b371-78270cf891ac-utilities\") pod \"redhat-marketplace-klccw\" (UID: \"cc03077c-04a5-4562-b371-78270cf891ac\") " pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.594977 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03077c-04a5-4562-b371-78270cf891ac-catalog-content\") pod \"redhat-marketplace-klccw\" (UID: \"cc03077c-04a5-4562-b371-78270cf891ac\") " pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.595519 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03077c-04a5-4562-b371-78270cf891ac-utilities\") pod \"redhat-marketplace-klccw\" (UID: \"cc03077c-04a5-4562-b371-78270cf891ac\") " pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.595571 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spzc9\" (UniqueName: \"kubernetes.io/projected/cc03077c-04a5-4562-b371-78270cf891ac-kube-api-access-spzc9\") pod \"redhat-marketplace-klccw\" (UID: \"cc03077c-04a5-4562-b371-78270cf891ac\") " pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.599323 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03077c-04a5-4562-b371-78270cf891ac-utilities\") pod \"redhat-marketplace-klccw\" (UID: \"cc03077c-04a5-4562-b371-78270cf891ac\") " pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.600984 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03077c-04a5-4562-b371-78270cf891ac-catalog-content\") pod \"redhat-marketplace-klccw\" (UID: \"cc03077c-04a5-4562-b371-78270cf891ac\") " pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.656922 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.657994 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.660255 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-l7zk6"] Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.664945 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spzc9\" (UniqueName: \"kubernetes.io/projected/cc03077c-04a5-4562-b371-78270cf891ac-kube-api-access-spzc9\") pod \"redhat-marketplace-klccw\" (UID: \"cc03077c-04a5-4562-b371-78270cf891ac\") " pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.673525 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.673881 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.677452 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.678249 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.685355 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.685383 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.704715 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.711207 4790 patch_prober.go:28] interesting pod/console-f9d7485db-dkg9c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.711281 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dkg9c" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.797536 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44a07230-1da3-4f52-a189-73910be65825-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"44a07230-1da3-4f52-a189-73910be65825\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.797620 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44a07230-1da3-4f52-a189-73910be65825-kubelet-dir\") pod 
\"revision-pruner-8-crc\" (UID: \"44a07230-1da3-4f52-a189-73910be65825\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.801183 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2mbnf" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.803750 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.831930 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.831911257 podStartE2EDuration="831.911257ms" podCreationTimestamp="2026-04-06 11:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:04.826054141 +0000 UTC m=+123.813797007" watchObservedRunningTime="2026-04-06 11:59:04.831911257 +0000 UTC m=+123.819654123" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.848673 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-42rgx"] Apr 06 11:59:04 crc kubenswrapper[4790]: W0406 11:59:04.858896 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda30678d8_35eb_4863_a856_096864c2a9b1.slice/crio-7570d0ab83fef6b86563b7a0e5579496e4cbea0cdf47886acee1918dd236cb39 WatchSource:0}: Error finding container 7570d0ab83fef6b86563b7a0e5579496e4cbea0cdf47886acee1918dd236cb39: Status 404 returned error can't find the container with id 7570d0ab83fef6b86563b7a0e5579496e4cbea0cdf47886acee1918dd236cb39 Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.896588 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-trfn8" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.898554 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44a07230-1da3-4f52-a189-73910be65825-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"44a07230-1da3-4f52-a189-73910be65825\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.898638 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44a07230-1da3-4f52-a189-73910be65825-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"44a07230-1da3-4f52-a189-73910be65825\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.900118 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44a07230-1da3-4f52-a189-73910be65825-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"44a07230-1da3-4f52-a189-73910be65825\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.931150 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44a07230-1da3-4f52-a189-73910be65825-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"44a07230-1da3-4f52-a189-73910be65825\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 06 11:59:04 crc kubenswrapper[4790]: I0406 11:59:04.932343 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:04.996111 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.086918 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-klccw"] Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.086964 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" event={"ID":"78a77943-529d-499f-bf5b-3c2dd7451089","Type":"ContainerStarted","Data":"c2ff9f9afbaad6c12b6e19cba2b11a132bc9e9f3bee50e1b26007f368001d036"} Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.086983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" event={"ID":"78a77943-529d-499f-bf5b-3c2dd7451089","Type":"ContainerStarted","Data":"06667daa053be4288a05ebfb2a5b9e61116509d26981eca9316f5a84185689e5"} Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.087355 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.089539 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-895qp" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.092308 4790 generic.go:334] "Generic (PLEG): container finished" podID="dbf7adf1-0126-47d9-8ddf-e4d329eda260" containerID="31ceca7913864e849b88c19acbd9f97dc58bb725b14fcd8f3d27b3a75b1076cf" exitCode=0 Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.092458 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dbf7adf1-0126-47d9-8ddf-e4d329eda260","Type":"ContainerDied","Data":"31ceca7913864e849b88c19acbd9f97dc58bb725b14fcd8f3d27b3a75b1076cf"} Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.093736 4790 patch_prober.go:28] 
interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 06 11:59:05 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Apr 06 11:59:05 crc kubenswrapper[4790]: [+]process-running ok Apr 06 11:59:05 crc kubenswrapper[4790]: healthz check failed Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.093783 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.097028 4790 generic.go:334] "Generic (PLEG): container finished" podID="a30678d8-35eb-4863-a856-096864c2a9b1" containerID="811a0ac01c70ff2567c76cd756cf3f97f3e733c4d738ebed2962f868b51cd02f" exitCode=0 Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.097528 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42rgx" event={"ID":"a30678d8-35eb-4863-a856-096864c2a9b1","Type":"ContainerDied","Data":"811a0ac01c70ff2567c76cd756cf3f97f3e733c4d738ebed2962f868b51cd02f"} Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.097689 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42rgx" event={"ID":"a30678d8-35eb-4863-a856-096864c2a9b1","Type":"ContainerStarted","Data":"7570d0ab83fef6b86563b7a0e5579496e4cbea0cdf47886acee1918dd236cb39"} Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.100582 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.115910 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" podStartSLOduration=6.115887967 podStartE2EDuration="6.115887967s" podCreationTimestamp="2026-04-06 11:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:05.114055951 +0000 UTC m=+124.101798827" watchObservedRunningTime="2026-04-06 11:59:05.115887967 +0000 UTC m=+124.103630833" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.119541 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" event={"ID":"9dd58353-6603-48cf-9767-11235ba23164","Type":"ContainerStarted","Data":"c60871dc3aa14258ba9c1ff1ac3ffe3cb06ddb96a1e2144c9c1ba0103d28ec69"} Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.119572 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" event={"ID":"9dd58353-6603-48cf-9767-11235ba23164","Type":"ContainerStarted","Data":"4053189cfb469a78b598614688c7738fe9b5e50e5b6d6f04de4a6e3f498e6ae1"} Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.119587 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.130292 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8gzk" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.138777 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-7c84bc48c-99p72"] Apr 06 11:59:05 crc kubenswrapper[4790]: W0406 11:59:05.155282 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc03077c_04a5_4562_b371_78270cf891ac.slice/crio-8ed513715a10badd4d02bee2e84210ec64bea6d4de5821b97ad4940fd7184d38 
WatchSource:0}: Error finding container 8ed513715a10badd4d02bee2e84210ec64bea6d4de5821b97ad4940fd7184d38: Status 404 returned error can't find the container with id 8ed513715a10badd4d02bee2e84210ec64bea6d4de5821b97ad4940fd7184d38 Apr 06 11:59:05 crc kubenswrapper[4790]: W0406 11:59:05.164323 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58da2fd1_160f_42ef_b4a0_18dbe05a2cf3.slice/crio-67dbaf3926be0e77c47ec3f7c5f78fe16d8d94957d2be10e5c653f35dd52403d WatchSource:0}: Error finding container 67dbaf3926be0e77c47ec3f7c5f78fe16d8d94957d2be10e5c653f35dd52403d: Status 404 returned error can't find the container with id 67dbaf3926be0e77c47ec3f7c5f78fe16d8d94957d2be10e5c653f35dd52403d Apr 06 11:59:05 crc kubenswrapper[4790]: E0406 11:59:05.257892 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 11:59:05 crc kubenswrapper[4790]: E0406 11:59:05.267681 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.278425 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" podStartSLOduration=89.278399734 podStartE2EDuration="1m29.278399734s" podCreationTimestamp="2026-04-06 11:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 
11:59:05.260503317 +0000 UTC m=+124.248246203" watchObservedRunningTime="2026-04-06 11:59:05.278399734 +0000 UTC m=+124.266142600" Apr 06 11:59:05 crc kubenswrapper[4790]: E0406 11:59:05.300780 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 11:59:05 crc kubenswrapper[4790]: E0406 11:59:05.300859 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" podUID="e1aab57b-c8ed-492c-91d3-e190449713a0" containerName="kube-multus-additional-cni-plugins" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.342370 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mlwwf"] Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.357068 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mlwwf" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.362812 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.374402 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mlwwf"] Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.433007 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Apr 06 11:59:05 crc kubenswrapper[4790]: W0406 11:59:05.486442 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod44a07230_1da3_4f52_a189_73910be65825.slice/crio-4a3c3d2628dabca54fe5fad5f63688e156fefe09b430d9c1b688fc4d819e17d5 WatchSource:0}: Error finding container 4a3c3d2628dabca54fe5fad5f63688e156fefe09b430d9c1b688fc4d819e17d5: Status 404 returned error can't find the container with id 4a3c3d2628dabca54fe5fad5f63688e156fefe09b430d9c1b688fc4d819e17d5 Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.518500 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4be7580-8cec-4726-940d-36fb8575b791-utilities\") pod \"redhat-operators-mlwwf\" (UID: \"c4be7580-8cec-4726-940d-36fb8575b791\") " pod="openshift-marketplace/redhat-operators-mlwwf" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.518589 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4be7580-8cec-4726-940d-36fb8575b791-catalog-content\") pod \"redhat-operators-mlwwf\" (UID: \"c4be7580-8cec-4726-940d-36fb8575b791\") " pod="openshift-marketplace/redhat-operators-mlwwf" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.518705 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9zzq\" (UniqueName: \"kubernetes.io/projected/c4be7580-8cec-4726-940d-36fb8575b791-kube-api-access-k9zzq\") pod \"redhat-operators-mlwwf\" (UID: \"c4be7580-8cec-4726-940d-36fb8575b791\") " pod="openshift-marketplace/redhat-operators-mlwwf" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.619898 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4be7580-8cec-4726-940d-36fb8575b791-catalog-content\") pod \"redhat-operators-mlwwf\" (UID: \"c4be7580-8cec-4726-940d-36fb8575b791\") " pod="openshift-marketplace/redhat-operators-mlwwf" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.619969 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9zzq\" (UniqueName: \"kubernetes.io/projected/c4be7580-8cec-4726-940d-36fb8575b791-kube-api-access-k9zzq\") pod \"redhat-operators-mlwwf\" (UID: \"c4be7580-8cec-4726-940d-36fb8575b791\") " pod="openshift-marketplace/redhat-operators-mlwwf" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.620010 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4be7580-8cec-4726-940d-36fb8575b791-utilities\") pod \"redhat-operators-mlwwf\" (UID: \"c4be7580-8cec-4726-940d-36fb8575b791\") " pod="openshift-marketplace/redhat-operators-mlwwf" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.620562 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4be7580-8cec-4726-940d-36fb8575b791-utilities\") pod \"redhat-operators-mlwwf\" (UID: \"c4be7580-8cec-4726-940d-36fb8575b791\") " pod="openshift-marketplace/redhat-operators-mlwwf" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.620821 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4be7580-8cec-4726-940d-36fb8575b791-catalog-content\") pod \"redhat-operators-mlwwf\" (UID: \"c4be7580-8cec-4726-940d-36fb8575b791\") " pod="openshift-marketplace/redhat-operators-mlwwf" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.644797 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9zzq\" (UniqueName: \"kubernetes.io/projected/c4be7580-8cec-4726-940d-36fb8575b791-kube-api-access-k9zzq\") pod \"redhat-operators-mlwwf\" (UID: \"c4be7580-8cec-4726-940d-36fb8575b791\") " pod="openshift-marketplace/redhat-operators-mlwwf" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.684807 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ee57c19-af8c-4e40-ae89-9cfb07ad24d7" path="/var/lib/kubelet/pods/5ee57c19-af8c-4e40-ae89-9cfb07ad24d7/volumes" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.687283 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.710497 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.725923 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mlwwf" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.728764 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gr22k"] Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.729797 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.748105 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gr22k"] Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.825704 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f195-5c23-43d2-abba-e4c825b4450e-catalog-content\") pod \"redhat-operators-gr22k\" (UID: \"4aa8f195-5c23-43d2-abba-e4c825b4450e\") " pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.825821 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f195-5c23-43d2-abba-e4c825b4450e-utilities\") pod \"redhat-operators-gr22k\" (UID: \"4aa8f195-5c23-43d2-abba-e4c825b4450e\") " pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.825883 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdbgc\" (UniqueName: \"kubernetes.io/projected/4aa8f195-5c23-43d2-abba-e4c825b4450e-kube-api-access-cdbgc\") pod \"redhat-operators-gr22k\" (UID: \"4aa8f195-5c23-43d2-abba-e4c825b4450e\") " pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.926762 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f195-5c23-43d2-abba-e4c825b4450e-catalog-content\") pod \"redhat-operators-gr22k\" (UID: \"4aa8f195-5c23-43d2-abba-e4c825b4450e\") " pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.926820 4790 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f195-5c23-43d2-abba-e4c825b4450e-utilities\") pod \"redhat-operators-gr22k\" (UID: \"4aa8f195-5c23-43d2-abba-e4c825b4450e\") " pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.926884 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdbgc\" (UniqueName: \"kubernetes.io/projected/4aa8f195-5c23-43d2-abba-e4c825b4450e-kube-api-access-cdbgc\") pod \"redhat-operators-gr22k\" (UID: \"4aa8f195-5c23-43d2-abba-e4c825b4450e\") " pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.927620 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f195-5c23-43d2-abba-e4c825b4450e-catalog-content\") pod \"redhat-operators-gr22k\" (UID: \"4aa8f195-5c23-43d2-abba-e4c825b4450e\") " pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.927861 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f195-5c23-43d2-abba-e4c825b4450e-utilities\") pod \"redhat-operators-gr22k\" (UID: \"4aa8f195-5c23-43d2-abba-e4c825b4450e\") " pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 11:59:05 crc kubenswrapper[4790]: I0406 11:59:05.952899 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdbgc\" (UniqueName: \"kubernetes.io/projected/4aa8f195-5c23-43d2-abba-e4c825b4450e-kube-api-access-cdbgc\") pod \"redhat-operators-gr22k\" (UID: \"4aa8f195-5c23-43d2-abba-e4c825b4450e\") " pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 11:59:06 crc kubenswrapper[4790]: I0406 11:59:06.055298 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 11:59:06 crc kubenswrapper[4790]: I0406 11:59:06.094041 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 06 11:59:06 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Apr 06 11:59:06 crc kubenswrapper[4790]: [+]process-running ok Apr 06 11:59:06 crc kubenswrapper[4790]: healthz check failed Apr 06 11:59:06 crc kubenswrapper[4790]: I0406 11:59:06.094398 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 11:59:06 crc kubenswrapper[4790]: I0406 11:59:06.130075 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-7c84bc48c-99p72" event={"ID":"58da2fd1-160f-42ef-b4a0-18dbe05a2cf3","Type":"ContainerStarted","Data":"5c93b8514e760b1cdb7c2b7b0b3a9efb5e87f1c53a26785bfd80cd19e3160359"} Apr 06 11:59:06 crc kubenswrapper[4790]: I0406 11:59:06.130150 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-7c84bc48c-99p72" event={"ID":"58da2fd1-160f-42ef-b4a0-18dbe05a2cf3","Type":"ContainerStarted","Data":"67dbaf3926be0e77c47ec3f7c5f78fe16d8d94957d2be10e5c653f35dd52403d"} Apr 06 11:59:06 crc kubenswrapper[4790]: I0406 11:59:06.132641 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"44a07230-1da3-4f52-a189-73910be65825","Type":"ContainerStarted","Data":"4a3c3d2628dabca54fe5fad5f63688e156fefe09b430d9c1b688fc4d819e17d5"} Apr 06 11:59:06 crc kubenswrapper[4790]: I0406 11:59:06.148070 4790 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=1.148045883 podStartE2EDuration="1.148045883s" podCreationTimestamp="2026-04-06 11:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:05.790715823 +0000 UTC m=+124.778458689" watchObservedRunningTime="2026-04-06 11:59:06.148045883 +0000 UTC m=+125.135788749" Apr 06 11:59:06 crc kubenswrapper[4790]: I0406 11:59:06.150286 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-7c84bc48c-99p72" podStartSLOduration=2.150276379 podStartE2EDuration="2.150276379s" podCreationTimestamp="2026-04-06 11:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:06.14670258 +0000 UTC m=+125.134445446" watchObservedRunningTime="2026-04-06 11:59:06.150276379 +0000 UTC m=+125.138019245" Apr 06 11:59:06 crc kubenswrapper[4790]: I0406 11:59:06.163210 4790 generic.go:334] "Generic (PLEG): container finished" podID="cc03077c-04a5-4562-b371-78270cf891ac" containerID="0a5cc8b0074fd6a7c1bc22e4f580bcdb07df2c4f9420be51fc323f06597cbee0" exitCode=0 Apr 06 11:59:06 crc kubenswrapper[4790]: I0406 11:59:06.163482 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klccw" event={"ID":"cc03077c-04a5-4562-b371-78270cf891ac","Type":"ContainerDied","Data":"0a5cc8b0074fd6a7c1bc22e4f580bcdb07df2c4f9420be51fc323f06597cbee0"} Apr 06 11:59:06 crc kubenswrapper[4790]: I0406 11:59:06.163516 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klccw" event={"ID":"cc03077c-04a5-4562-b371-78270cf891ac","Type":"ContainerStarted","Data":"8ed513715a10badd4d02bee2e84210ec64bea6d4de5821b97ad4940fd7184d38"} Apr 06 11:59:06 crc kubenswrapper[4790]: I0406 
11:59:06.205639 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mlwwf"] Apr 06 11:59:06 crc kubenswrapper[4790]: W0406 11:59:06.341767 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4be7580_8cec_4726_940d_36fb8575b791.slice/crio-270b28f6fe312490fbfa019515114a0a786ec882422dbfe1b890cf6727e2aa45 WatchSource:0}: Error finding container 270b28f6fe312490fbfa019515114a0a786ec882422dbfe1b890cf6727e2aa45: Status 404 returned error can't find the container with id 270b28f6fe312490fbfa019515114a0a786ec882422dbfe1b890cf6727e2aa45 Apr 06 11:59:07 crc kubenswrapper[4790]: I0406 11:59:07.102302 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 06 11:59:07 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Apr 06 11:59:07 crc kubenswrapper[4790]: [+]process-running ok Apr 06 11:59:07 crc kubenswrapper[4790]: healthz check failed Apr 06 11:59:07 crc kubenswrapper[4790]: I0406 11:59:07.102678 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 11:59:07 crc kubenswrapper[4790]: I0406 11:59:07.169116 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gr22k"] Apr 06 11:59:07 crc kubenswrapper[4790]: I0406 11:59:07.185928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlwwf" event={"ID":"c4be7580-8cec-4726-940d-36fb8575b791","Type":"ContainerStarted","Data":"270b28f6fe312490fbfa019515114a0a786ec882422dbfe1b890cf6727e2aa45"} Apr 06 11:59:07 
crc kubenswrapper[4790]: I0406 11:59:07.190077 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"44a07230-1da3-4f52-a189-73910be65825","Type":"ContainerStarted","Data":"dbfb8bb9c6747a25f1c207b5473a5a1be310871a3e02caa2420eb8f18ac39271"} Apr 06 11:59:07 crc kubenswrapper[4790]: I0406 11:59:07.219361 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.219337916 podStartE2EDuration="3.219337916s" podCreationTimestamp="2026-04-06 11:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:07.217329376 +0000 UTC m=+126.205072232" watchObservedRunningTime="2026-04-06 11:59:07.219337916 +0000 UTC m=+126.207080792" Apr 06 11:59:07 crc kubenswrapper[4790]: I0406 11:59:07.268989 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 06 11:59:07 crc kubenswrapper[4790]: I0406 11:59:07.445235 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbf7adf1-0126-47d9-8ddf-e4d329eda260-kubelet-dir\") pod \"dbf7adf1-0126-47d9-8ddf-e4d329eda260\" (UID: \"dbf7adf1-0126-47d9-8ddf-e4d329eda260\") " Apr 06 11:59:07 crc kubenswrapper[4790]: I0406 11:59:07.445371 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbf7adf1-0126-47d9-8ddf-e4d329eda260-kube-api-access\") pod \"dbf7adf1-0126-47d9-8ddf-e4d329eda260\" (UID: \"dbf7adf1-0126-47d9-8ddf-e4d329eda260\") " Apr 06 11:59:07 crc kubenswrapper[4790]: I0406 11:59:07.445373 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbf7adf1-0126-47d9-8ddf-e4d329eda260-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dbf7adf1-0126-47d9-8ddf-e4d329eda260" (UID: "dbf7adf1-0126-47d9-8ddf-e4d329eda260"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 11:59:07 crc kubenswrapper[4790]: I0406 11:59:07.445585 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbf7adf1-0126-47d9-8ddf-e4d329eda260-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:07 crc kubenswrapper[4790]: I0406 11:59:07.452513 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf7adf1-0126-47d9-8ddf-e4d329eda260-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dbf7adf1-0126-47d9-8ddf-e4d329eda260" (UID: "dbf7adf1-0126-47d9-8ddf-e4d329eda260"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:07 crc kubenswrapper[4790]: I0406 11:59:07.546722 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbf7adf1-0126-47d9-8ddf-e4d329eda260-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:07 crc kubenswrapper[4790]: I0406 11:59:07.800523 4790 ???:1] "http: TLS handshake error from 192.168.126.11:58332: no serving certificate available for the kubelet" Apr 06 11:59:08 crc kubenswrapper[4790]: I0406 11:59:08.092213 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 06 11:59:08 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Apr 06 11:59:08 crc kubenswrapper[4790]: [+]process-running ok Apr 06 11:59:08 crc kubenswrapper[4790]: healthz check failed Apr 06 11:59:08 crc kubenswrapper[4790]: I0406 11:59:08.092269 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 11:59:08 crc kubenswrapper[4790]: I0406 11:59:08.210886 4790 generic.go:334] "Generic (PLEG): container finished" podID="44a07230-1da3-4f52-a189-73910be65825" containerID="dbfb8bb9c6747a25f1c207b5473a5a1be310871a3e02caa2420eb8f18ac39271" exitCode=0 Apr 06 11:59:08 crc kubenswrapper[4790]: I0406 11:59:08.211211 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"44a07230-1da3-4f52-a189-73910be65825","Type":"ContainerDied","Data":"dbfb8bb9c6747a25f1c207b5473a5a1be310871a3e02caa2420eb8f18ac39271"} Apr 06 11:59:08 crc kubenswrapper[4790]: I0406 11:59:08.218935 4790 ???:1] "http: TLS 
handshake error from 192.168.126.11:58334: no serving certificate available for the kubelet" Apr 06 11:59:08 crc kubenswrapper[4790]: I0406 11:59:08.237505 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr22k" event={"ID":"4aa8f195-5c23-43d2-abba-e4c825b4450e","Type":"ContainerStarted","Data":"f60e0e45c4d785f21ce9d86907fff0803dad4dd58e77ab5ce9e5fb33d847aa5a"} Apr 06 11:59:08 crc kubenswrapper[4790]: I0406 11:59:08.237550 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr22k" event={"ID":"4aa8f195-5c23-43d2-abba-e4c825b4450e","Type":"ContainerStarted","Data":"90253820fa37c973dfb4a2d5e58563587a90f2dfb2258dbcaf6bcd1653f9abbc"} Apr 06 11:59:08 crc kubenswrapper[4790]: I0406 11:59:08.240853 4790 generic.go:334] "Generic (PLEG): container finished" podID="c4be7580-8cec-4726-940d-36fb8575b791" containerID="e24b7222e26715c0c52ebb3b061305cc5878640a474acc1ceae7935cf4c4df60" exitCode=0 Apr 06 11:59:08 crc kubenswrapper[4790]: I0406 11:59:08.240950 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlwwf" event={"ID":"c4be7580-8cec-4726-940d-36fb8575b791","Type":"ContainerDied","Data":"e24b7222e26715c0c52ebb3b061305cc5878640a474acc1ceae7935cf4c4df60"} Apr 06 11:59:08 crc kubenswrapper[4790]: I0406 11:59:08.243609 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dbf7adf1-0126-47d9-8ddf-e4d329eda260","Type":"ContainerDied","Data":"563bc31d634579ac83d8eb1ed14e07c43faf0ed5b49f804e6dd5c7c5c613e3b0"} Apr 06 11:59:08 crc kubenswrapper[4790]: I0406 11:59:08.243642 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="563bc31d634579ac83d8eb1ed14e07c43faf0ed5b49f804e6dd5c7c5c613e3b0" Apr 06 11:59:08 crc kubenswrapper[4790]: I0406 11:59:08.243672 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Apr 06 11:59:09 crc kubenswrapper[4790]: I0406 11:59:09.092655 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Apr 06 11:59:09 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Apr 06 11:59:09 crc kubenswrapper[4790]: [+]process-running ok Apr 06 11:59:09 crc kubenswrapper[4790]: healthz check failed Apr 06 11:59:09 crc kubenswrapper[4790]: I0406 11:59:09.092713 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 11:59:09 crc kubenswrapper[4790]: I0406 11:59:09.268754 4790 generic.go:334] "Generic (PLEG): container finished" podID="4aa8f195-5c23-43d2-abba-e4c825b4450e" containerID="f60e0e45c4d785f21ce9d86907fff0803dad4dd58e77ab5ce9e5fb33d847aa5a" exitCode=0 Apr 06 11:59:09 crc kubenswrapper[4790]: I0406 11:59:09.269809 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr22k" event={"ID":"4aa8f195-5c23-43d2-abba-e4c825b4450e","Type":"ContainerDied","Data":"f60e0e45c4d785f21ce9d86907fff0803dad4dd58e77ab5ce9e5fb33d847aa5a"} Apr 06 11:59:09 crc kubenswrapper[4790]: I0406 11:59:09.278287 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:59:09 crc kubenswrapper[4790]: I0406 11:59:09.286211 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 11:59:09 crc kubenswrapper[4790]: I0406 11:59:09.815799 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Apr 06 11:59:09 crc kubenswrapper[4790]: I0406 11:59:09.892091 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 11:59:09 crc kubenswrapper[4790]: I0406 11:59:09.994162 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44a07230-1da3-4f52-a189-73910be65825-kubelet-dir\") pod \"44a07230-1da3-4f52-a189-73910be65825\" (UID: \"44a07230-1da3-4f52-a189-73910be65825\") " Apr 06 11:59:09 crc kubenswrapper[4790]: I0406 11:59:09.994275 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44a07230-1da3-4f52-a189-73910be65825-kube-api-access\") pod \"44a07230-1da3-4f52-a189-73910be65825\" (UID: \"44a07230-1da3-4f52-a189-73910be65825\") " Apr 06 11:59:09 crc kubenswrapper[4790]: I0406 11:59:09.994821 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44a07230-1da3-4f52-a189-73910be65825-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "44a07230-1da3-4f52-a189-73910be65825" (UID: "44a07230-1da3-4f52-a189-73910be65825"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 11:59:10 crc kubenswrapper[4790]: I0406 11:59:10.001224 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a07230-1da3-4f52-a189-73910be65825-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "44a07230-1da3-4f52-a189-73910be65825" (UID: "44a07230-1da3-4f52-a189-73910be65825"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 11:59:10 crc kubenswrapper[4790]: I0406 11:59:10.099957 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44a07230-1da3-4f52-a189-73910be65825-kubelet-dir\") on node \"crc\" DevicePath \"\""
Apr 06 11:59:10 crc kubenswrapper[4790]: I0406 11:59:10.099988 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44a07230-1da3-4f52-a189-73910be65825-kube-api-access\") on node \"crc\" DevicePath \"\""
Apr 06 11:59:10 crc kubenswrapper[4790]: I0406 11:59:10.100150 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Apr 06 11:59:10 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld
Apr 06 11:59:10 crc kubenswrapper[4790]: [+]process-running ok
Apr 06 11:59:10 crc kubenswrapper[4790]: healthz check failed
Apr 06 11:59:10 crc kubenswrapper[4790]: I0406 11:59:10.100180 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 06 11:59:10 crc kubenswrapper[4790]: I0406 11:59:10.282018 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fpjz5"
Apr 06 11:59:10 crc kubenswrapper[4790]: I0406 11:59:10.290979 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Apr 06 11:59:10 crc kubenswrapper[4790]: I0406 11:59:10.293475 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"44a07230-1da3-4f52-a189-73910be65825","Type":"ContainerDied","Data":"4a3c3d2628dabca54fe5fad5f63688e156fefe09b430d9c1b688fc4d819e17d5"}
Apr 06 11:59:10 crc kubenswrapper[4790]: I0406 11:59:10.293534 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a3c3d2628dabca54fe5fad5f63688e156fefe09b430d9c1b688fc4d819e17d5"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.092117 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Apr 06 11:59:11 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld
Apr 06 11:59:11 crc kubenswrapper[4790]: [+]process-running ok
Apr 06 11:59:11 crc kubenswrapper[4790]: healthz check failed
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.092500 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.619857 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6db6cf4595-5zmct"]
Apr 06 11:59:11 crc kubenswrapper[4790]: E0406 11:59:11.620146 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf7adf1-0126-47d9-8ddf-e4d329eda260" containerName="pruner"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.620160 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf7adf1-0126-47d9-8ddf-e4d329eda260" containerName="pruner"
Apr 06 11:59:11 crc kubenswrapper[4790]: E0406 11:59:11.620177 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a07230-1da3-4f52-a189-73910be65825" containerName="pruner"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.620185 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a07230-1da3-4f52-a189-73910be65825" containerName="pruner"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.620312 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf7adf1-0126-47d9-8ddf-e4d329eda260" containerName="pruner"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.620325 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a07230-1da3-4f52-a189-73910be65825" containerName="pruner"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.620798 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.628952 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-trusted-ca-bundle\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.628998 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42775b02-6f50-4862-ae70-7cdb1800baa7-console-serving-cert\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.629019 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42775b02-6f50-4862-ae70-7cdb1800baa7-console-oauth-config\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.629045 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsmcb\" (UniqueName: \"kubernetes.io/projected/42775b02-6f50-4862-ae70-7cdb1800baa7-kube-api-access-qsmcb\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.629129 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-service-ca\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.629166 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-oauth-serving-cert\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.629189 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-console-config\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.639140 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6db6cf4595-5zmct"]
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.730257 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-oauth-serving-cert\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.730324 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-console-config\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.730366 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-trusted-ca-bundle\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.730389 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42775b02-6f50-4862-ae70-7cdb1800baa7-console-serving-cert\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.730412 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42775b02-6f50-4862-ae70-7cdb1800baa7-console-oauth-config\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.730434 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsmcb\" (UniqueName: \"kubernetes.io/projected/42775b02-6f50-4862-ae70-7cdb1800baa7-kube-api-access-qsmcb\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.730464 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-service-ca\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.731358 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-oauth-serving-cert\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.731632 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-service-ca\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.731935 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-trusted-ca-bundle\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.732483 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-console-config\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.751108 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsmcb\" (UniqueName: \"kubernetes.io/projected/42775b02-6f50-4862-ae70-7cdb1800baa7-kube-api-access-qsmcb\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.751719 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42775b02-6f50-4862-ae70-7cdb1800baa7-console-oauth-config\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.754652 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42775b02-6f50-4862-ae70-7cdb1800baa7-console-serving-cert\") pod \"console-6db6cf4595-5zmct\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") " pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:11 crc kubenswrapper[4790]: I0406 11:59:11.951308 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 11:59:12 crc kubenswrapper[4790]: I0406 11:59:12.095159 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Apr 06 11:59:12 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld
Apr 06 11:59:12 crc kubenswrapper[4790]: [+]process-running ok
Apr 06 11:59:12 crc kubenswrapper[4790]: healthz check failed
Apr 06 11:59:12 crc kubenswrapper[4790]: I0406 11:59:12.095231 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 06 11:59:13 crc kubenswrapper[4790]: I0406 11:59:13.094444 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Apr 06 11:59:13 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld
Apr 06 11:59:13 crc kubenswrapper[4790]: [+]process-running ok
Apr 06 11:59:13 crc kubenswrapper[4790]: healthz check failed
Apr 06 11:59:13 crc kubenswrapper[4790]: I0406 11:59:13.095052 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 06 11:59:14 crc kubenswrapper[4790]: I0406 11:59:14.092617 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Apr 06 11:59:14 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld
Apr 06 11:59:14 crc kubenswrapper[4790]: [+]process-running ok
Apr 06 11:59:14 crc kubenswrapper[4790]: healthz check failed
Apr 06 11:59:14 crc kubenswrapper[4790]: I0406 11:59:14.093817 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 06 11:59:14 crc kubenswrapper[4790]: I0406 11:59:14.315763 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-ql4f9"
Apr 06 11:59:14 crc kubenswrapper[4790]: I0406 11:59:14.680997 4790 patch_prober.go:28] interesting pod/console-f9d7485db-dkg9c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body=
Apr 06 11:59:14 crc kubenswrapper[4790]: I0406 11:59:14.681069 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dkg9c" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused"
Apr 06 11:59:15 crc kubenswrapper[4790]: I0406 11:59:15.092621 4790 patch_prober.go:28] interesting pod/router-default-5444994796-895qp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Apr 06 11:59:15 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld
Apr 06 11:59:15 crc kubenswrapper[4790]: [+]process-running ok
Apr 06 11:59:15 crc kubenswrapper[4790]: healthz check failed
Apr 06 11:59:15 crc kubenswrapper[4790]: I0406 11:59:15.092696 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-895qp" podUID="48539430-665f-4976-8ab6-7f06a26b9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 06 11:59:15 crc kubenswrapper[4790]: E0406 11:59:15.251673 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" cmd=["/bin/bash","-c","test -f /ready/ready"]
Apr 06 11:59:15 crc kubenswrapper[4790]: E0406 11:59:15.253630 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" cmd=["/bin/bash","-c","test -f /ready/ready"]
Apr 06 11:59:15 crc kubenswrapper[4790]: E0406 11:59:15.255419 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" cmd=["/bin/bash","-c","test -f /ready/ready"]
Apr 06 11:59:15 crc kubenswrapper[4790]: E0406 11:59:15.255500 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" podUID="e1aab57b-c8ed-492c-91d3-e190449713a0" containerName="kube-multus-additional-cni-plugins"
Apr 06 11:59:16 crc kubenswrapper[4790]: I0406 11:59:16.093812 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-895qp"
Apr 06 11:59:16 crc kubenswrapper[4790]: I0406 11:59:16.099334 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-895qp"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.000346 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67ff4c58b7-wktt2"]
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.000623 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" podUID="78a77943-529d-499f-bf5b-3c2dd7451089" containerName="controller-manager" containerID="cri-o://c2ff9f9afbaad6c12b6e19cba2b11a132bc9e9f3bee50e1b26007f368001d036" gracePeriod=30
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.051206 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn"]
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.051819 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" podUID="ee7ec9fd-a4c5-4a68-9750-708bb3fa876d" containerName="route-controller-manager" containerID="cri-o://aefb2d3634473472176473549965b9b8eff15ec9ba7adc8b00e32224cc1995f1" gracePeriod=30
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.099334 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-75kvr"]
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.531203 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-965d55b94-4rql6"]
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.531883 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.563762 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-965d55b94-4rql6"]
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.566244 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1aed6119-fbb5-4557-94ee-7c3e86fc3002-installation-pull-secrets\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.566310 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1aed6119-fbb5-4557-94ee-7c3e86fc3002-ca-trust-extracted\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.566402 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.566441 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1aed6119-fbb5-4557-94ee-7c3e86fc3002-trusted-ca\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.566464 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-bound-sa-token\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.566488 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmvtd\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-kube-api-access-jmvtd\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.566503 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-registry-tls\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.566535 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1aed6119-fbb5-4557-94ee-7c3e86fc3002-registry-certificates\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.611079 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.668229 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1aed6119-fbb5-4557-94ee-7c3e86fc3002-trusted-ca\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.668281 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-bound-sa-token\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.668307 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmvtd\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-kube-api-access-jmvtd\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.668329 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-registry-tls\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.668365 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1aed6119-fbb5-4557-94ee-7c3e86fc3002-registry-certificates\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.668399 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1aed6119-fbb5-4557-94ee-7c3e86fc3002-installation-pull-secrets\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.668423 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1aed6119-fbb5-4557-94ee-7c3e86fc3002-ca-trust-extracted\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.668810 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1aed6119-fbb5-4557-94ee-7c3e86fc3002-ca-trust-extracted\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.670422 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1aed6119-fbb5-4557-94ee-7c3e86fc3002-trusted-ca\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.675038 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1aed6119-fbb5-4557-94ee-7c3e86fc3002-registry-certificates\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.678348 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1aed6119-fbb5-4557-94ee-7c3e86fc3002-installation-pull-secrets\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.689623 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-bound-sa-token\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.690100 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmvtd\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-kube-api-access-jmvtd\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.695577 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-registry-tls\") pod \"image-registry-965d55b94-4rql6\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:18 crc kubenswrapper[4790]: I0406 11:59:18.854394 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:19 crc kubenswrapper[4790]: I0406 11:59:19.393638 4790 generic.go:334] "Generic (PLEG): container finished" podID="78a77943-529d-499f-bf5b-3c2dd7451089" containerID="c2ff9f9afbaad6c12b6e19cba2b11a132bc9e9f3bee50e1b26007f368001d036" exitCode=0
Apr 06 11:59:19 crc kubenswrapper[4790]: I0406 11:59:19.393717 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" event={"ID":"78a77943-529d-499f-bf5b-3c2dd7451089","Type":"ContainerDied","Data":"c2ff9f9afbaad6c12b6e19cba2b11a132bc9e9f3bee50e1b26007f368001d036"}
Apr 06 11:59:19 crc kubenswrapper[4790]: I0406 11:59:19.401361 4790 generic.go:334] "Generic (PLEG): container finished" podID="ee7ec9fd-a4c5-4a68-9750-708bb3fa876d" containerID="aefb2d3634473472176473549965b9b8eff15ec9ba7adc8b00e32224cc1995f1" exitCode=0
Apr 06 11:59:19 crc kubenswrapper[4790]: I0406 11:59:19.401406 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" event={"ID":"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d","Type":"ContainerDied","Data":"aefb2d3634473472176473549965b9b8eff15ec9ba7adc8b00e32224cc1995f1"}
Apr 06 11:59:19 crc kubenswrapper[4790]: I0406 11:59:19.952106 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-l7zk6"]
Apr 06 11:59:19 crc kubenswrapper[4790]: I0406 11:59:19.959621 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6"
Apr 06 11:59:19 crc kubenswrapper[4790]: I0406 11:59:19.980315 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-686fc65c-fdzvb"]
Apr 06 11:59:19 crc kubenswrapper[4790]: I0406 11:59:19.990163 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:19 crc kubenswrapper[4790]: I0406 11:59:19.990198 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-686fc65c-fdzvb"]
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.129352 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f4ae293-64ab-4efd-a511-16c6e935a2fc-trusted-ca\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.129432 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.129463 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f4ae293-64ab-4efd-a511-16c6e935a2fc-installation-pull-secrets\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.129509 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f4ae293-64ab-4efd-a511-16c6e935a2fc-registry-certificates\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.129540 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xzfb\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-kube-api-access-6xzfb\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.129962 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-registry-tls\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.130101 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-bound-sa-token\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.130184 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f4ae293-64ab-4efd-a511-16c6e935a2fc-ca-trust-extracted\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.180074 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.235706 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f4ae293-64ab-4efd-a511-16c6e935a2fc-trusted-ca\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.231721 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f4ae293-64ab-4efd-a511-16c6e935a2fc-trusted-ca\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.235892 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f4ae293-64ab-4efd-a511-16c6e935a2fc-installation-pull-secrets\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.236916 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f4ae293-64ab-4efd-a511-16c6e935a2fc-registry-certificates\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.237015 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xzfb\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-kube-api-access-6xzfb\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.237183 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-registry-tls\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.237303 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-bound-sa-token\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.237354 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f4ae293-64ab-4efd-a511-16c6e935a2fc-ca-trust-extracted\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb"
Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.238128 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f4ae293-64ab-4efd-a511-16c6e935a2fc-ca-trust-extracted\") pod \"image-registry-686fc65c-fdzvb\"
(UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb" Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.238191 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f4ae293-64ab-4efd-a511-16c6e935a2fc-registry-certificates\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb" Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.242086 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-registry-tls\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb" Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.242117 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f4ae293-64ab-4efd-a511-16c6e935a2fc-installation-pull-secrets\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb" Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.265417 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xzfb\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-kube-api-access-6xzfb\") pod \"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb" Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.266777 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-bound-sa-token\") pod 
\"image-registry-686fc65c-fdzvb\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " pod="openshift-image-registry/image-registry-686fc65c-fdzvb" Apr 06 11:59:20 crc kubenswrapper[4790]: I0406 11:59:20.345922 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-686fc65c-fdzvb" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.529041 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.569353 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59fdf8df58-nhpm9"] Apr 06 11:59:21 crc kubenswrapper[4790]: E0406 11:59:21.569879 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a77943-529d-499f-bf5b-3c2dd7451089" containerName="controller-manager" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.569900 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="78a77943-529d-499f-bf5b-3c2dd7451089" containerName="controller-manager" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.570173 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="78a77943-529d-499f-bf5b-3c2dd7451089" containerName="controller-manager" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.571160 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.580386 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59fdf8df58-nhpm9"] Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.659803 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfg5c\" (UniqueName: \"kubernetes.io/projected/78a77943-529d-499f-bf5b-3c2dd7451089-kube-api-access-wfg5c\") pod \"78a77943-529d-499f-bf5b-3c2dd7451089\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.659996 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-config\") pod \"78a77943-529d-499f-bf5b-3c2dd7451089\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.660027 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-client-ca\") pod \"78a77943-529d-499f-bf5b-3c2dd7451089\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.660054 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-proxy-ca-bundles\") pod \"78a77943-529d-499f-bf5b-3c2dd7451089\" (UID: \"78a77943-529d-499f-bf5b-3c2dd7451089\") " Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.660321 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a77943-529d-499f-bf5b-3c2dd7451089-serving-cert\") pod \"78a77943-529d-499f-bf5b-3c2dd7451089\" (UID: 
\"78a77943-529d-499f-bf5b-3c2dd7451089\") " Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.660581 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-client-ca\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.660607 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d787187-b9bc-428b-9aa4-ff09276e19de-serving-cert\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.660638 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-config\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.660717 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-proxy-ca-bundles\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.660737 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6267c\" (UniqueName: 
\"kubernetes.io/projected/9d787187-b9bc-428b-9aa4-ff09276e19de-kube-api-access-6267c\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.661051 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-config" (OuterVolumeSpecName: "config") pod "78a77943-529d-499f-bf5b-3c2dd7451089" (UID: "78a77943-529d-499f-bf5b-3c2dd7451089"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.661428 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "78a77943-529d-499f-bf5b-3c2dd7451089" (UID: "78a77943-529d-499f-bf5b-3c2dd7451089"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.661505 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-client-ca" (OuterVolumeSpecName: "client-ca") pod "78a77943-529d-499f-bf5b-3c2dd7451089" (UID: "78a77943-529d-499f-bf5b-3c2dd7451089"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.667467 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78a77943-529d-499f-bf5b-3c2dd7451089-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "78a77943-529d-499f-bf5b-3c2dd7451089" (UID: "78a77943-529d-499f-bf5b-3c2dd7451089"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.667626 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78a77943-529d-499f-bf5b-3c2dd7451089-kube-api-access-wfg5c" (OuterVolumeSpecName: "kube-api-access-wfg5c") pod "78a77943-529d-499f-bf5b-3c2dd7451089" (UID: "78a77943-529d-499f-bf5b-3c2dd7451089"). InnerVolumeSpecName "kube-api-access-wfg5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.763193 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6267c\" (UniqueName: \"kubernetes.io/projected/9d787187-b9bc-428b-9aa4-ff09276e19de-kube-api-access-6267c\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.763626 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-proxy-ca-bundles\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.764888 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-proxy-ca-bundles\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.764983 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-client-ca\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.766387 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-client-ca\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.766434 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d787187-b9bc-428b-9aa4-ff09276e19de-serving-cert\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.766503 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-config\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.768316 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-config\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.768861 4790 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/78a77943-529d-499f-bf5b-3c2dd7451089-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.768875 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfg5c\" (UniqueName: \"kubernetes.io/projected/78a77943-529d-499f-bf5b-3c2dd7451089-kube-api-access-wfg5c\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.768886 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.768894 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-client-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.768903 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78a77943-529d-499f-bf5b-3c2dd7451089-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.774577 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d787187-b9bc-428b-9aa4-ff09276e19de-serving-cert\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.779743 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6267c\" (UniqueName: \"kubernetes.io/projected/9d787187-b9bc-428b-9aa4-ff09276e19de-kube-api-access-6267c\") pod \"controller-manager-59fdf8df58-nhpm9\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " 
pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.897134 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6db6cf4595-5zmct"] Apr 06 11:59:21 crc kubenswrapper[4790]: I0406 11:59:21.906510 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:22 crc kubenswrapper[4790]: I0406 11:59:22.423223 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" event={"ID":"78a77943-529d-499f-bf5b-3c2dd7451089","Type":"ContainerDied","Data":"06667daa053be4288a05ebfb2a5b9e61116509d26981eca9316f5a84185689e5"} Apr 06 11:59:22 crc kubenswrapper[4790]: I0406 11:59:22.423274 4790 scope.go:117] "RemoveContainer" containerID="c2ff9f9afbaad6c12b6e19cba2b11a132bc9e9f3bee50e1b26007f368001d036" Apr 06 11:59:22 crc kubenswrapper[4790]: I0406 11:59:22.423374 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-67ff4c58b7-wktt2" Apr 06 11:59:22 crc kubenswrapper[4790]: I0406 11:59:22.444739 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67ff4c58b7-wktt2"] Apr 06 11:59:22 crc kubenswrapper[4790]: I0406 11:59:22.448424 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-67ff4c58b7-wktt2"] Apr 06 11:59:23 crc kubenswrapper[4790]: I0406 11:59:23.026594 4790 patch_prober.go:28] interesting pod/route-controller-manager-648487d594-ktnqn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 06 11:59:23 crc kubenswrapper[4790]: I0406 11:59:23.026710 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" podUID="ee7ec9fd-a4c5-4a68-9750-708bb3fa876d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 06 11:59:23 crc kubenswrapper[4790]: I0406 11:59:23.686611 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78a77943-529d-499f-bf5b-3c2dd7451089" path="/var/lib/kubelet/pods/78a77943-529d-499f-bf5b-3c2dd7451089/volumes" Apr 06 11:59:23 crc kubenswrapper[4790]: I0406 11:59:23.694639 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Apr 06 11:59:23 crc kubenswrapper[4790]: I0406 11:59:23.796437 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-10-crc"] Apr 06 11:59:23 
crc kubenswrapper[4790]: I0406 11:59:23.798162 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 06 11:59:23 crc kubenswrapper[4790]: I0406 11:59:23.803151 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Apr 06 11:59:23 crc kubenswrapper[4790]: I0406 11:59:23.806052 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Apr 06 11:59:23 crc kubenswrapper[4790]: I0406 11:59:23.809923 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-10-crc"] Apr 06 11:59:23 crc kubenswrapper[4790]: I0406 11:59:23.829403 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=0.829377578 podStartE2EDuration="829.377578ms" podCreationTimestamp="2026-04-06 11:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:23.823321818 +0000 UTC m=+142.811064684" watchObservedRunningTime="2026-04-06 11:59:23.829377578 +0000 UTC m=+142.817120444" Apr 06 11:59:23 crc kubenswrapper[4790]: I0406 11:59:23.911316 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed446ab9-6e6e-4012-a957-c7326a21ef09-kubelet-dir\") pod \"revision-pruner-10-crc\" (UID: \"ed446ab9-6e6e-4012-a957-c7326a21ef09\") " pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 06 11:59:23 crc kubenswrapper[4790]: I0406 11:59:23.911671 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/ed446ab9-6e6e-4012-a957-c7326a21ef09-kube-api-access\") pod \"revision-pruner-10-crc\" (UID: \"ed446ab9-6e6e-4012-a957-c7326a21ef09\") " pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 06 11:59:24 crc kubenswrapper[4790]: I0406 11:59:24.014620 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed446ab9-6e6e-4012-a957-c7326a21ef09-kube-api-access\") pod \"revision-pruner-10-crc\" (UID: \"ed446ab9-6e6e-4012-a957-c7326a21ef09\") " pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 06 11:59:24 crc kubenswrapper[4790]: I0406 11:59:24.014815 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed446ab9-6e6e-4012-a957-c7326a21ef09-kubelet-dir\") pod \"revision-pruner-10-crc\" (UID: \"ed446ab9-6e6e-4012-a957-c7326a21ef09\") " pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 06 11:59:24 crc kubenswrapper[4790]: I0406 11:59:24.014991 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed446ab9-6e6e-4012-a957-c7326a21ef09-kubelet-dir\") pod \"revision-pruner-10-crc\" (UID: \"ed446ab9-6e6e-4012-a957-c7326a21ef09\") " pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 06 11:59:24 crc kubenswrapper[4790]: I0406 11:59:24.048551 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed446ab9-6e6e-4012-a957-c7326a21ef09-kube-api-access\") pod \"revision-pruner-10-crc\" (UID: \"ed446ab9-6e6e-4012-a957-c7326a21ef09\") " pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 06 11:59:24 crc kubenswrapper[4790]: I0406 11:59:24.119260 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 06 11:59:24 crc kubenswrapper[4790]: I0406 11:59:24.686240 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:59:24 crc kubenswrapper[4790]: I0406 11:59:24.689549 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 11:59:25 crc kubenswrapper[4790]: E0406 11:59:25.244280 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 11:59:25 crc kubenswrapper[4790]: E0406 11:59:25.245702 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 11:59:25 crc kubenswrapper[4790]: E0406 11:59:25.247258 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 11:59:25 crc kubenswrapper[4790]: E0406 11:59:25.247290 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" podUID="e1aab57b-c8ed-492c-91d3-e190449713a0" 
containerName="kube-multus-additional-cni-plugins" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.318884 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.366887 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf"] Apr 06 11:59:26 crc kubenswrapper[4790]: E0406 11:59:26.367113 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7ec9fd-a4c5-4a68-9750-708bb3fa876d" containerName="route-controller-manager" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.367124 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7ec9fd-a4c5-4a68-9750-708bb3fa876d" containerName="route-controller-manager" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.367218 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee7ec9fd-a4c5-4a68-9750-708bb3fa876d" containerName="route-controller-manager" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.367565 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.376336 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf"] Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.404715 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-10-crc"] Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.405875 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-10-crc" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.407591 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-10-crc"] Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.460940 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-serving-cert\") pod \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.460975 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcvq4\" (UniqueName: \"kubernetes.io/projected/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-kube-api-access-wcvq4\") pod \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.461018 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-client-ca\") pod \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.461055 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-config\") pod \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\" (UID: \"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d\") " Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.461204 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65389934-50d6-49c5-9fdd-c761a0e16733-client-ca\") pod \"route-controller-manager-6ff6c875d7-lrkgf\" (UID: 
\"65389934-50d6-49c5-9fdd-c761a0e16733\") " pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.461225 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgfnm\" (UniqueName: \"kubernetes.io/projected/65389934-50d6-49c5-9fdd-c761a0e16733-kube-api-access-wgfnm\") pod \"route-controller-manager-6ff6c875d7-lrkgf\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.461245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65389934-50d6-49c5-9fdd-c761a0e16733-config\") pod \"route-controller-manager-6ff6c875d7-lrkgf\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.461287 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65389934-50d6-49c5-9fdd-c761a0e16733-serving-cert\") pod \"route-controller-manager-6ff6c875d7-lrkgf\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.462525 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-config" (OuterVolumeSpecName: "config") pod "ee7ec9fd-a4c5-4a68-9750-708bb3fa876d" (UID: "ee7ec9fd-a4c5-4a68-9750-708bb3fa876d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.463282 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-client-ca" (OuterVolumeSpecName: "client-ca") pod "ee7ec9fd-a4c5-4a68-9750-708bb3fa876d" (UID: "ee7ec9fd-a4c5-4a68-9750-708bb3fa876d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.467914 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ee7ec9fd-a4c5-4a68-9750-708bb3fa876d" (UID: "ee7ec9fd-a4c5-4a68-9750-708bb3fa876d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.472000 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db6cf4595-5zmct" event={"ID":"42775b02-6f50-4862-ae70-7cdb1800baa7","Type":"ContainerStarted","Data":"77c17226fb6e866918ae83a6fe99966b7b948bb7bb891e2749b73bb09f2a63be"} Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.473763 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-kube-api-access-wcvq4" (OuterVolumeSpecName: "kube-api-access-wcvq4") pod "ee7ec9fd-a4c5-4a68-9750-708bb3fa876d" (UID: "ee7ec9fd-a4c5-4a68-9750-708bb3fa876d"). InnerVolumeSpecName "kube-api-access-wcvq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.475709 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" event={"ID":"ee7ec9fd-a4c5-4a68-9750-708bb3fa876d","Type":"ContainerDied","Data":"3527da482926eda6bf451666c3e4d03e56309626afb7d262ff32acc3d421427a"} Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.475844 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.528896 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn"] Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.532722 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-648487d594-ktnqn"] Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.562985 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgfnm\" (UniqueName: \"kubernetes.io/projected/65389934-50d6-49c5-9fdd-c761a0e16733-kube-api-access-wgfnm\") pod \"route-controller-manager-6ff6c875d7-lrkgf\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.563065 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65389934-50d6-49c5-9fdd-c761a0e16733-config\") pod \"route-controller-manager-6ff6c875d7-lrkgf\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.563103 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ed2aa0a-1e35-4436-b184-a680dcfd14df-kube-api-access\") pod \"installer-10-crc\" (UID: \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.563136 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ed2aa0a-1e35-4436-b184-a680dcfd14df-kubelet-dir\") pod \"installer-10-crc\" (UID: \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.563174 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65389934-50d6-49c5-9fdd-c761a0e16733-serving-cert\") pod \"route-controller-manager-6ff6c875d7-lrkgf\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.563257 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65389934-50d6-49c5-9fdd-c761a0e16733-client-ca\") pod \"route-controller-manager-6ff6c875d7-lrkgf\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.563281 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5ed2aa0a-1e35-4436-b184-a680dcfd14df-var-lock\") pod \"installer-10-crc\" (UID: \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\") " 
pod="openshift-kube-controller-manager/installer-10-crc" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.563326 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.563341 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcvq4\" (UniqueName: \"kubernetes.io/projected/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-kube-api-access-wcvq4\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.563356 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-client-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.563364 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.565093 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65389934-50d6-49c5-9fdd-c761a0e16733-client-ca\") pod \"route-controller-manager-6ff6c875d7-lrkgf\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.565427 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65389934-50d6-49c5-9fdd-c761a0e16733-config\") pod \"route-controller-manager-6ff6c875d7-lrkgf\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 
11:59:26.583437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65389934-50d6-49c5-9fdd-c761a0e16733-serving-cert\") pod \"route-controller-manager-6ff6c875d7-lrkgf\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.587289 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgfnm\" (UniqueName: \"kubernetes.io/projected/65389934-50d6-49c5-9fdd-c761a0e16733-kube-api-access-wgfnm\") pod \"route-controller-manager-6ff6c875d7-lrkgf\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.665547 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5ed2aa0a-1e35-4436-b184-a680dcfd14df-var-lock\") pod \"installer-10-crc\" (UID: \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.665620 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ed2aa0a-1e35-4436-b184-a680dcfd14df-kube-api-access\") pod \"installer-10-crc\" (UID: \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.665654 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ed2aa0a-1e35-4436-b184-a680dcfd14df-kubelet-dir\") pod \"installer-10-crc\" (UID: \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 06 11:59:26 crc 
kubenswrapper[4790]: I0406 11:59:26.665763 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ed2aa0a-1e35-4436-b184-a680dcfd14df-kubelet-dir\") pod \"installer-10-crc\" (UID: \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.665751 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5ed2aa0a-1e35-4436-b184-a680dcfd14df-var-lock\") pod \"installer-10-crc\" (UID: \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.682379 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ed2aa0a-1e35-4436-b184-a680dcfd14df-kube-api-access\") pod \"installer-10-crc\" (UID: \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\") " pod="openshift-kube-controller-manager/installer-10-crc" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.695658 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:26 crc kubenswrapper[4790]: I0406 11:59:26.730939 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-10-crc" Apr 06 11:59:27 crc kubenswrapper[4790]: I0406 11:59:27.684174 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee7ec9fd-a4c5-4a68-9750-708bb3fa876d" path="/var/lib/kubelet/pods/ee7ec9fd-a4c5-4a68-9750-708bb3fa876d/volumes" Apr 06 11:59:28 crc kubenswrapper[4790]: I0406 11:59:28.723567 4790 ???:1] "http: TLS handshake error from 192.168.126.11:58130: no serving certificate available for the kubelet" Apr 06 11:59:31 crc kubenswrapper[4790]: I0406 11:59:31.508021 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lklpx_e1aab57b-c8ed-492c-91d3-e190449713a0/kube-multus-additional-cni-plugins/0.log" Apr 06 11:59:31 crc kubenswrapper[4790]: I0406 11:59:31.508703 4790 generic.go:334] "Generic (PLEG): container finished" podID="e1aab57b-c8ed-492c-91d3-e190449713a0" containerID="93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" exitCode=137 Apr 06 11:59:31 crc kubenswrapper[4790]: I0406 11:59:31.508739 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" event={"ID":"e1aab57b-c8ed-492c-91d3-e190449713a0","Type":"ContainerDied","Data":"93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0"} Apr 06 11:59:32 crc kubenswrapper[4790]: I0406 11:59:32.710140 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Apr 06 11:59:34 crc kubenswrapper[4790]: I0406 11:59:34.322727 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5d75s" Apr 06 11:59:34 crc kubenswrapper[4790]: I0406 11:59:34.354375 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=2.354356187 podStartE2EDuration="2.354356187s" podCreationTimestamp="2026-04-06 11:59:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:34.350235548 +0000 UTC m=+153.337978414" watchObservedRunningTime="2026-04-06 11:59:34.354356187 +0000 UTC m=+153.342099053" Apr 06 11:59:35 crc kubenswrapper[4790]: E0406 11:59:35.241689 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0 is running failed: container process not found" containerID="93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 11:59:35 crc kubenswrapper[4790]: E0406 11:59:35.242101 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0 is running failed: container process not found" containerID="93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 11:59:35 crc kubenswrapper[4790]: E0406 11:59:35.242529 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0 is running failed: container process not found" containerID="93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 11:59:35 crc kubenswrapper[4790]: E0406 11:59:35.242569 4790 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0 is running failed: container process not found" probeType="Readiness" 
pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" podUID="e1aab57b-c8ed-492c-91d3-e190449713a0" containerName="kube-multus-additional-cni-plugins" Apr 06 11:59:35 crc kubenswrapper[4790]: I0406 11:59:35.748159 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-7-crc"] Apr 06 11:59:35 crc kubenswrapper[4790]: I0406 11:59:35.748808 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-7-crc" Apr 06 11:59:35 crc kubenswrapper[4790]: I0406 11:59:35.751909 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-5vhrm" Apr 06 11:59:35 crc kubenswrapper[4790]: I0406 11:59:35.752676 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Apr 06 11:59:35 crc kubenswrapper[4790]: I0406 11:59:35.764964 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-7-crc"] Apr 06 11:59:35 crc kubenswrapper[4790]: I0406 11:59:35.921073 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e458c724-3942-4d11-80fb-e42973fb2b28-kubelet-dir\") pod \"installer-7-crc\" (UID: \"e458c724-3942-4d11-80fb-e42973fb2b28\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 06 11:59:35 crc kubenswrapper[4790]: I0406 11:59:35.921154 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e458c724-3942-4d11-80fb-e42973fb2b28-var-lock\") pod \"installer-7-crc\" (UID: \"e458c724-3942-4d11-80fb-e42973fb2b28\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 06 11:59:35 crc kubenswrapper[4790]: I0406 11:59:35.921203 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/e458c724-3942-4d11-80fb-e42973fb2b28-kube-api-access\") pod \"installer-7-crc\" (UID: \"e458c724-3942-4d11-80fb-e42973fb2b28\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 06 11:59:36 crc kubenswrapper[4790]: I0406 11:59:36.023188 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e458c724-3942-4d11-80fb-e42973fb2b28-kubelet-dir\") pod \"installer-7-crc\" (UID: \"e458c724-3942-4d11-80fb-e42973fb2b28\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 06 11:59:36 crc kubenswrapper[4790]: I0406 11:59:36.023303 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e458c724-3942-4d11-80fb-e42973fb2b28-var-lock\") pod \"installer-7-crc\" (UID: \"e458c724-3942-4d11-80fb-e42973fb2b28\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 06 11:59:36 crc kubenswrapper[4790]: I0406 11:59:36.023333 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e458c724-3942-4d11-80fb-e42973fb2b28-kube-api-access\") pod \"installer-7-crc\" (UID: \"e458c724-3942-4d11-80fb-e42973fb2b28\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 06 11:59:36 crc kubenswrapper[4790]: I0406 11:59:36.023963 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e458c724-3942-4d11-80fb-e42973fb2b28-kubelet-dir\") pod \"installer-7-crc\" (UID: \"e458c724-3942-4d11-80fb-e42973fb2b28\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 06 11:59:36 crc kubenswrapper[4790]: I0406 11:59:36.024015 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e458c724-3942-4d11-80fb-e42973fb2b28-var-lock\") pod \"installer-7-crc\" (UID: 
\"e458c724-3942-4d11-80fb-e42973fb2b28\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 06 11:59:36 crc kubenswrapper[4790]: I0406 11:59:36.046515 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e458c724-3942-4d11-80fb-e42973fb2b28-kube-api-access\") pod \"installer-7-crc\" (UID: \"e458c724-3942-4d11-80fb-e42973fb2b28\") " pod="openshift-kube-scheduler/installer-7-crc" Apr 06 11:59:36 crc kubenswrapper[4790]: I0406 11:59:36.076523 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-7-crc" Apr 06 11:59:37 crc kubenswrapper[4790]: I0406 11:59:37.834530 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Apr 06 11:59:37 crc kubenswrapper[4790]: I0406 11:59:37.835376 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 06 11:59:37 crc kubenswrapper[4790]: I0406 11:59:37.838241 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Apr 06 11:59:37 crc kubenswrapper[4790]: I0406 11:59:37.838299 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Apr 06 11:59:37 crc kubenswrapper[4790]: I0406 11:59:37.841395 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Apr 06 11:59:37 crc kubenswrapper[4790]: E0406 11:59:37.847782 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Apr 06 11:59:37 crc kubenswrapper[4790]: E0406 11:59:37.848198 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5ndv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-42rgx_openshift-marketplace(a30678d8-35eb-4863-a856-096864c2a9b1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Apr 06 11:59:37 crc kubenswrapper[4790]: E0406 11:59:37.851124 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-42rgx" podUID="a30678d8-35eb-4863-a856-096864c2a9b1" Apr 06 11:59:37 crc kubenswrapper[4790]: I0406 11:59:37.949575 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/305eb146-0a27-465e-955c-1c959b522d07-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"305eb146-0a27-465e-955c-1c959b522d07\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 06 11:59:37 crc kubenswrapper[4790]: I0406 11:59:37.949647 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/305eb146-0a27-465e-955c-1c959b522d07-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"305eb146-0a27-465e-955c-1c959b522d07\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 06 11:59:37 crc kubenswrapper[4790]: I0406 11:59:37.968471 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59fdf8df58-nhpm9"] Apr 06 11:59:38 crc kubenswrapper[4790]: I0406 11:59:38.056353 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/305eb146-0a27-465e-955c-1c959b522d07-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"305eb146-0a27-465e-955c-1c959b522d07\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 06 11:59:38 crc kubenswrapper[4790]: I0406 11:59:38.056557 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/305eb146-0a27-465e-955c-1c959b522d07-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"305eb146-0a27-465e-955c-1c959b522d07\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 06 11:59:38 crc 
kubenswrapper[4790]: I0406 11:59:38.057060 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/305eb146-0a27-465e-955c-1c959b522d07-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"305eb146-0a27-465e-955c-1c959b522d07\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 06 11:59:38 crc kubenswrapper[4790]: I0406 11:59:38.061691 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf"] Apr 06 11:59:38 crc kubenswrapper[4790]: I0406 11:59:38.080289 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/305eb146-0a27-465e-955c-1c959b522d07-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"305eb146-0a27-465e-955c-1c959b522d07\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 06 11:59:38 crc kubenswrapper[4790]: I0406 11:59:38.159224 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 06 11:59:39 crc kubenswrapper[4790]: E0406 11:59:39.975899 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-42rgx" podUID="a30678d8-35eb-4863-a856-096864c2a9b1" Apr 06 11:59:39 crc kubenswrapper[4790]: I0406 11:59:39.988420 4790 scope.go:117] "RemoveContainer" containerID="aefb2d3634473472176473549965b9b8eff15ec9ba7adc8b00e32224cc1995f1" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.046596 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lklpx_e1aab57b-c8ed-492c-91d3-e190449713a0/kube-multus-additional-cni-plugins/0.log" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.046660 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.086037 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.086183 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-spzc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-klccw_openshift-marketplace(cc03077c-04a5-4562-b371-78270cf891ac): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.087338 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-klccw" podUID="cc03077c-04a5-4562-b371-78270cf891ac" Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.121761 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.122567 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zdm8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-msclk_openshift-marketplace(b9685a39-63cf-47d3-b5fe-9113d55676d4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.123765 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-msclk" podUID="b9685a39-63cf-47d3-b5fe-9113d55676d4" Apr 06 11:59:40 crc 
kubenswrapper[4790]: I0406 11:59:40.190795 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9cpp\" (UniqueName: \"kubernetes.io/projected/e1aab57b-c8ed-492c-91d3-e190449713a0-kube-api-access-b9cpp\") pod \"e1aab57b-c8ed-492c-91d3-e190449713a0\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.192459 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e1aab57b-c8ed-492c-91d3-e190449713a0-tuning-conf-dir\") pod \"e1aab57b-c8ed-492c-91d3-e190449713a0\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.192506 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/e1aab57b-c8ed-492c-91d3-e190449713a0-ready\") pod \"e1aab57b-c8ed-492c-91d3-e190449713a0\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.192534 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e1aab57b-c8ed-492c-91d3-e190449713a0-cni-sysctl-allowlist\") pod \"e1aab57b-c8ed-492c-91d3-e190449713a0\" (UID: \"e1aab57b-c8ed-492c-91d3-e190449713a0\") " Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.193614 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1aab57b-c8ed-492c-91d3-e190449713a0-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "e1aab57b-c8ed-492c-91d3-e190449713a0" (UID: "e1aab57b-c8ed-492c-91d3-e190449713a0"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.194586 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1aab57b-c8ed-492c-91d3-e190449713a0-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "e1aab57b-c8ed-492c-91d3-e190449713a0" (UID: "e1aab57b-c8ed-492c-91d3-e190449713a0"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.194879 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1aab57b-c8ed-492c-91d3-e190449713a0-ready" (OuterVolumeSpecName: "ready") pod "e1aab57b-c8ed-492c-91d3-e190449713a0" (UID: "e1aab57b-c8ed-492c-91d3-e190449713a0"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.202815 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1aab57b-c8ed-492c-91d3-e190449713a0-kube-api-access-b9cpp" (OuterVolumeSpecName: "kube-api-access-b9cpp") pod "e1aab57b-c8ed-492c-91d3-e190449713a0" (UID: "e1aab57b-c8ed-492c-91d3-e190449713a0"). InnerVolumeSpecName "kube-api-access-b9cpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.211703 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.211820 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn94d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},
RestartPolicy:nil,} start failed in pod certified-operators-2hfrm_openshift-marketplace(c176f8e8-2902-4da6-b779-d0426b68e715): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.213257 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-2hfrm" podUID="c176f8e8-2902-4da6-b779-d0426b68e715" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.243436 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-686fc65c-fdzvb"] Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.281697 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.282327 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pcwbn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-96trx_openshift-marketplace(311eb251-79b6-4e1e-a3a7-456322ca133e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.283206 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-965d55b94-4rql6"] Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.283502 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image 
from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-96trx" podUID="311eb251-79b6-4e1e-a3a7-456322ca133e" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.294253 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9cpp\" (UniqueName: \"kubernetes.io/projected/e1aab57b-c8ed-492c-91d3-e190449713a0-kube-api-access-b9cpp\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.294306 4790 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e1aab57b-c8ed-492c-91d3-e190449713a0-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.294315 4790 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/e1aab57b-c8ed-492c-91d3-e190449713a0-ready\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.294325 4790 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e1aab57b-c8ed-492c-91d3-e190449713a0-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.588348 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lklpx_e1aab57b-c8ed-492c-91d3-e190449713a0/kube-multus-additional-cni-plugins/0.log" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.589057 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" event={"ID":"e1aab57b-c8ed-492c-91d3-e190449713a0","Type":"ContainerDied","Data":"3d2db21fe60acc0e96bf589a8acccac243748b164ea844ea52f96c04f04d7191"} Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.589194 4790 scope.go:117] "RemoveContainer" containerID="93a66c638ba2914f94bf62a0ddea1824e9a05dbdfcb48e817b621991a52a33b0" Apr 06 
11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.589622 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lklpx" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.625704 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lklpx"] Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.625923 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db6cf4595-5zmct" event={"ID":"42775b02-6f50-4862-ae70-7cdb1800baa7","Type":"ContainerStarted","Data":"b13df9e73e6137a2a5fb643678883a1dd14a11e203af679e2e49c2e630a571ab"} Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.629951 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lklpx"] Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.639236 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-686fc65c-fdzvb" event={"ID":"8f4ae293-64ab-4efd-a511-16c6e935a2fc","Type":"ContainerStarted","Data":"e44516814485d4a712309e05ae4d511828c150b6842baa3557cf3d93c9009306"} Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.639283 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-686fc65c-fdzvb" event={"ID":"8f4ae293-64ab-4efd-a511-16c6e935a2fc","Type":"ContainerStarted","Data":"c847d7e17ecff88945e357162cfdbf46664ecf3a0b4229952c5e99a29aaab611"} Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.639748 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-686fc65c-fdzvb" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.653635 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mfnk" 
event={"ID":"25067a9b-e553-4a7a-abdb-226567079c15","Type":"ContainerStarted","Data":"595bda2b58439a6b5b3f0511c009686c8b380737729102ea18750582b16a291d"} Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.666393 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr22k" event={"ID":"4aa8f195-5c23-43d2-abba-e4c825b4450e","Type":"ContainerStarted","Data":"a1d02f2b3fc50e988c01ec065baa3761ee7e08cc5a46bd137586e13d05ce2d23"} Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.668192 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlwwf" event={"ID":"c4be7580-8cec-4726-940d-36fb8575b791","Type":"ContainerStarted","Data":"84fe3b3b02c7d7abe3c7ea943338fc94098471231f7267c7d4ebe02486b34d2b"} Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.672663 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6db6cf4595-5zmct" podStartSLOduration=29.672644262 podStartE2EDuration="29.672644262s" podCreationTimestamp="2026-04-06 11:59:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:40.666058538 +0000 UTC m=+159.653801424" watchObservedRunningTime="2026-04-06 11:59:40.672644262 +0000 UTC m=+159.660387128" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.684230 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-965d55b94-4rql6" event={"ID":"1aed6119-fbb5-4557-94ee-7c3e86fc3002","Type":"ContainerStarted","Data":"d499341a229d56bd42e554a1261dc97a0d5a86619d930a4afb08a86000e5cd5f"} Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.684499 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-965d55b94-4rql6" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.684613 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-image-registry/image-registry-965d55b94-4rql6" event={"ID":"1aed6119-fbb5-4557-94ee-7c3e86fc3002","Type":"ContainerStarted","Data":"a8fcb01c12d9b7c8b5fb5f5259c1b25f9b6afdd1d9349e0d5df4465bc99b37d3"} Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.686118 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-msclk" podUID="b9685a39-63cf-47d3-b5fe-9113d55676d4" Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.686540 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-klccw" podUID="cc03077c-04a5-4562-b371-78270cf891ac" Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.686952 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-2hfrm" podUID="c176f8e8-2902-4da6-b779-d0426b68e715" Apr 06 11:59:40 crc kubenswrapper[4790]: E0406 11:59:40.687087 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-96trx" podUID="311eb251-79b6-4e1e-a3a7-456322ca133e" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.708167 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-10-crc"] Apr 06 11:59:40 
crc kubenswrapper[4790]: I0406 11:59:40.708579 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59fdf8df58-nhpm9"] Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.756660 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-686fc65c-fdzvb" podStartSLOduration=21.756640642 podStartE2EDuration="21.756640642s" podCreationTimestamp="2026-04-06 11:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:40.751711022 +0000 UTC m=+159.739453898" watchObservedRunningTime="2026-04-06 11:59:40.756640642 +0000 UTC m=+159.744383508" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.804463 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-965d55b94-4rql6" podStartSLOduration=22.804445686 podStartE2EDuration="22.804445686s" podCreationTimestamp="2026-04-06 11:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:40.801003175 +0000 UTC m=+159.788746061" watchObservedRunningTime="2026-04-06 11:59:40.804445686 +0000 UTC m=+159.792188552" Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.816397 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf"] Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.827235 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-7-crc"] Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.829238 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-10-crc"] Apr 06 11:59:40 crc kubenswrapper[4790]: I0406 11:59:40.880956 4790 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.392093 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-10-crc"] Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.684214 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1aab57b-c8ed-492c-91d3-e190449713a0" path="/var/lib/kubelet/pods/e1aab57b-c8ed-492c-91d3-e190449713a0/volumes" Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.684987 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.692040 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"305eb146-0a27-465e-955c-1c959b522d07","Type":"ContainerStarted","Data":"7f435927732c29ac92d931ce9d7da2582e2b97ba5e47d1305240783e3292cc28"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.692209 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"305eb146-0a27-465e-955c-1c959b522d07","Type":"ContainerStarted","Data":"60ec2549078f1ad7144052423422e40101daaba863e25fbf9cd5555e466488f5"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.694107 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-10-crc" event={"ID":"ed446ab9-6e6e-4012-a957-c7326a21ef09","Type":"ContainerStarted","Data":"6a814c23422e8487fdb5689c38355ee7cf334bb76c48e2c53b58a74c92bc35fd"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.694203 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-10-crc" 
event={"ID":"ed446ab9-6e6e-4012-a957-c7326a21ef09","Type":"ContainerStarted","Data":"6b724a3905d99152a98aed1215635e4d2ad646fa6283a74ba922c92b7aba4a13"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.695765 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-crc" event={"ID":"e458c724-3942-4d11-80fb-e42973fb2b28","Type":"ContainerStarted","Data":"132764e02cc0c9e76e96cf7b32af3d88039f53361c8d69592efaedf29a5b3eb3"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.696151 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-crc" event={"ID":"e458c724-3942-4d11-80fb-e42973fb2b28","Type":"ContainerStarted","Data":"77cf7bb560bc98265a0c61f9bad5660afd5e8aebbbaebdc73b0701bd43354835"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.697352 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-10-crc" event={"ID":"5ed2aa0a-1e35-4436-b184-a680dcfd14df","Type":"ContainerStarted","Data":"246286413f6b7571200bf664027f62be790ab64bf5259adbfe65f009630c0538"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.697384 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-10-crc" event={"ID":"5ed2aa0a-1e35-4436-b184-a680dcfd14df","Type":"ContainerStarted","Data":"0e47180f70daa4179ab73b3c5104c73853df89da894829db9df44a5a5198a67b"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.699703 4790 generic.go:334] "Generic (PLEG): container finished" podID="4aa8f195-5c23-43d2-abba-e4c825b4450e" containerID="a1d02f2b3fc50e988c01ec065baa3761ee7e08cc5a46bd137586e13d05ce2d23" exitCode=0 Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.699801 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr22k" 
event={"ID":"4aa8f195-5c23-43d2-abba-e4c825b4450e","Type":"ContainerDied","Data":"a1d02f2b3fc50e988c01ec065baa3761ee7e08cc5a46bd137586e13d05ce2d23"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.702705 4790 generic.go:334] "Generic (PLEG): container finished" podID="c4be7580-8cec-4726-940d-36fb8575b791" containerID="84fe3b3b02c7d7abe3c7ea943338fc94098471231f7267c7d4ebe02486b34d2b" exitCode=0 Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.704163 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlwwf" event={"ID":"c4be7580-8cec-4726-940d-36fb8575b791","Type":"ContainerDied","Data":"84fe3b3b02c7d7abe3c7ea943338fc94098471231f7267c7d4ebe02486b34d2b"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.710045 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" event={"ID":"65389934-50d6-49c5-9fdd-c761a0e16733","Type":"ContainerStarted","Data":"190d205c2cdb8643a0bd6a42fe73690457864036307fafb1f521faf58d33b2a6"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.710085 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" event={"ID":"65389934-50d6-49c5-9fdd-c761a0e16733","Type":"ContainerStarted","Data":"2f2c74cbbfd2d0b5454490b4adc90d9ed50496aecb997fbaa68b0bcfdd09179c"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.710173 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" podUID="65389934-50d6-49c5-9fdd-c761a0e16733" containerName="route-controller-manager" containerID="cri-o://190d205c2cdb8643a0bd6a42fe73690457864036307fafb1f521faf58d33b2a6" gracePeriod=30 Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.710515 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.714722 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" event={"ID":"9d787187-b9bc-428b-9aa4-ff09276e19de","Type":"ContainerStarted","Data":"2b15a14e0cb3b2867ba6df98695456ee47b328aefbe5883bd0f255becf8bd49e"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.714762 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" event={"ID":"9d787187-b9bc-428b-9aa4-ff09276e19de","Type":"ContainerStarted","Data":"27d187bdd5130328741a7c569632c9ea4e2f08ab59a80833fbcd563ccd4a7e29"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.714923 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" podUID="9d787187-b9bc-428b-9aa4-ff09276e19de" containerName="controller-manager" containerID="cri-o://2b15a14e0cb3b2867ba6df98695456ee47b328aefbe5883bd0f255becf8bd49e" gracePeriod=30 Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.715271 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.720754 4790 generic.go:334] "Generic (PLEG): container finished" podID="25067a9b-e553-4a7a-abdb-226567079c15" containerID="595bda2b58439a6b5b3f0511c009686c8b380737729102ea18750582b16a291d" exitCode=0 Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.721501 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mfnk" event={"ID":"25067a9b-e553-4a7a-abdb-226567079c15","Type":"ContainerDied","Data":"595bda2b58439a6b5b3f0511c009686c8b380737729102ea18750582b16a291d"} Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.733094 
4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.862377 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" podStartSLOduration=23.862353827 podStartE2EDuration="23.862353827s" podCreationTimestamp="2026-04-06 11:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:41.861007022 +0000 UTC m=+160.848749918" watchObservedRunningTime="2026-04-06 11:59:41.862353827 +0000 UTC m=+160.850096703" Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.908459 4790 patch_prober.go:28] interesting pod/controller-manager-59fdf8df58-nhpm9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.908519 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" podUID="9d787187-b9bc-428b-9aa4-ff09276e19de" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.919471 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" podStartSLOduration=23.919441226 podStartE2EDuration="23.919441226s" podCreationTimestamp="2026-04-06 11:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:41.905680702 
+0000 UTC m=+160.893423568" watchObservedRunningTime="2026-04-06 11:59:41.919441226 +0000 UTC m=+160.907184092" Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.953202 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6db6cf4595-5zmct" Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.954048 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6db6cf4595-5zmct" Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.963578 4790 patch_prober.go:28] interesting pod/console-6db6cf4595-5zmct container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.57:8443/health\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.963656 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6db6cf4595-5zmct" podUID="42775b02-6f50-4862-ae70-7cdb1800baa7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.57:8443/health\": dial tcp 10.217.0.57:8443: connect: connection refused" Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.977366 4790 patch_prober.go:28] interesting pod/route-controller-manager-6ff6c875d7-lrkgf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": read tcp 10.217.0.2:47276->10.217.0.63:8443: read: connection reset by peer" start-of-body= Apr 06 11:59:41 crc kubenswrapper[4790]: I0406 11:59:41.977889 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" podUID="65389934-50d6-49c5-9fdd-c761a0e16733" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": read tcp 10.217.0.2:47276->10.217.0.63:8443: read: connection reset 
by peer" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.402204 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-11-crc"] Apr 06 11:59:42 crc kubenswrapper[4790]: E0406 11:59:42.402626 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1aab57b-c8ed-492c-91d3-e190449713a0" containerName="kube-multus-additional-cni-plugins" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.402655 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1aab57b-c8ed-492c-91d3-e190449713a0" containerName="kube-multus-additional-cni-plugins" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.402787 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1aab57b-c8ed-492c-91d3-e190449713a0" containerName="kube-multus-additional-cni-plugins" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.425267 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.425506 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-11-crc"] Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.553608 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59d753b3-c003-45df-bce1-4f22f69b03cd-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"59d753b3-c003-45df-bce1-4f22f69b03cd\") " pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.553998 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/59d753b3-c003-45df-bce1-4f22f69b03cd-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"59d753b3-c003-45df-bce1-4f22f69b03cd\") 
" pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.656013 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59d753b3-c003-45df-bce1-4f22f69b03cd-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"59d753b3-c003-45df-bce1-4f22f69b03cd\") " pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.656091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/59d753b3-c003-45df-bce1-4f22f69b03cd-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"59d753b3-c003-45df-bce1-4f22f69b03cd\") " pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.656214 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/59d753b3-c003-45df-bce1-4f22f69b03cd-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"59d753b3-c003-45df-bce1-4f22f69b03cd\") " pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.675709 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59d753b3-c003-45df-bce1-4f22f69b03cd-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"59d753b3-c003-45df-bce1-4f22f69b03cd\") " pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.715183 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.740544 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6ff6c875d7-lrkgf_65389934-50d6-49c5-9fdd-c761a0e16733/route-controller-manager/0.log" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.740607 4790 generic.go:334] "Generic (PLEG): container finished" podID="65389934-50d6-49c5-9fdd-c761a0e16733" containerID="190d205c2cdb8643a0bd6a42fe73690457864036307fafb1f521faf58d33b2a6" exitCode=255 Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.741335 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" event={"ID":"65389934-50d6-49c5-9fdd-c761a0e16733","Type":"ContainerDied","Data":"190d205c2cdb8643a0bd6a42fe73690457864036307fafb1f521faf58d33b2a6"} Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.750687 4790 generic.go:334] "Generic (PLEG): container finished" podID="9d787187-b9bc-428b-9aa4-ff09276e19de" containerID="2b15a14e0cb3b2867ba6df98695456ee47b328aefbe5883bd0f255becf8bd49e" exitCode=0 Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.750787 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" event={"ID":"9d787187-b9bc-428b-9aa4-ff09276e19de","Type":"ContainerDied","Data":"2b15a14e0cb3b2867ba6df98695456ee47b328aefbe5883bd0f255becf8bd49e"} Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.750845 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" event={"ID":"9d787187-b9bc-428b-9aa4-ff09276e19de","Type":"ContainerDied","Data":"27d187bdd5130328741a7c569632c9ea4e2f08ab59a80833fbcd563ccd4a7e29"} Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.750870 4790 scope.go:117] 
"RemoveContainer" containerID="2b15a14e0cb3b2867ba6df98695456ee47b328aefbe5883bd0f255becf8bd49e" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.751047 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59fdf8df58-nhpm9" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.751808 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c9875f499-4zz67"] Apr 06 11:59:42 crc kubenswrapper[4790]: E0406 11:59:42.752154 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d787187-b9bc-428b-9aa4-ff09276e19de" containerName="controller-manager" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.752178 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d787187-b9bc-428b-9aa4-ff09276e19de" containerName="controller-manager" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.752307 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d787187-b9bc-428b-9aa4-ff09276e19de" containerName="controller-manager" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.752873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.757219 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.757626 4790 generic.go:334] "Generic (PLEG): container finished" podID="ed446ab9-6e6e-4012-a957-c7326a21ef09" containerID="6a814c23422e8487fdb5689c38355ee7cf334bb76c48e2c53b58a74c92bc35fd" exitCode=0 Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.757702 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-10-crc" event={"ID":"ed446ab9-6e6e-4012-a957-c7326a21ef09","Type":"ContainerDied","Data":"6a814c23422e8487fdb5689c38355ee7cf334bb76c48e2c53b58a74c92bc35fd"} Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.773102 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c9875f499-4zz67"] Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.773450 4790 generic.go:334] "Generic (PLEG): container finished" podID="305eb146-0a27-465e-955c-1c959b522d07" containerID="7f435927732c29ac92d931ce9d7da2582e2b97ba5e47d1305240783e3292cc28" exitCode=0 Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.773620 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-10-crc" podUID="5ed2aa0a-1e35-4436-b184-a680dcfd14df" containerName="installer" containerID="cri-o://246286413f6b7571200bf664027f62be790ab64bf5259adbfe65f009630c0538" gracePeriod=30 Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.773780 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"305eb146-0a27-465e-955c-1c959b522d07","Type":"ContainerDied","Data":"7f435927732c29ac92d931ce9d7da2582e2b97ba5e47d1305240783e3292cc28"} Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.850719 4790 scope.go:117] "RemoveContainer" containerID="2b15a14e0cb3b2867ba6df98695456ee47b328aefbe5883bd0f255becf8bd49e" Apr 06 
11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.852223 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-10-crc" podStartSLOduration=16.852211518 podStartE2EDuration="16.852211518s" podCreationTimestamp="2026-04-06 11:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:42.85077938 +0000 UTC m=+161.838522246" watchObservedRunningTime="2026-04-06 11:59:42.852211518 +0000 UTC m=+161.839954384" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.852564 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-7-crc" podStartSLOduration=7.852558427 podStartE2EDuration="7.852558427s" podCreationTimestamp="2026-04-06 11:59:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:42.826714434 +0000 UTC m=+161.814457300" watchObservedRunningTime="2026-04-06 11:59:42.852558427 +0000 UTC m=+161.840301283" Apr 06 11:59:42 crc kubenswrapper[4790]: E0406 11:59:42.858095 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b15a14e0cb3b2867ba6df98695456ee47b328aefbe5883bd0f255becf8bd49e\": container with ID starting with 2b15a14e0cb3b2867ba6df98695456ee47b328aefbe5883bd0f255becf8bd49e not found: ID does not exist" containerID="2b15a14e0cb3b2867ba6df98695456ee47b328aefbe5883bd0f255becf8bd49e" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.858166 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b15a14e0cb3b2867ba6df98695456ee47b328aefbe5883bd0f255becf8bd49e"} err="failed to get container status \"2b15a14e0cb3b2867ba6df98695456ee47b328aefbe5883bd0f255becf8bd49e\": rpc error: code = NotFound desc = could not find 
container \"2b15a14e0cb3b2867ba6df98695456ee47b328aefbe5883bd0f255becf8bd49e\": container with ID starting with 2b15a14e0cb3b2867ba6df98695456ee47b328aefbe5883bd0f255becf8bd49e not found: ID does not exist" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.858526 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-config\") pod \"9d787187-b9bc-428b-9aa4-ff09276e19de\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.858604 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-proxy-ca-bundles\") pod \"9d787187-b9bc-428b-9aa4-ff09276e19de\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.858632 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-client-ca\") pod \"9d787187-b9bc-428b-9aa4-ff09276e19de\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.858657 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d787187-b9bc-428b-9aa4-ff09276e19de-serving-cert\") pod \"9d787187-b9bc-428b-9aa4-ff09276e19de\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.858759 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6267c\" (UniqueName: \"kubernetes.io/projected/9d787187-b9bc-428b-9aa4-ff09276e19de-kube-api-access-6267c\") pod \"9d787187-b9bc-428b-9aa4-ff09276e19de\" (UID: \"9d787187-b9bc-428b-9aa4-ff09276e19de\") " Apr 06 11:59:42 crc 
kubenswrapper[4790]: I0406 11:59:42.859138 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-proxy-ca-bundles\") pod \"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.859165 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-699f7\" (UniqueName: \"kubernetes.io/projected/23255321-dfe3-4e65-977f-25ebfafba56b-kube-api-access-699f7\") pod \"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.859215 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-config\") pod \"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.859235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23255321-dfe3-4e65-977f-25ebfafba56b-serving-cert\") pod \"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.859354 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-client-ca\") pod \"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.859959 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9d787187-b9bc-428b-9aa4-ff09276e19de" (UID: "9d787187-b9bc-428b-9aa4-ff09276e19de"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.860568 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-config" (OuterVolumeSpecName: "config") pod "9d787187-b9bc-428b-9aa4-ff09276e19de" (UID: "9d787187-b9bc-428b-9aa4-ff09276e19de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.860726 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-client-ca" (OuterVolumeSpecName: "client-ca") pod "9d787187-b9bc-428b-9aa4-ff09276e19de" (UID: "9d787187-b9bc-428b-9aa4-ff09276e19de"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.868449 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d787187-b9bc-428b-9aa4-ff09276e19de-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d787187-b9bc-428b-9aa4-ff09276e19de" (UID: "9d787187-b9bc-428b-9aa4-ff09276e19de"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.876524 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d787187-b9bc-428b-9aa4-ff09276e19de-kube-api-access-6267c" (OuterVolumeSpecName: "kube-api-access-6267c") pod "9d787187-b9bc-428b-9aa4-ff09276e19de" (UID: "9d787187-b9bc-428b-9aa4-ff09276e19de"). InnerVolumeSpecName "kube-api-access-6267c". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.942862 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6ff6c875d7-lrkgf_65389934-50d6-49c5-9fdd-c761a0e16733/route-controller-manager/0.log" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.942985 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.961714 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-proxy-ca-bundles\") pod \"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.961800 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-699f7\" (UniqueName: \"kubernetes.io/projected/23255321-dfe3-4e65-977f-25ebfafba56b-kube-api-access-699f7\") pod \"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.961879 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-config\") pod \"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.961921 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23255321-dfe3-4e65-977f-25ebfafba56b-serving-cert\") pod \"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.964041 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-proxy-ca-bundles\") pod \"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.964788 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-client-ca\") pod \"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.965271 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.965325 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.965340 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d787187-b9bc-428b-9aa4-ff09276e19de-client-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.965349 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d787187-b9bc-428b-9aa4-ff09276e19de-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.965580 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-client-ca\") pod \"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.965678 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-config\") pod \"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.966349 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6267c\" (UniqueName: \"kubernetes.io/projected/9d787187-b9bc-428b-9aa4-ff09276e19de-kube-api-access-6267c\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.973553 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23255321-dfe3-4e65-977f-25ebfafba56b-serving-cert\") pod 
\"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:42 crc kubenswrapper[4790]: I0406 11:59:42.982919 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-699f7\" (UniqueName: \"kubernetes.io/projected/23255321-dfe3-4e65-977f-25ebfafba56b-kube-api-access-699f7\") pod \"controller-manager-6c9875f499-4zz67\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") " pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.068210 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65389934-50d6-49c5-9fdd-c761a0e16733-client-ca\") pod \"65389934-50d6-49c5-9fdd-c761a0e16733\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.069022 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65389934-50d6-49c5-9fdd-c761a0e16733-config\") pod \"65389934-50d6-49c5-9fdd-c761a0e16733\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.069067 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgfnm\" (UniqueName: \"kubernetes.io/projected/65389934-50d6-49c5-9fdd-c761a0e16733-kube-api-access-wgfnm\") pod \"65389934-50d6-49c5-9fdd-c761a0e16733\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.069164 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65389934-50d6-49c5-9fdd-c761a0e16733-serving-cert\") pod \"65389934-50d6-49c5-9fdd-c761a0e16733\" (UID: \"65389934-50d6-49c5-9fdd-c761a0e16733\") " Apr 06 
11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.069642 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65389934-50d6-49c5-9fdd-c761a0e16733-client-ca" (OuterVolumeSpecName: "client-ca") pod "65389934-50d6-49c5-9fdd-c761a0e16733" (UID: "65389934-50d6-49c5-9fdd-c761a0e16733"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.070170 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65389934-50d6-49c5-9fdd-c761a0e16733-config" (OuterVolumeSpecName: "config") pod "65389934-50d6-49c5-9fdd-c761a0e16733" (UID: "65389934-50d6-49c5-9fdd-c761a0e16733"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.072132 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65389934-50d6-49c5-9fdd-c761a0e16733-kube-api-access-wgfnm" (OuterVolumeSpecName: "kube-api-access-wgfnm") pod "65389934-50d6-49c5-9fdd-c761a0e16733" (UID: "65389934-50d6-49c5-9fdd-c761a0e16733"). InnerVolumeSpecName "kube-api-access-wgfnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.073089 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65389934-50d6-49c5-9fdd-c761a0e16733-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "65389934-50d6-49c5-9fdd-c761a0e16733" (UID: "65389934-50d6-49c5-9fdd-c761a0e16733"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.076718 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-11-crc"] Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.087562 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.125903 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59fdf8df58-nhpm9"] Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.128784 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-59fdf8df58-nhpm9"] Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.151136 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" podUID="4610d751-50bd-42d4-a947-1c494bcb4096" containerName="oauth-openshift" containerID="cri-o://331d5142c829ac03b3d4ad11b8d556154b95035abe55739872c38fc22dba0c53" gracePeriod=15 Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.172280 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65389934-50d6-49c5-9fdd-c761a0e16733-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.172320 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgfnm\" (UniqueName: \"kubernetes.io/projected/65389934-50d6-49c5-9fdd-c761a0e16733-kube-api-access-wgfnm\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.172331 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65389934-50d6-49c5-9fdd-c761a0e16733-serving-cert\") on node \"crc\" DevicePath 
\"\"" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.172342 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65389934-50d6-49c5-9fdd-c761a0e16733-client-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.685491 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d787187-b9bc-428b-9aa4-ff09276e19de" path="/var/lib/kubelet/pods/9d787187-b9bc-428b-9aa4-ff09276e19de/volumes" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.793635 4790 generic.go:334] "Generic (PLEG): container finished" podID="4610d751-50bd-42d4-a947-1c494bcb4096" containerID="331d5142c829ac03b3d4ad11b8d556154b95035abe55739872c38fc22dba0c53" exitCode=0 Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.793717 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" event={"ID":"4610d751-50bd-42d4-a947-1c494bcb4096","Type":"ContainerDied","Data":"331d5142c829ac03b3d4ad11b8d556154b95035abe55739872c38fc22dba0c53"} Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.796577 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-11-crc" event={"ID":"59d753b3-c003-45df-bce1-4f22f69b03cd","Type":"ContainerStarted","Data":"2a2f97b026cc7d2d09930f112ae4bb8b986a542e6d5380a303f013dd19eb11a3"} Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.804737 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlwwf" event={"ID":"c4be7580-8cec-4726-940d-36fb8575b791","Type":"ContainerStarted","Data":"e2f5a41a544a9b4e9d9db199769aa5a6166b001c0e25458aaeb72a312dc09673"} Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.808366 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6ff6c875d7-lrkgf_65389934-50d6-49c5-9fdd-c761a0e16733/route-controller-manager/0.log" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.808473 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" event={"ID":"65389934-50d6-49c5-9fdd-c761a0e16733","Type":"ContainerDied","Data":"2f2c74cbbfd2d0b5454490b4adc90d9ed50496aecb997fbaa68b0bcfdd09179c"} Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.808510 4790 scope.go:117] "RemoveContainer" containerID="190d205c2cdb8643a0bd6a42fe73690457864036307fafb1f521faf58d33b2a6" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.808508 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.811994 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c9875f499-4zz67"] Apr 06 11:59:43 crc kubenswrapper[4790]: W0406 11:59:43.815966 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23255321_dfe3_4e65_977f_25ebfafba56b.slice/crio-546d1ed39951eb3afec5daf52d58b355103682f347bc050e134ba1dc6fc90454 WatchSource:0}: Error finding container 546d1ed39951eb3afec5daf52d58b355103682f347bc050e134ba1dc6fc90454: Status 404 returned error can't find the container with id 546d1ed39951eb3afec5daf52d58b355103682f347bc050e134ba1dc6fc90454 Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.822964 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mlwwf" podStartSLOduration=4.349030873 podStartE2EDuration="38.822949515s" podCreationTimestamp="2026-04-06 11:59:05 +0000 UTC" firstStartedPulling="2026-04-06 
11:59:08.243883793 +0000 UTC m=+127.231626669" lastFinishedPulling="2026-04-06 11:59:42.717802445 +0000 UTC m=+161.705545311" observedRunningTime="2026-04-06 11:59:43.820119481 +0000 UTC m=+162.807862367" watchObservedRunningTime="2026-04-06 11:59:43.822949515 +0000 UTC m=+162.810692371" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.839463 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf"] Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.839546 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ff6c875d7-lrkgf"] Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.996110 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-11-crc"] Apr 06 11:59:43 crc kubenswrapper[4790]: E0406 11:59:43.996534 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65389934-50d6-49c5-9fdd-c761a0e16733" containerName="route-controller-manager" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.996561 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="65389934-50d6-49c5-9fdd-c761a0e16733" containerName="route-controller-manager" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.996689 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="65389934-50d6-49c5-9fdd-c761a0e16733" containerName="route-controller-manager" Apr 06 11:59:43 crc kubenswrapper[4790]: I0406 11:59:43.997342 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-11-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.006215 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-11-crc"] Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.045556 4790 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-75kvr container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.045638 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" podUID="4610d751-50bd-42d4-a947-1c494bcb4096" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.088455 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6523047-6809-4e97-8d1e-4c33d08aa1d6-kubelet-dir\") pod \"installer-11-crc\" (UID: \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.088517 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6523047-6809-4e97-8d1e-4c33d08aa1d6-kube-api-access\") pod \"installer-11-crc\" (UID: \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.088564 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/a6523047-6809-4e97-8d1e-4c33d08aa1d6-var-lock\") pod \"installer-11-crc\" (UID: \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.190593 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6523047-6809-4e97-8d1e-4c33d08aa1d6-kubelet-dir\") pod \"installer-11-crc\" (UID: \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.190656 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6523047-6809-4e97-8d1e-4c33d08aa1d6-kube-api-access\") pod \"installer-11-crc\" (UID: \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.190713 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a6523047-6809-4e97-8d1e-4c33d08aa1d6-var-lock\") pod \"installer-11-crc\" (UID: \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.190791 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6523047-6809-4e97-8d1e-4c33d08aa1d6-kubelet-dir\") pod \"installer-11-crc\" (UID: \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.190822 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a6523047-6809-4e97-8d1e-4c33d08aa1d6-var-lock\") pod \"installer-11-crc\" 
(UID: \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.213365 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6523047-6809-4e97-8d1e-4c33d08aa1d6-kube-api-access\") pod \"installer-11-crc\" (UID: \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\") " pod="openshift-kube-controller-manager/installer-11-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.350025 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-11-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.370043 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.375564 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.380433 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.494723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-session\") pod \"4610d751-50bd-42d4-a947-1c494bcb4096\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.494798 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/305eb146-0a27-465e-955c-1c959b522d07-kube-api-access\") pod \"305eb146-0a27-465e-955c-1c959b522d07\" (UID: \"305eb146-0a27-465e-955c-1c959b522d07\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.494883 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/305eb146-0a27-465e-955c-1c959b522d07-kubelet-dir\") pod \"305eb146-0a27-465e-955c-1c959b522d07\" (UID: \"305eb146-0a27-465e-955c-1c959b522d07\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.494959 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-idp-0-file-data\") pod \"4610d751-50bd-42d4-a947-1c494bcb4096\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.494982 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-login\") pod \"4610d751-50bd-42d4-a947-1c494bcb4096\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 
11:59:44.495034 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed446ab9-6e6e-4012-a957-c7326a21ef09-kube-api-access\") pod \"ed446ab9-6e6e-4012-a957-c7326a21ef09\" (UID: \"ed446ab9-6e6e-4012-a957-c7326a21ef09\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.495061 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-serving-cert\") pod \"4610d751-50bd-42d4-a947-1c494bcb4096\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.495199 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-audit-policies\") pod \"4610d751-50bd-42d4-a947-1c494bcb4096\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.495257 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-service-ca\") pod \"4610d751-50bd-42d4-a947-1c494bcb4096\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.495291 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-trusted-ca-bundle\") pod \"4610d751-50bd-42d4-a947-1c494bcb4096\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.495326 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-cliconfig\") pod \"4610d751-50bd-42d4-a947-1c494bcb4096\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.495353 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-error\") pod \"4610d751-50bd-42d4-a947-1c494bcb4096\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.495377 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-provider-selection\") pod \"4610d751-50bd-42d4-a947-1c494bcb4096\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.495449 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-router-certs\") pod \"4610d751-50bd-42d4-a947-1c494bcb4096\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.495477 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed446ab9-6e6e-4012-a957-c7326a21ef09-kubelet-dir\") pod \"ed446ab9-6e6e-4012-a957-c7326a21ef09\" (UID: \"ed446ab9-6e6e-4012-a957-c7326a21ef09\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.495526 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-ocp-branding-template\") pod \"4610d751-50bd-42d4-a947-1c494bcb4096\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.495599 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lpct\" (UniqueName: \"kubernetes.io/projected/4610d751-50bd-42d4-a947-1c494bcb4096-kube-api-access-5lpct\") pod \"4610d751-50bd-42d4-a947-1c494bcb4096\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.495636 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4610d751-50bd-42d4-a947-1c494bcb4096-audit-dir\") pod \"4610d751-50bd-42d4-a947-1c494bcb4096\" (UID: \"4610d751-50bd-42d4-a947-1c494bcb4096\") " Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.496105 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4610d751-50bd-42d4-a947-1c494bcb4096-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4610d751-50bd-42d4-a947-1c494bcb4096" (UID: "4610d751-50bd-42d4-a947-1c494bcb4096"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.496365 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4610d751-50bd-42d4-a947-1c494bcb4096" (UID: "4610d751-50bd-42d4-a947-1c494bcb4096"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.496374 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4610d751-50bd-42d4-a947-1c494bcb4096" (UID: "4610d751-50bd-42d4-a947-1c494bcb4096"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.496618 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/305eb146-0a27-465e-955c-1c959b522d07-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "305eb146-0a27-465e-955c-1c959b522d07" (UID: "305eb146-0a27-465e-955c-1c959b522d07"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.496716 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4610d751-50bd-42d4-a947-1c494bcb4096" (UID: "4610d751-50bd-42d4-a947-1c494bcb4096"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.497463 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4610d751-50bd-42d4-a947-1c494bcb4096" (UID: "4610d751-50bd-42d4-a947-1c494bcb4096"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.497587 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed446ab9-6e6e-4012-a957-c7326a21ef09-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ed446ab9-6e6e-4012-a957-c7326a21ef09" (UID: "ed446ab9-6e6e-4012-a957-c7326a21ef09"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.501103 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4610d751-50bd-42d4-a947-1c494bcb4096" (UID: "4610d751-50bd-42d4-a947-1c494bcb4096"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.502956 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4610d751-50bd-42d4-a947-1c494bcb4096" (UID: "4610d751-50bd-42d4-a947-1c494bcb4096"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.507545 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305eb146-0a27-465e-955c-1c959b522d07-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "305eb146-0a27-465e-955c-1c959b522d07" (UID: "305eb146-0a27-465e-955c-1c959b522d07"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.507921 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed446ab9-6e6e-4012-a957-c7326a21ef09-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ed446ab9-6e6e-4012-a957-c7326a21ef09" (UID: "ed446ab9-6e6e-4012-a957-c7326a21ef09"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.508275 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4610d751-50bd-42d4-a947-1c494bcb4096" (UID: "4610d751-50bd-42d4-a947-1c494bcb4096"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.508612 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "4610d751-50bd-42d4-a947-1c494bcb4096" (UID: "4610d751-50bd-42d4-a947-1c494bcb4096"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.508668 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4610d751-50bd-42d4-a947-1c494bcb4096-kube-api-access-5lpct" (OuterVolumeSpecName: "kube-api-access-5lpct") pod "4610d751-50bd-42d4-a947-1c494bcb4096" (UID: "4610d751-50bd-42d4-a947-1c494bcb4096"). InnerVolumeSpecName "kube-api-access-5lpct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.508891 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4610d751-50bd-42d4-a947-1c494bcb4096" (UID: "4610d751-50bd-42d4-a947-1c494bcb4096"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.509066 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4610d751-50bd-42d4-a947-1c494bcb4096" (UID: "4610d751-50bd-42d4-a947-1c494bcb4096"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.509866 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4610d751-50bd-42d4-a947-1c494bcb4096" (UID: "4610d751-50bd-42d4-a947-1c494bcb4096"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.510038 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4610d751-50bd-42d4-a947-1c494bcb4096" (UID: "4610d751-50bd-42d4-a947-1c494bcb4096"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.597276 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.597632 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.597649 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed446ab9-6e6e-4012-a957-c7326a21ef09-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.597662 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.597673 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lpct\" (UniqueName: \"kubernetes.io/projected/4610d751-50bd-42d4-a947-1c494bcb4096-kube-api-access-5lpct\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.598498 4790 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4610d751-50bd-42d4-a947-1c494bcb4096-audit-dir\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.598511 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.598522 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/305eb146-0a27-465e-955c-1c959b522d07-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.598532 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/305eb146-0a27-465e-955c-1c959b522d07-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.598543 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.598553 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.598563 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed446ab9-6e6e-4012-a957-c7326a21ef09-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.598573 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.598583 4790 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-audit-policies\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.598598 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.598608 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.598618 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.598628 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4610d751-50bd-42d4-a947-1c494bcb4096-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.714914 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-11-crc"] Apr 06 11:59:44 crc kubenswrapper[4790]: W0406 11:59:44.725141 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda6523047_6809_4e97_8d1e_4c33d08aa1d6.slice/crio-a3e1991573c5a66c3c025e87aee6067970e716b85bbb4e8ec82d66ffd98842ad WatchSource:0}: Error finding container a3e1991573c5a66c3c025e87aee6067970e716b85bbb4e8ec82d66ffd98842ad: Status 404 returned error can't find the container with id a3e1991573c5a66c3c025e87aee6067970e716b85bbb4e8ec82d66ffd98842ad 
Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.827888 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" event={"ID":"4610d751-50bd-42d4-a947-1c494bcb4096","Type":"ContainerDied","Data":"320302ab91ca339776f071b8b1d515565ff938e4f78e807de094324803d6b283"} Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.827955 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-75kvr" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.827977 4790 scope.go:117] "RemoveContainer" containerID="331d5142c829ac03b3d4ad11b8d556154b95035abe55739872c38fc22dba0c53" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.830105 4790 generic.go:334] "Generic (PLEG): container finished" podID="59d753b3-c003-45df-bce1-4f22f69b03cd" containerID="dcc89d03ec50efa1d0683ab59a94d25f354860ce8676155e87f70628b30a505e" exitCode=0 Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.830181 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-11-crc" event={"ID":"59d753b3-c003-45df-bce1-4f22f69b03cd","Type":"ContainerDied","Data":"dcc89d03ec50efa1d0683ab59a94d25f354860ce8676155e87f70628b30a505e"} Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.834002 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr22k" event={"ID":"4aa8f195-5c23-43d2-abba-e4c825b4450e","Type":"ContainerStarted","Data":"6be4f4208b1f5e788c2ad995e3b4437ea13e85b429fc25e63d7f64e2526e29e9"} Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.836344 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-crc" event={"ID":"a6523047-6809-4e97-8d1e-4c33d08aa1d6","Type":"ContainerStarted","Data":"a3e1991573c5a66c3c025e87aee6067970e716b85bbb4e8ec82d66ffd98842ad"} Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 
11:59:44.845003 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"305eb146-0a27-465e-955c-1c959b522d07","Type":"ContainerDied","Data":"60ec2549078f1ad7144052423422e40101daaba863e25fbf9cd5555e466488f5"} Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.845080 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60ec2549078f1ad7144052423422e40101daaba863e25fbf9cd5555e466488f5" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.845243 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.851783 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-10-crc" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.852028 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-10-crc" event={"ID":"ed446ab9-6e6e-4012-a957-c7326a21ef09","Type":"ContainerDied","Data":"6b724a3905d99152a98aed1215635e4d2ad646fa6283a74ba922c92b7aba4a13"} Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.852108 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b724a3905d99152a98aed1215635e4d2ad646fa6283a74ba922c92b7aba4a13" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.875076 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" event={"ID":"23255321-dfe3-4e65-977f-25ebfafba56b","Type":"ContainerStarted","Data":"4f1d9ccfe4cd5666af9926cfa880fe1aac1fe1f8077b6b6260fc7f6fb13dd06e"} Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.875140 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" 
event={"ID":"23255321-dfe3-4e65-977f-25ebfafba56b","Type":"ContainerStarted","Data":"546d1ed39951eb3afec5daf52d58b355103682f347bc050e134ba1dc6fc90454"} Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.882652 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.883435 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gr22k" podStartSLOduration=4.674030806 podStartE2EDuration="39.883412954s" podCreationTimestamp="2026-04-06 11:59:05 +0000 UTC" firstStartedPulling="2026-04-06 11:59:09.271316722 +0000 UTC m=+128.259059578" lastFinishedPulling="2026-04-06 11:59:44.48069886 +0000 UTC m=+163.468441726" observedRunningTime="2026-04-06 11:59:44.88096475 +0000 UTC m=+163.868707636" watchObservedRunningTime="2026-04-06 11:59:44.883412954 +0000 UTC m=+163.871155820" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.886926 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.907100 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mfnk" event={"ID":"25067a9b-e553-4a7a-abdb-226567079c15","Type":"ContainerStarted","Data":"ee2f81cab83dc770b07e229970749eda6311cb85aa6b89866a71d320fc614143"} Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.907656 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-75kvr"] Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.924214 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" podStartSLOduration=7.924195582 podStartE2EDuration="7.924195582s" 
podCreationTimestamp="2026-04-06 11:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:44.91654969 +0000 UTC m=+163.904292556" watchObservedRunningTime="2026-04-06 11:59:44.924195582 +0000 UTC m=+163.911938448" Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.924569 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-75kvr"] Apr 06 11:59:44 crc kubenswrapper[4790]: I0406 11:59:44.946864 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4mfnk" podStartSLOduration=3.43636411 podStartE2EDuration="42.946848211s" podCreationTimestamp="2026-04-06 11:59:02 +0000 UTC" firstStartedPulling="2026-04-06 11:59:04.056512629 +0000 UTC m=+123.044255495" lastFinishedPulling="2026-04-06 11:59:43.56699673 +0000 UTC m=+162.554739596" observedRunningTime="2026-04-06 11:59:44.944228062 +0000 UTC m=+163.931970928" watchObservedRunningTime="2026-04-06 11:59:44.946848211 +0000 UTC m=+163.934591077" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.028373 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Apr 06 11:59:45 crc kubenswrapper[4790]: E0406 11:59:45.028972 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4610d751-50bd-42d4-a947-1c494bcb4096" containerName="oauth-openshift" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.028985 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4610d751-50bd-42d4-a947-1c494bcb4096" containerName="oauth-openshift" Apr 06 11:59:45 crc kubenswrapper[4790]: E0406 11:59:45.028997 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305eb146-0a27-465e-955c-1c959b522d07" containerName="pruner" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.029003 4790 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="305eb146-0a27-465e-955c-1c959b522d07" containerName="pruner" Apr 06 11:59:45 crc kubenswrapper[4790]: E0406 11:59:45.029014 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed446ab9-6e6e-4012-a957-c7326a21ef09" containerName="pruner" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.029022 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed446ab9-6e6e-4012-a957-c7326a21ef09" containerName="pruner" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.029120 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4610d751-50bd-42d4-a947-1c494bcb4096" containerName="oauth-openshift" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.029131 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="305eb146-0a27-465e-955c-1c959b522d07" containerName="pruner" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.029140 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed446ab9-6e6e-4012-a957-c7326a21ef09" containerName="pruner" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.029491 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.034354 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.034656 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.043019 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.058652 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" podUID="9dd58353-6603-48cf-9767-11235ba23164" containerName="registry" containerID="cri-o://c60871dc3aa14258ba9c1ff1ac3ffe3cb06ddb96a1e2144c9c1ba0103d28ec69" gracePeriod=30 Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.110030 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/804d6679-7288-40ec-853e-345cf118c657-var-lock\") pod \"installer-9-crc\" (UID: \"804d6679-7288-40ec-853e-345cf118c657\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.110126 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/804d6679-7288-40ec-853e-345cf118c657-kubelet-dir\") pod \"installer-9-crc\" (UID: \"804d6679-7288-40ec-853e-345cf118c657\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.110214 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/804d6679-7288-40ec-853e-345cf118c657-kube-api-access\") pod \"installer-9-crc\" (UID: \"804d6679-7288-40ec-853e-345cf118c657\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.212304 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/804d6679-7288-40ec-853e-345cf118c657-kubelet-dir\") pod \"installer-9-crc\" (UID: \"804d6679-7288-40ec-853e-345cf118c657\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.212409 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/804d6679-7288-40ec-853e-345cf118c657-kubelet-dir\") pod \"installer-9-crc\" (UID: \"804d6679-7288-40ec-853e-345cf118c657\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.212541 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/804d6679-7288-40ec-853e-345cf118c657-kube-api-access\") pod \"installer-9-crc\" (UID: \"804d6679-7288-40ec-853e-345cf118c657\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.212689 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/804d6679-7288-40ec-853e-345cf118c657-var-lock\") pod \"installer-9-crc\" (UID: \"804d6679-7288-40ec-853e-345cf118c657\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.212973 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/804d6679-7288-40ec-853e-345cf118c657-var-lock\") pod \"installer-9-crc\" (UID: \"804d6679-7288-40ec-853e-345cf118c657\") " 
pod="openshift-kube-apiserver/installer-9-crc" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.231887 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/804d6679-7288-40ec-853e-345cf118c657-kube-api-access\") pod \"installer-9-crc\" (UID: \"804d6679-7288-40ec-853e-345cf118c657\") " pod="openshift-kube-apiserver/installer-9-crc" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.503443 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.709584 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4610d751-50bd-42d4-a947-1c494bcb4096" path="/var/lib/kubelet/pods/4610d751-50bd-42d4-a947-1c494bcb4096/volumes" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.710920 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65389934-50d6-49c5-9fdd-c761a0e16733" path="/var/lib/kubelet/pods/65389934-50d6-49c5-9fdd-c761a0e16733/volumes" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.735209 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mlwwf" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.735294 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mlwwf" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.742979 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.823488 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-bound-sa-token\") pod \"9dd58353-6603-48cf-9767-11235ba23164\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.823577 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9dd58353-6603-48cf-9767-11235ba23164-trusted-ca\") pod \"9dd58353-6603-48cf-9767-11235ba23164\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.823613 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-registry-tls\") pod \"9dd58353-6603-48cf-9767-11235ba23164\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.823635 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9dd58353-6603-48cf-9767-11235ba23164-installation-pull-secrets\") pod \"9dd58353-6603-48cf-9767-11235ba23164\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.823700 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9dd58353-6603-48cf-9767-11235ba23164-registry-certificates\") pod \"9dd58353-6603-48cf-9767-11235ba23164\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.823746 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nxxpq\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-kube-api-access-nxxpq\") pod \"9dd58353-6603-48cf-9767-11235ba23164\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.824028 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9dd58353-6603-48cf-9767-11235ba23164\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.824068 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9dd58353-6603-48cf-9767-11235ba23164-ca-trust-extracted\") pod \"9dd58353-6603-48cf-9767-11235ba23164\" (UID: \"9dd58353-6603-48cf-9767-11235ba23164\") " Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.828578 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dd58353-6603-48cf-9767-11235ba23164-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9dd58353-6603-48cf-9767-11235ba23164" (UID: "9dd58353-6603-48cf-9767-11235ba23164"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.829023 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dd58353-6603-48cf-9767-11235ba23164-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9dd58353-6603-48cf-9767-11235ba23164" (UID: "9dd58353-6603-48cf-9767-11235ba23164"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.834317 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-kube-api-access-nxxpq" (OuterVolumeSpecName: "kube-api-access-nxxpq") pod "9dd58353-6603-48cf-9767-11235ba23164" (UID: "9dd58353-6603-48cf-9767-11235ba23164"). InnerVolumeSpecName "kube-api-access-nxxpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.836002 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9dd58353-6603-48cf-9767-11235ba23164" (UID: "9dd58353-6603-48cf-9767-11235ba23164"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.836443 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9dd58353-6603-48cf-9767-11235ba23164" (UID: "9dd58353-6603-48cf-9767-11235ba23164"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.837294 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dd58353-6603-48cf-9767-11235ba23164-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9dd58353-6603-48cf-9767-11235ba23164" (UID: "9dd58353-6603-48cf-9767-11235ba23164"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.848540 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "9dd58353-6603-48cf-9767-11235ba23164" (UID: "9dd58353-6603-48cf-9767-11235ba23164"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.855771 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dd58353-6603-48cf-9767-11235ba23164-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9dd58353-6603-48cf-9767-11235ba23164" (UID: "9dd58353-6603-48cf-9767-11235ba23164"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.916974 4790 generic.go:334] "Generic (PLEG): container finished" podID="9dd58353-6603-48cf-9767-11235ba23164" containerID="c60871dc3aa14258ba9c1ff1ac3ffe3cb06ddb96a1e2144c9c1ba0103d28ec69" exitCode=0 Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.917044 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" event={"ID":"9dd58353-6603-48cf-9767-11235ba23164","Type":"ContainerDied","Data":"c60871dc3aa14258ba9c1ff1ac3ffe3cb06ddb96a1e2144c9c1ba0103d28ec69"} Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.917072 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" event={"ID":"9dd58353-6603-48cf-9767-11235ba23164","Type":"ContainerDied","Data":"4053189cfb469a78b598614688c7738fe9b5e50e5b6d6f04de4a6e3f498e6ae1"} Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.917088 4790 scope.go:117] 
"RemoveContainer" containerID="c60871dc3aa14258ba9c1ff1ac3ffe3cb06ddb96a1e2144c9c1ba0103d28ec69" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.917176 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-l7zk6" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.925890 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-bound-sa-token\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.925934 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9dd58353-6603-48cf-9767-11235ba23164-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.925946 4790 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-registry-tls\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.925957 4790 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9dd58353-6603-48cf-9767-11235ba23164-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.925967 4790 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9dd58353-6603-48cf-9767-11235ba23164-registry-certificates\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.925976 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxxpq\" (UniqueName: \"kubernetes.io/projected/9dd58353-6603-48cf-9767-11235ba23164-kube-api-access-nxxpq\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 
11:59:45.925987 4790 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9dd58353-6603-48cf-9767-11235ba23164-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.929806 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-crc" event={"ID":"a6523047-6809-4e97-8d1e-4c33d08aa1d6","Type":"ContainerStarted","Data":"c4d7c6173eca80a218789d788426e3cbe8a4dd292a3cea6e1dd8ee24cb8e5437"} Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.951081 4790 scope.go:117] "RemoveContainer" containerID="c60871dc3aa14258ba9c1ff1ac3ffe3cb06ddb96a1e2144c9c1ba0103d28ec69" Apr 06 11:59:45 crc kubenswrapper[4790]: E0406 11:59:45.956210 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c60871dc3aa14258ba9c1ff1ac3ffe3cb06ddb96a1e2144c9c1ba0103d28ec69\": container with ID starting with c60871dc3aa14258ba9c1ff1ac3ffe3cb06ddb96a1e2144c9c1ba0103d28ec69 not found: ID does not exist" containerID="c60871dc3aa14258ba9c1ff1ac3ffe3cb06ddb96a1e2144c9c1ba0103d28ec69" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.956284 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60871dc3aa14258ba9c1ff1ac3ffe3cb06ddb96a1e2144c9c1ba0103d28ec69"} err="failed to get container status \"c60871dc3aa14258ba9c1ff1ac3ffe3cb06ddb96a1e2144c9c1ba0103d28ec69\": rpc error: code = NotFound desc = could not find container \"c60871dc3aa14258ba9c1ff1ac3ffe3cb06ddb96a1e2144c9c1ba0103d28ec69\": container with ID starting with c60871dc3aa14258ba9c1ff1ac3ffe3cb06ddb96a1e2144c9c1ba0103d28ec69 not found: ID does not exist" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.971737 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-11-crc" 
podStartSLOduration=2.971640776 podStartE2EDuration="2.971640776s" podCreationTimestamp="2026-04-06 11:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:45.951282038 +0000 UTC m=+164.939024904" watchObservedRunningTime="2026-04-06 11:59:45.971640776 +0000 UTC m=+164.959383642" Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.974190 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-l7zk6"] Apr 06 11:59:45 crc kubenswrapper[4790]: I0406 11:59:45.977570 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-l7zk6"] Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.056970 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.057021 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.108848 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.250779 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.331450 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59d753b3-c003-45df-bce1-4f22f69b03cd-kube-api-access\") pod \"59d753b3-c003-45df-bce1-4f22f69b03cd\" (UID: \"59d753b3-c003-45df-bce1-4f22f69b03cd\") " Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.331500 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/59d753b3-c003-45df-bce1-4f22f69b03cd-kubelet-dir\") pod \"59d753b3-c003-45df-bce1-4f22f69b03cd\" (UID: \"59d753b3-c003-45df-bce1-4f22f69b03cd\") " Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.331815 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/59d753b3-c003-45df-bce1-4f22f69b03cd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "59d753b3-c003-45df-bce1-4f22f69b03cd" (UID: "59d753b3-c003-45df-bce1-4f22f69b03cd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.338698 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59d753b3-c003-45df-bce1-4f22f69b03cd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "59d753b3-c003-45df-bce1-4f22f69b03cd" (UID: "59d753b3-c003-45df-bce1-4f22f69b03cd"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.433269 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/59d753b3-c003-45df-bce1-4f22f69b03cd-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.433306 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/59d753b3-c003-45df-bce1-4f22f69b03cd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.939593 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"804d6679-7288-40ec-853e-345cf118c657","Type":"ContainerStarted","Data":"b1ee3c31cfe2be7da725700eb5fb3726425e8a36157526c1852526ee9aa3e340"} Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.939875 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"804d6679-7288-40ec-853e-345cf118c657","Type":"ContainerStarted","Data":"3a362273248972e6cd4b92c064f0186c29146fe85b5e664f4fa3a63fbaa77988"} Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.943322 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-11-crc" event={"ID":"59d753b3-c003-45df-bce1-4f22f69b03cd","Type":"ContainerDied","Data":"2a2f97b026cc7d2d09930f112ae4bb8b986a542e6d5380a303f013dd19eb11a3"} Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.943368 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a2f97b026cc7d2d09930f112ae4bb8b986a542e6d5380a303f013dd19eb11a3" Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.943389 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-11-crc" Apr 06 11:59:46 crc kubenswrapper[4790]: I0406 11:59:46.963767 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.9637501880000001 podStartE2EDuration="1.963750188s" podCreationTimestamp="2026-04-06 11:59:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:46.961413866 +0000 UTC m=+165.949156732" watchObservedRunningTime="2026-04-06 11:59:46.963750188 +0000 UTC m=+165.951493054" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.015303 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mlwwf" podUID="c4be7580-8cec-4726-940d-36fb8575b791" containerName="registry-server" probeResult="failure" output=< Apr 06 11:59:47 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 11:59:47 crc kubenswrapper[4790]: > Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.096171 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gr22k" podUID="4aa8f195-5c23-43d2-abba-e4c825b4450e" containerName="registry-server" probeResult="failure" output=< Apr 06 11:59:47 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 11:59:47 crc kubenswrapper[4790]: > Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.430736 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g"] Apr 06 11:59:47 crc kubenswrapper[4790]: E0406 11:59:47.431002 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd58353-6603-48cf-9767-11235ba23164" containerName="registry" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.431017 4790 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9dd58353-6603-48cf-9767-11235ba23164" containerName="registry" Apr 06 11:59:47 crc kubenswrapper[4790]: E0406 11:59:47.431032 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d753b3-c003-45df-bce1-4f22f69b03cd" containerName="pruner" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.431039 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d753b3-c003-45df-bce1-4f22f69b03cd" containerName="pruner" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.431161 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd58353-6603-48cf-9767-11235ba23164" containerName="registry" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.431173 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="59d753b3-c003-45df-bce1-4f22f69b03cd" containerName="pruner" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.431537 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.436318 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.436318 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.437037 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.437221 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.437543 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.437808 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.454707 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g"] Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.551477 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a29490d3-9c68-4268-a9ce-ab57198d7793-config\") pod \"route-controller-manager-5c54549b8b-zmf5g\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") " pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.551571 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a29490d3-9c68-4268-a9ce-ab57198d7793-client-ca\") pod \"route-controller-manager-5c54549b8b-zmf5g\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") " pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.551628 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a29490d3-9c68-4268-a9ce-ab57198d7793-serving-cert\") pod \"route-controller-manager-5c54549b8b-zmf5g\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") " pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.551668 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2hp7\" (UniqueName: 
\"kubernetes.io/projected/a29490d3-9c68-4268-a9ce-ab57198d7793-kube-api-access-m2hp7\") pod \"route-controller-manager-5c54549b8b-zmf5g\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") " pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.652847 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a29490d3-9c68-4268-a9ce-ab57198d7793-config\") pod \"route-controller-manager-5c54549b8b-zmf5g\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") " pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.652934 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a29490d3-9c68-4268-a9ce-ab57198d7793-client-ca\") pod \"route-controller-manager-5c54549b8b-zmf5g\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") " pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.652978 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a29490d3-9c68-4268-a9ce-ab57198d7793-serving-cert\") pod \"route-controller-manager-5c54549b8b-zmf5g\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") " pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.653006 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2hp7\" (UniqueName: \"kubernetes.io/projected/a29490d3-9c68-4268-a9ce-ab57198d7793-kube-api-access-m2hp7\") pod \"route-controller-manager-5c54549b8b-zmf5g\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") " 
pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.654532 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a29490d3-9c68-4268-a9ce-ab57198d7793-config\") pod \"route-controller-manager-5c54549b8b-zmf5g\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") " pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.654792 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a29490d3-9c68-4268-a9ce-ab57198d7793-client-ca\") pod \"route-controller-manager-5c54549b8b-zmf5g\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") " pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.662576 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a29490d3-9c68-4268-a9ce-ab57198d7793-serving-cert\") pod \"route-controller-manager-5c54549b8b-zmf5g\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") " pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.676957 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2hp7\" (UniqueName: \"kubernetes.io/projected/a29490d3-9c68-4268-a9ce-ab57198d7793-kube-api-access-m2hp7\") pod \"route-controller-manager-5c54549b8b-zmf5g\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") " pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.686474 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dd58353-6603-48cf-9767-11235ba23164" 
path="/var/lib/kubelet/pods/9dd58353-6603-48cf-9767-11235ba23164/volumes" Apr 06 11:59:47 crc kubenswrapper[4790]: I0406 11:59:47.755231 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.018907 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g"] Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.430540 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj"] Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.431452 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.436446 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.436690 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.437515 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.437950 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.438194 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.438225 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-service-ca" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.438561 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.438672 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.438859 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.438864 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.442863 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.447697 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.448415 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.456275 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj"] Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.460148 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.461181 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Apr 06 11:59:48 crc 
kubenswrapper[4790]: I0406 11:59:48.579246 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0d86c525-b5ed-49c3-b95c-1e1075add472-audit-dir\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.579593 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-audit-policies\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.579643 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.579686 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-error\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.579720 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.579750 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-service-ca\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.579777 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.579803 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nq8g\" (UniqueName: \"kubernetes.io/projected/0d86c525-b5ed-49c3-b95c-1e1075add472-kube-api-access-4nq8g\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.579853 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" 
(UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.579885 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.579909 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-router-certs\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.579957 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-login\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.579986 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc 
kubenswrapper[4790]: I0406 11:59:48.580013 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-session\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.682076 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-error\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.682157 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.682188 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-service-ca\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.682212 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.682237 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nq8g\" (UniqueName: \"kubernetes.io/projected/0d86c525-b5ed-49c3-b95c-1e1075add472-kube-api-access-4nq8g\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.682257 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.682283 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.682300 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-router-certs\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " 
pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.682330 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-login\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.682353 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.682371 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-session\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.682401 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0d86c525-b5ed-49c3-b95c-1e1075add472-audit-dir\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.682421 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-audit-policies\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.682446 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.683658 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0d86c525-b5ed-49c3-b95c-1e1075add472-audit-dir\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.684924 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-audit-policies\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.684955 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-service-ca\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.685132 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.686397 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.690206 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-router-certs\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.691586 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-login\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.692401 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.693375 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.694565 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-session\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.696701 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.698095 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.706661 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-error\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.718395 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nq8g\" (UniqueName: \"kubernetes.io/projected/0d86c525-b5ed-49c3-b95c-1e1075add472-kube-api-access-4nq8g\") pod \"oauth-openshift-6dbc6cf4d9-5zqlj\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.748013 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.966198 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" event={"ID":"a29490d3-9c68-4268-a9ce-ab57198d7793","Type":"ContainerStarted","Data":"136952368aeffba4ae451d8e1dfe78a77c46609be9bb973eb1eb13b1bf439ee9"} Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.967147 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.967168 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" event={"ID":"a29490d3-9c68-4268-a9ce-ab57198d7793","Type":"ContainerStarted","Data":"9d3fda8149100c112212718fcd477dad611421c908454dd694206d220a721e9e"} Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.979959 4790 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" Apr 06 11:59:48 crc kubenswrapper[4790]: I0406 11:59:48.999608 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" podStartSLOduration=10.999577466 podStartE2EDuration="10.999577466s" podCreationTimestamp="2026-04-06 11:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:48.992784827 +0000 UTC m=+167.980527713" watchObservedRunningTime="2026-04-06 11:59:48.999577466 +0000 UTC m=+167.987320342" Apr 06 11:59:49 crc kubenswrapper[4790]: I0406 11:59:49.079947 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj"] Apr 06 11:59:49 crc kubenswrapper[4790]: I0406 11:59:49.973813 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" event={"ID":"0d86c525-b5ed-49c3-b95c-1e1075add472","Type":"ContainerStarted","Data":"598b5477d8da6e675e051c6ca43cdfd29ae2ed6a27248c3c294dadc2b1925491"} Apr 06 11:59:49 crc kubenswrapper[4790]: I0406 11:59:49.974251 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" event={"ID":"0d86c525-b5ed-49c3-b95c-1e1075add472","Type":"ContainerStarted","Data":"82ca1975d88d479676cb5e2e199f21e80a606050429b8506b8e617eb514271fe"} Apr 06 11:59:49 crc kubenswrapper[4790]: I0406 11:59:49.974285 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 11:59:50 crc kubenswrapper[4790]: I0406 11:59:50.003668 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" 
podStartSLOduration=32.003650104 podStartE2EDuration="32.003650104s" podCreationTimestamp="2026-04-06 11:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 11:59:50.001375253 +0000 UTC m=+168.989118159" watchObservedRunningTime="2026-04-06 11:59:50.003650104 +0000 UTC m=+168.991392960"
Apr 06 11:59:50 crc kubenswrapper[4790]: I0406 11:59:50.223821 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj"
Apr 06 11:59:51 crc kubenswrapper[4790]: I0406 11:59:51.952142 4790 patch_prober.go:28] interesting pod/console-6db6cf4595-5zmct container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.57:8443/health\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body=
Apr 06 11:59:51 crc kubenswrapper[4790]: I0406 11:59:51.952628 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6db6cf4595-5zmct" podUID="42775b02-6f50-4862-ae70-7cdb1800baa7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.57:8443/health\": dial tcp 10.217.0.57:8443: connect: connection refused"
Apr 06 11:59:52 crc kubenswrapper[4790]: I0406 11:59:52.756727 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4mfnk"
Apr 06 11:59:52 crc kubenswrapper[4790]: I0406 11:59:52.756784 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4mfnk"
Apr 06 11:59:52 crc kubenswrapper[4790]: I0406 11:59:52.810769 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4mfnk"
Apr 06 11:59:52 crc kubenswrapper[4790]: I0406 11:59:52.998685 4790 generic.go:334] "Generic (PLEG): container finished" podID="a30678d8-35eb-4863-a856-096864c2a9b1" containerID="4472c38ed0d47047495ef44e1bf3c47b36ab7a2747ba57dce8ffde6024c38dc1" exitCode=0
Apr 06 11:59:52 crc kubenswrapper[4790]: I0406 11:59:52.998838 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42rgx" event={"ID":"a30678d8-35eb-4863-a856-096864c2a9b1","Type":"ContainerDied","Data":"4472c38ed0d47047495ef44e1bf3c47b36ab7a2747ba57dce8ffde6024c38dc1"}
Apr 06 11:59:53 crc kubenswrapper[4790]: I0406 11:59:53.077020 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4mfnk"
Apr 06 11:59:54 crc kubenswrapper[4790]: I0406 11:59:54.009004 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42rgx" event={"ID":"a30678d8-35eb-4863-a856-096864c2a9b1","Type":"ContainerStarted","Data":"bc22db92c2c9486259f5aa31f52f267884a89288ee7f94157927a276a6294c46"}
Apr 06 11:59:54 crc kubenswrapper[4790]: I0406 11:59:54.027246 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-42rgx" podStartSLOduration=2.716237855 podStartE2EDuration="51.027226538s" podCreationTimestamp="2026-04-06 11:59:03 +0000 UTC" firstStartedPulling="2026-04-06 11:59:05.108068801 +0000 UTC m=+124.095811667" lastFinishedPulling="2026-04-06 11:59:53.419057484 +0000 UTC m=+172.406800350" observedRunningTime="2026-04-06 11:59:54.026489029 +0000 UTC m=+173.014231895" watchObservedRunningTime="2026-04-06 11:59:54.027226538 +0000 UTC m=+173.014969404"
Apr 06 11:59:54 crc kubenswrapper[4790]: I0406 11:59:54.235692 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4mfnk"]
Apr 06 11:59:54 crc kubenswrapper[4790]: I0406 11:59:54.262611 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-42rgx"
Apr 06 11:59:54 crc kubenswrapper[4790]: I0406 11:59:54.262679 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-42rgx"
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.017189 4790 generic.go:334] "Generic (PLEG): container finished" podID="c176f8e8-2902-4da6-b779-d0426b68e715" containerID="efce8367f2ef1bb168d96d65d272f7151d97df980e3e8498c5ee23e49984effb" exitCode=0
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.017267 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hfrm" event={"ID":"c176f8e8-2902-4da6-b779-d0426b68e715","Type":"ContainerDied","Data":"efce8367f2ef1bb168d96d65d272f7151d97df980e3e8498c5ee23e49984effb"}
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.017876 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4mfnk" podUID="25067a9b-e553-4a7a-abdb-226567079c15" containerName="registry-server" containerID="cri-o://ee2f81cab83dc770b07e229970749eda6311cb85aa6b89866a71d320fc614143" gracePeriod=2
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.315408 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-42rgx" podUID="a30678d8-35eb-4863-a856-096864c2a9b1" containerName="registry-server" probeResult="failure" output=<
Apr 06 11:59:55 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s
Apr 06 11:59:55 crc kubenswrapper[4790]: >
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.483719 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4mfnk"
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.597746 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25067a9b-e553-4a7a-abdb-226567079c15-utilities\") pod \"25067a9b-e553-4a7a-abdb-226567079c15\" (UID: \"25067a9b-e553-4a7a-abdb-226567079c15\") "
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.597859 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j57r8\" (UniqueName: \"kubernetes.io/projected/25067a9b-e553-4a7a-abdb-226567079c15-kube-api-access-j57r8\") pod \"25067a9b-e553-4a7a-abdb-226567079c15\" (UID: \"25067a9b-e553-4a7a-abdb-226567079c15\") "
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.597959 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25067a9b-e553-4a7a-abdb-226567079c15-catalog-content\") pod \"25067a9b-e553-4a7a-abdb-226567079c15\" (UID: \"25067a9b-e553-4a7a-abdb-226567079c15\") "
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.599168 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25067a9b-e553-4a7a-abdb-226567079c15-utilities" (OuterVolumeSpecName: "utilities") pod "25067a9b-e553-4a7a-abdb-226567079c15" (UID: "25067a9b-e553-4a7a-abdb-226567079c15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.606949 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25067a9b-e553-4a7a-abdb-226567079c15-kube-api-access-j57r8" (OuterVolumeSpecName: "kube-api-access-j57r8") pod "25067a9b-e553-4a7a-abdb-226567079c15" (UID: "25067a9b-e553-4a7a-abdb-226567079c15"). InnerVolumeSpecName "kube-api-access-j57r8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.651486 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25067a9b-e553-4a7a-abdb-226567079c15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25067a9b-e553-4a7a-abdb-226567079c15" (UID: "25067a9b-e553-4a7a-abdb-226567079c15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.701205 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j57r8\" (UniqueName: \"kubernetes.io/projected/25067a9b-e553-4a7a-abdb-226567079c15-kube-api-access-j57r8\") on node \"crc\" DevicePath \"\""
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.701240 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25067a9b-e553-4a7a-abdb-226567079c15-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.701250 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25067a9b-e553-4a7a-abdb-226567079c15-utilities\") on node \"crc\" DevicePath \"\""
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.776751 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mlwwf"
Apr 06 11:59:55 crc kubenswrapper[4790]: I0406 11:59:55.816259 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mlwwf"
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.028282 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hfrm" event={"ID":"c176f8e8-2902-4da6-b779-d0426b68e715","Type":"ContainerStarted","Data":"30936552da913e44060fd6e1ab97d3f0ed81a63d76b5d55cecd30d6d6569df4a"}
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.033127 4790 generic.go:334] "Generic (PLEG): container finished" podID="25067a9b-e553-4a7a-abdb-226567079c15" containerID="ee2f81cab83dc770b07e229970749eda6311cb85aa6b89866a71d320fc614143" exitCode=0
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.033206 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4mfnk"
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.033239 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mfnk" event={"ID":"25067a9b-e553-4a7a-abdb-226567079c15","Type":"ContainerDied","Data":"ee2f81cab83dc770b07e229970749eda6311cb85aa6b89866a71d320fc614143"}
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.034104 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mfnk" event={"ID":"25067a9b-e553-4a7a-abdb-226567079c15","Type":"ContainerDied","Data":"cfd05a1e3b46d3495738ac36fab286aa110ae634691e8425424daf167cbabed3"}
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.034159 4790 scope.go:117] "RemoveContainer" containerID="ee2f81cab83dc770b07e229970749eda6311cb85aa6b89866a71d320fc614143"
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.039424 4790 generic.go:334] "Generic (PLEG): container finished" podID="cc03077c-04a5-4562-b371-78270cf891ac" containerID="69d9c282377cbd2ef313c7961ec338afc3c912e1b0d55f43a91a39544bd66e62" exitCode=0
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.039548 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klccw" event={"ID":"cc03077c-04a5-4562-b371-78270cf891ac","Type":"ContainerDied","Data":"69d9c282377cbd2ef313c7961ec338afc3c912e1b0d55f43a91a39544bd66e62"}
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.047354 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2hfrm" podStartSLOduration=2.667930838 podStartE2EDuration="54.047337991s" podCreationTimestamp="2026-04-06 11:59:02 +0000 UTC" firstStartedPulling="2026-04-06 11:59:04.02330649 +0000 UTC m=+123.011049356" lastFinishedPulling="2026-04-06 11:59:55.402713653 +0000 UTC m=+174.390456509" observedRunningTime="2026-04-06 11:59:56.04350616 +0000 UTC m=+175.031249026" watchObservedRunningTime="2026-04-06 11:59:56.047337991 +0000 UTC m=+175.035080857"
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.059164 4790 scope.go:117] "RemoveContainer" containerID="595bda2b58439a6b5b3f0511c009686c8b380737729102ea18750582b16a291d"
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.080519 4790 scope.go:117] "RemoveContainer" containerID="a22e5cc6265eb54ed0f5a8def9fbf79c270ba00e8ce3097625ab07446a601ea8"
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.090107 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4mfnk"]
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.092350 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4mfnk"]
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.107359 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gr22k"
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.109804 4790 scope.go:117] "RemoveContainer" containerID="ee2f81cab83dc770b07e229970749eda6311cb85aa6b89866a71d320fc614143"
Apr 06 11:59:56 crc kubenswrapper[4790]: E0406 11:59:56.110438 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee2f81cab83dc770b07e229970749eda6311cb85aa6b89866a71d320fc614143\": container with ID starting with ee2f81cab83dc770b07e229970749eda6311cb85aa6b89866a71d320fc614143 not found: ID does not exist" containerID="ee2f81cab83dc770b07e229970749eda6311cb85aa6b89866a71d320fc614143"
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.110486 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2f81cab83dc770b07e229970749eda6311cb85aa6b89866a71d320fc614143"} err="failed to get container status \"ee2f81cab83dc770b07e229970749eda6311cb85aa6b89866a71d320fc614143\": rpc error: code = NotFound desc = could not find container \"ee2f81cab83dc770b07e229970749eda6311cb85aa6b89866a71d320fc614143\": container with ID starting with ee2f81cab83dc770b07e229970749eda6311cb85aa6b89866a71d320fc614143 not found: ID does not exist"
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.110528 4790 scope.go:117] "RemoveContainer" containerID="595bda2b58439a6b5b3f0511c009686c8b380737729102ea18750582b16a291d"
Apr 06 11:59:56 crc kubenswrapper[4790]: E0406 11:59:56.111254 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"595bda2b58439a6b5b3f0511c009686c8b380737729102ea18750582b16a291d\": container with ID starting with 595bda2b58439a6b5b3f0511c009686c8b380737729102ea18750582b16a291d not found: ID does not exist" containerID="595bda2b58439a6b5b3f0511c009686c8b380737729102ea18750582b16a291d"
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.111302 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595bda2b58439a6b5b3f0511c009686c8b380737729102ea18750582b16a291d"} err="failed to get container status \"595bda2b58439a6b5b3f0511c009686c8b380737729102ea18750582b16a291d\": rpc error: code = NotFound desc = could not find container \"595bda2b58439a6b5b3f0511c009686c8b380737729102ea18750582b16a291d\": container with ID starting with 595bda2b58439a6b5b3f0511c009686c8b380737729102ea18750582b16a291d not found: ID does not exist"
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.111342 4790 scope.go:117] "RemoveContainer" containerID="a22e5cc6265eb54ed0f5a8def9fbf79c270ba00e8ce3097625ab07446a601ea8"
Apr 06 11:59:56 crc kubenswrapper[4790]: E0406 11:59:56.111755 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a22e5cc6265eb54ed0f5a8def9fbf79c270ba00e8ce3097625ab07446a601ea8\": container with ID starting with a22e5cc6265eb54ed0f5a8def9fbf79c270ba00e8ce3097625ab07446a601ea8 not found: ID does not exist" containerID="a22e5cc6265eb54ed0f5a8def9fbf79c270ba00e8ce3097625ab07446a601ea8"
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.111799 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a22e5cc6265eb54ed0f5a8def9fbf79c270ba00e8ce3097625ab07446a601ea8"} err="failed to get container status \"a22e5cc6265eb54ed0f5a8def9fbf79c270ba00e8ce3097625ab07446a601ea8\": rpc error: code = NotFound desc = could not find container \"a22e5cc6265eb54ed0f5a8def9fbf79c270ba00e8ce3097625ab07446a601ea8\": container with ID starting with a22e5cc6265eb54ed0f5a8def9fbf79c270ba00e8ce3097625ab07446a601ea8 not found: ID does not exist"
Apr 06 11:59:56 crc kubenswrapper[4790]: I0406 11:59:56.147118 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gr22k"
Apr 06 11:59:57 crc kubenswrapper[4790]: I0406 11:59:57.046523 4790 generic.go:334] "Generic (PLEG): container finished" podID="311eb251-79b6-4e1e-a3a7-456322ca133e" containerID="1be661ed2762a0b0bd9e6f2bb632f076e93669ffe3857c6a90e00f581a3dbec4" exitCode=0
Apr 06 11:59:57 crc kubenswrapper[4790]: I0406 11:59:57.046598 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96trx" event={"ID":"311eb251-79b6-4e1e-a3a7-456322ca133e","Type":"ContainerDied","Data":"1be661ed2762a0b0bd9e6f2bb632f076e93669ffe3857c6a90e00f581a3dbec4"}
Apr 06 11:59:57 crc kubenswrapper[4790]: I0406 11:59:57.050460 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klccw" event={"ID":"cc03077c-04a5-4562-b371-78270cf891ac","Type":"ContainerStarted","Data":"497044373d3098272dad636ef0a7854e07b3ab134103f31e771654fb93c955b8"}
Apr 06 11:59:57 crc kubenswrapper[4790]: I0406 11:59:57.053080 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-msclk" event={"ID":"b9685a39-63cf-47d3-b5fe-9113d55676d4","Type":"ContainerStarted","Data":"eb978cc6bc38bc43b3049f35c926fd11f6e6535716b0dc816b4bc05027784863"}
Apr 06 11:59:57 crc kubenswrapper[4790]: I0406 11:59:57.115172 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-klccw" podStartSLOduration=2.806946552 podStartE2EDuration="53.115153883s" podCreationTimestamp="2026-04-06 11:59:04 +0000 UTC" firstStartedPulling="2026-04-06 11:59:06.172694989 +0000 UTC m=+125.160437855" lastFinishedPulling="2026-04-06 11:59:56.48090232 +0000 UTC m=+175.468645186" observedRunningTime="2026-04-06 11:59:57.111151698 +0000 UTC m=+176.098894564" watchObservedRunningTime="2026-04-06 11:59:57.115153883 +0000 UTC m=+176.102896749"
Apr 06 11:59:57 crc kubenswrapper[4790]: I0406 11:59:57.682095 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25067a9b-e553-4a7a-abdb-226567079c15" path="/var/lib/kubelet/pods/25067a9b-e553-4a7a-abdb-226567079c15/volumes"
Apr 06 11:59:57 crc kubenswrapper[4790]: I0406 11:59:57.985691 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c9875f499-4zz67"]
Apr 06 11:59:57 crc kubenswrapper[4790]: I0406 11:59:57.985951 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" podUID="23255321-dfe3-4e65-977f-25ebfafba56b" containerName="controller-manager" containerID="cri-o://4f1d9ccfe4cd5666af9926cfa880fe1aac1fe1f8077b6b6260fc7f6fb13dd06e" gracePeriod=30
Apr 06 11:59:58 crc kubenswrapper[4790]: I0406 11:59:58.002997 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g"]
Apr 06 11:59:58 crc kubenswrapper[4790]: I0406 11:59:58.003212 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" podUID="a29490d3-9c68-4268-a9ce-ab57198d7793" containerName="route-controller-manager" containerID="cri-o://136952368aeffba4ae451d8e1dfe78a77c46609be9bb973eb1eb13b1bf439ee9" gracePeriod=30
Apr 06 11:59:58 crc kubenswrapper[4790]: I0406 11:59:58.063007 4790 generic.go:334] "Generic (PLEG): container finished" podID="b9685a39-63cf-47d3-b5fe-9113d55676d4" containerID="eb978cc6bc38bc43b3049f35c926fd11f6e6535716b0dc816b4bc05027784863" exitCode=0
Apr 06 11:59:58 crc kubenswrapper[4790]: I0406 11:59:58.063089 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-msclk" event={"ID":"b9685a39-63cf-47d3-b5fe-9113d55676d4","Type":"ContainerDied","Data":"eb978cc6bc38bc43b3049f35c926fd11f6e6535716b0dc816b4bc05027784863"}
Apr 06 11:59:58 crc kubenswrapper[4790]: I0406 11:59:58.065078 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96trx" event={"ID":"311eb251-79b6-4e1e-a3a7-456322ca133e","Type":"ContainerStarted","Data":"59e6e1b342e3291b99e433a55648dcd953aff92372a6ae30f2fe506ccbf458dd"}
Apr 06 11:59:58 crc kubenswrapper[4790]: I0406 11:59:58.110321 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-96trx" podStartSLOduration=3.779026229 podStartE2EDuration="57.110306636s" podCreationTimestamp="2026-04-06 11:59:01 +0000 UTC" firstStartedPulling="2026-04-06 11:59:04.081339039 +0000 UTC m=+123.069081905" lastFinishedPulling="2026-04-06 11:59:57.412619446 +0000 UTC m=+176.400362312" observedRunningTime="2026-04-06 11:59:58.10666596 +0000 UTC m=+177.094408846" watchObservedRunningTime="2026-04-06 11:59:58.110306636 +0000 UTC m=+177.098049502"
Apr 06 11:59:58 crc kubenswrapper[4790]: I0406 11:59:58.866444 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-965d55b94-4rql6"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.021450 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.052289 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"]
Apr 06 11:59:59 crc kubenswrapper[4790]: E0406 11:59:59.052508 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25067a9b-e553-4a7a-abdb-226567079c15" containerName="registry-server"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.052522 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="25067a9b-e553-4a7a-abdb-226567079c15" containerName="registry-server"
Apr 06 11:59:59 crc kubenswrapper[4790]: E0406 11:59:59.052537 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25067a9b-e553-4a7a-abdb-226567079c15" containerName="extract-content"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.052545 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="25067a9b-e553-4a7a-abdb-226567079c15" containerName="extract-content"
Apr 06 11:59:59 crc kubenswrapper[4790]: E0406 11:59:59.052562 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25067a9b-e553-4a7a-abdb-226567079c15" containerName="extract-utilities"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.052572 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="25067a9b-e553-4a7a-abdb-226567079c15" containerName="extract-utilities"
Apr 06 11:59:59 crc kubenswrapper[4790]: E0406 11:59:59.052584 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29490d3-9c68-4268-a9ce-ab57198d7793" containerName="route-controller-manager"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.052592 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29490d3-9c68-4268-a9ce-ab57198d7793" containerName="route-controller-manager"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.052708 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29490d3-9c68-4268-a9ce-ab57198d7793" containerName="route-controller-manager"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.052721 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="25067a9b-e553-4a7a-abdb-226567079c15" containerName="registry-server"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.053198 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.075179 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-msclk" event={"ID":"b9685a39-63cf-47d3-b5fe-9113d55676d4","Type":"ContainerStarted","Data":"5718550e527619db215cdf193c8018a764b083c9e4d15ee2811973e31b03b77b"}
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.077866 4790 generic.go:334] "Generic (PLEG): container finished" podID="23255321-dfe3-4e65-977f-25ebfafba56b" containerID="4f1d9ccfe4cd5666af9926cfa880fe1aac1fe1f8077b6b6260fc7f6fb13dd06e" exitCode=0
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.077943 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" event={"ID":"23255321-dfe3-4e65-977f-25ebfafba56b","Type":"ContainerDied","Data":"4f1d9ccfe4cd5666af9926cfa880fe1aac1fe1f8077b6b6260fc7f6fb13dd06e"}
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.080253 4790 generic.go:334] "Generic (PLEG): container finished" podID="a29490d3-9c68-4268-a9ce-ab57198d7793" containerID="136952368aeffba4ae451d8e1dfe78a77c46609be9bb973eb1eb13b1bf439ee9" exitCode=0
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.080285 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" event={"ID":"a29490d3-9c68-4268-a9ce-ab57198d7793","Type":"ContainerDied","Data":"136952368aeffba4ae451d8e1dfe78a77c46609be9bb973eb1eb13b1bf439ee9"}
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.080309 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g" event={"ID":"a29490d3-9c68-4268-a9ce-ab57198d7793","Type":"ContainerDied","Data":"9d3fda8149100c112212718fcd477dad611421c908454dd694206d220a721e9e"}
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.080316 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.080329 4790 scope.go:117] "RemoveContainer" containerID="136952368aeffba4ae451d8e1dfe78a77c46609be9bb973eb1eb13b1bf439ee9"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.099510 4790 scope.go:117] "RemoveContainer" containerID="136952368aeffba4ae451d8e1dfe78a77c46609be9bb973eb1eb13b1bf439ee9"
Apr 06 11:59:59 crc kubenswrapper[4790]: E0406 11:59:59.100361 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"136952368aeffba4ae451d8e1dfe78a77c46609be9bb973eb1eb13b1bf439ee9\": container with ID starting with 136952368aeffba4ae451d8e1dfe78a77c46609be9bb973eb1eb13b1bf439ee9 not found: ID does not exist" containerID="136952368aeffba4ae451d8e1dfe78a77c46609be9bb973eb1eb13b1bf439ee9"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.100488 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136952368aeffba4ae451d8e1dfe78a77c46609be9bb973eb1eb13b1bf439ee9"} err="failed to get container status \"136952368aeffba4ae451d8e1dfe78a77c46609be9bb973eb1eb13b1bf439ee9\": rpc error: code = NotFound desc = could not find container \"136952368aeffba4ae451d8e1dfe78a77c46609be9bb973eb1eb13b1bf439ee9\": container with ID starting with 136952368aeffba4ae451d8e1dfe78a77c46609be9bb973eb1eb13b1bf439ee9 not found: ID does not exist"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.102506 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"]
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.102706 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-msclk" podStartSLOduration=2.676156491 podStartE2EDuration="57.102692536s" podCreationTimestamp="2026-04-06 11:59:02 +0000 UTC" firstStartedPulling="2026-04-06 11:59:04.053625387 +0000 UTC m=+123.041368253" lastFinishedPulling="2026-04-06 11:59:58.480161432 +0000 UTC m=+177.467904298" observedRunningTime="2026-04-06 11:59:59.097820437 +0000 UTC m=+178.085563313" watchObservedRunningTime="2026-04-06 11:59:59.102692536 +0000 UTC m=+178.090435402"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.158203 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a29490d3-9c68-4268-a9ce-ab57198d7793-serving-cert\") pod \"a29490d3-9c68-4268-a9ce-ab57198d7793\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") "
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.158271 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2hp7\" (UniqueName: \"kubernetes.io/projected/a29490d3-9c68-4268-a9ce-ab57198d7793-kube-api-access-m2hp7\") pod \"a29490d3-9c68-4268-a9ce-ab57198d7793\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") "
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.158342 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a29490d3-9c68-4268-a9ce-ab57198d7793-config\") pod \"a29490d3-9c68-4268-a9ce-ab57198d7793\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") "
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.158434 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a29490d3-9c68-4268-a9ce-ab57198d7793-client-ca\") pod \"a29490d3-9c68-4268-a9ce-ab57198d7793\" (UID: \"a29490d3-9c68-4268-a9ce-ab57198d7793\") "
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.158741 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb925f43-3504-488a-a2be-a8957067b40c-config\") pod \"route-controller-manager-59fc5646d-7nzmc\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.158808 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb925f43-3504-488a-a2be-a8957067b40c-serving-cert\") pod \"route-controller-manager-59fc5646d-7nzmc\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.158860 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb925f43-3504-488a-a2be-a8957067b40c-client-ca\") pod \"route-controller-manager-59fc5646d-7nzmc\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.158894 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxrsz\" (UniqueName: \"kubernetes.io/projected/bb925f43-3504-488a-a2be-a8957067b40c-kube-api-access-xxrsz\") pod \"route-controller-manager-59fc5646d-7nzmc\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.159345 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a29490d3-9c68-4268-a9ce-ab57198d7793-client-ca" (OuterVolumeSpecName: "client-ca") pod "a29490d3-9c68-4268-a9ce-ab57198d7793" (UID: "a29490d3-9c68-4268-a9ce-ab57198d7793"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.159552 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a29490d3-9c68-4268-a9ce-ab57198d7793-config" (OuterVolumeSpecName: "config") pod "a29490d3-9c68-4268-a9ce-ab57198d7793" (UID: "a29490d3-9c68-4268-a9ce-ab57198d7793"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.161115 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.167981 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29490d3-9c68-4268-a9ce-ab57198d7793-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a29490d3-9c68-4268-a9ce-ab57198d7793" (UID: "a29490d3-9c68-4268-a9ce-ab57198d7793"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.168060 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a29490d3-9c68-4268-a9ce-ab57198d7793-kube-api-access-m2hp7" (OuterVolumeSpecName: "kube-api-access-m2hp7") pod "a29490d3-9c68-4268-a9ce-ab57198d7793" (UID: "a29490d3-9c68-4268-a9ce-ab57198d7793"). InnerVolumeSpecName "kube-api-access-m2hp7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.260263 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-client-ca\") pod \"23255321-dfe3-4e65-977f-25ebfafba56b\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") "
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.260982 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-699f7\" (UniqueName: \"kubernetes.io/projected/23255321-dfe3-4e65-977f-25ebfafba56b-kube-api-access-699f7\") pod \"23255321-dfe3-4e65-977f-25ebfafba56b\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") "
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.261112 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23255321-dfe3-4e65-977f-25ebfafba56b-serving-cert\") pod \"23255321-dfe3-4e65-977f-25ebfafba56b\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") "
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.261142 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-proxy-ca-bundles\") pod \"23255321-dfe3-4e65-977f-25ebfafba56b\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") "
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.261238 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-config\") pod \"23255321-dfe3-4e65-977f-25ebfafba56b\" (UID: \"23255321-dfe3-4e65-977f-25ebfafba56b\") "
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.261468 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb925f43-3504-488a-a2be-a8957067b40c-config\") pod \"route-controller-manager-59fc5646d-7nzmc\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.261525 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb925f43-3504-488a-a2be-a8957067b40c-serving-cert\") pod \"route-controller-manager-59fc5646d-7nzmc\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.261555 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb925f43-3504-488a-a2be-a8957067b40c-client-ca\") pod \"route-controller-manager-59fc5646d-7nzmc\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.261586 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxrsz\" (UniqueName: \"kubernetes.io/projected/bb925f43-3504-488a-a2be-a8957067b40c-kube-api-access-xxrsz\") pod \"route-controller-manager-59fc5646d-7nzmc\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.261671 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a29490d3-9c68-4268-a9ce-ab57198d7793-serving-cert\") on node \"crc\" DevicePath \"\""
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.261685 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2hp7\" (UniqueName: \"kubernetes.io/projected/a29490d3-9c68-4268-a9ce-ab57198d7793-kube-api-access-m2hp7\") on node \"crc\" DevicePath \"\""
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.261698 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a29490d3-9c68-4268-a9ce-ab57198d7793-config\") on node \"crc\" DevicePath \"\""
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.261850 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a29490d3-9c68-4268-a9ce-ab57198d7793-client-ca\") on node \"crc\" DevicePath \"\""
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.262269 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "23255321-dfe3-4e65-977f-25ebfafba56b" (UID: "23255321-dfe3-4e65-977f-25ebfafba56b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.262423 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-client-ca" (OuterVolumeSpecName: "client-ca") pod "23255321-dfe3-4e65-977f-25ebfafba56b" (UID: "23255321-dfe3-4e65-977f-25ebfafba56b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.262461 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-config" (OuterVolumeSpecName: "config") pod "23255321-dfe3-4e65-977f-25ebfafba56b" (UID: "23255321-dfe3-4e65-977f-25ebfafba56b"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.262543 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb925f43-3504-488a-a2be-a8957067b40c-client-ca\") pod \"route-controller-manager-59fc5646d-7nzmc\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.262856 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb925f43-3504-488a-a2be-a8957067b40c-config\") pod \"route-controller-manager-59fc5646d-7nzmc\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.265515 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb925f43-3504-488a-a2be-a8957067b40c-serving-cert\") pod \"route-controller-manager-59fc5646d-7nzmc\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.265677 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23255321-dfe3-4e65-977f-25ebfafba56b-kube-api-access-699f7" (OuterVolumeSpecName: "kube-api-access-699f7") pod "23255321-dfe3-4e65-977f-25ebfafba56b" (UID: "23255321-dfe3-4e65-977f-25ebfafba56b"). InnerVolumeSpecName "kube-api-access-699f7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.272023 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23255321-dfe3-4e65-977f-25ebfafba56b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "23255321-dfe3-4e65-977f-25ebfafba56b" (UID: "23255321-dfe3-4e65-977f-25ebfafba56b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.285708 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxrsz\" (UniqueName: \"kubernetes.io/projected/bb925f43-3504-488a-a2be-a8957067b40c-kube-api-access-xxrsz\") pod \"route-controller-manager-59fc5646d-7nzmc\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.362969 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23255321-dfe3-4e65-977f-25ebfafba56b-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.363002 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.363016 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-config\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.363025 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23255321-dfe3-4e65-977f-25ebfafba56b-client-ca\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:59 crc 
kubenswrapper[4790]: I0406 11:59:59.363034 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-699f7\" (UniqueName: \"kubernetes.io/projected/23255321-dfe3-4e65-977f-25ebfafba56b-kube-api-access-699f7\") on node \"crc\" DevicePath \"\"" Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.368914 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.415549 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g"] Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.417654 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c54549b8b-zmf5g"] Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.647031 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"] Apr 06 11:59:59 crc kubenswrapper[4790]: I0406 11:59:59.684856 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a29490d3-9c68-4268-a9ce-ab57198d7793" path="/var/lib/kubelet/pods/a29490d3-9c68-4268-a9ce-ab57198d7793/volumes" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.037269 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gr22k"] Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.037528 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gr22k" podUID="4aa8f195-5c23-43d2-abba-e4c825b4450e" containerName="registry-server" containerID="cri-o://6be4f4208b1f5e788c2ad995e3b4437ea13e85b429fc25e63d7f64e2526e29e9" gracePeriod=2 Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.087527 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" event={"ID":"bb925f43-3504-488a-a2be-a8957067b40c","Type":"ContainerStarted","Data":"661e3b45ced01bc8f6b958d333b4e25397e2de1a8726e4eb6c2ecff796cf84f3"} Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.087582 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" event={"ID":"bb925f43-3504-488a-a2be-a8957067b40c","Type":"ContainerStarted","Data":"0e7848b3b04da3ec6ed84e791c044cab1046f9e6750a7ae35dee11e6fcf4d688"} Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.087877 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.090614 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" event={"ID":"23255321-dfe3-4e65-977f-25ebfafba56b","Type":"ContainerDied","Data":"546d1ed39951eb3afec5daf52d58b355103682f347bc050e134ba1dc6fc90454"} Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.090664 4790 scope.go:117] "RemoveContainer" containerID="4f1d9ccfe4cd5666af9926cfa880fe1aac1fe1f8077b6b6260fc7f6fb13dd06e" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.090705 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c9875f499-4zz67" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.104597 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" podStartSLOduration=2.104579915 podStartE2EDuration="2.104579915s" podCreationTimestamp="2026-04-06 11:59:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:00:00.101915335 +0000 UTC m=+179.089658201" watchObservedRunningTime="2026-04-06 12:00:00.104579915 +0000 UTC m=+179.092322781" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.140355 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c9875f499-4zz67"] Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.148371 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz"] Apr 06 12:00:00 crc kubenswrapper[4790]: E0406 12:00:00.148669 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23255321-dfe3-4e65-977f-25ebfafba56b" containerName="controller-manager" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.148682 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="23255321-dfe3-4e65-977f-25ebfafba56b" containerName="controller-manager" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.148806 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="23255321-dfe3-4e65-977f-25ebfafba56b" containerName="controller-manager" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.149216 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.165276 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.165503 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.173031 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c9875f499-4zz67"] Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.184657 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz"] Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.193572 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.235963 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591280-h6r65"] Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.236927 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591280-h6r65" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.239687 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591280-h6r65"] Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.244066 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.244149 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.244070 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.280627 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d509c077-8950-44b6-99fb-7d4ebf81f4da-config-volume\") pod \"collect-profiles-29591280-pqhhz\" (UID: \"d509c077-8950-44b6-99fb-7d4ebf81f4da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.280702 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d509c077-8950-44b6-99fb-7d4ebf81f4da-secret-volume\") pod \"collect-profiles-29591280-pqhhz\" (UID: \"d509c077-8950-44b6-99fb-7d4ebf81f4da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.280781 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nvfs\" (UniqueName: \"kubernetes.io/projected/d509c077-8950-44b6-99fb-7d4ebf81f4da-kube-api-access-9nvfs\") pod \"collect-profiles-29591280-pqhhz\" (UID: 
\"d509c077-8950-44b6-99fb-7d4ebf81f4da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.351522 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-686fc65c-fdzvb" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.382538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d509c077-8950-44b6-99fb-7d4ebf81f4da-config-volume\") pod \"collect-profiles-29591280-pqhhz\" (UID: \"d509c077-8950-44b6-99fb-7d4ebf81f4da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.382620 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d509c077-8950-44b6-99fb-7d4ebf81f4da-secret-volume\") pod \"collect-profiles-29591280-pqhhz\" (UID: \"d509c077-8950-44b6-99fb-7d4ebf81f4da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.382696 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npzzc\" (UniqueName: \"kubernetes.io/projected/8abd742f-e504-47d0-ab97-5befd3609dd7-kube-api-access-npzzc\") pod \"auto-csr-approver-29591280-h6r65\" (UID: \"8abd742f-e504-47d0-ab97-5befd3609dd7\") " pod="openshift-infra/auto-csr-approver-29591280-h6r65" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.382741 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nvfs\" (UniqueName: \"kubernetes.io/projected/d509c077-8950-44b6-99fb-7d4ebf81f4da-kube-api-access-9nvfs\") pod \"collect-profiles-29591280-pqhhz\" (UID: \"d509c077-8950-44b6-99fb-7d4ebf81f4da\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.383797 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d509c077-8950-44b6-99fb-7d4ebf81f4da-config-volume\") pod \"collect-profiles-29591280-pqhhz\" (UID: \"d509c077-8950-44b6-99fb-7d4ebf81f4da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.392858 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d509c077-8950-44b6-99fb-7d4ebf81f4da-secret-volume\") pod \"collect-profiles-29591280-pqhhz\" (UID: \"d509c077-8950-44b6-99fb-7d4ebf81f4da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.405045 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-965d55b94-4rql6"] Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.416816 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nvfs\" (UniqueName: \"kubernetes.io/projected/d509c077-8950-44b6-99fb-7d4ebf81f4da-kube-api-access-9nvfs\") pod \"collect-profiles-29591280-pqhhz\" (UID: \"d509c077-8950-44b6-99fb-7d4ebf81f4da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.484221 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npzzc\" (UniqueName: \"kubernetes.io/projected/8abd742f-e504-47d0-ab97-5befd3609dd7-kube-api-access-npzzc\") pod \"auto-csr-approver-29591280-h6r65\" (UID: \"8abd742f-e504-47d0-ab97-5befd3609dd7\") " pod="openshift-infra/auto-csr-approver-29591280-h6r65" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.501080 4790 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.510682 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npzzc\" (UniqueName: \"kubernetes.io/projected/8abd742f-e504-47d0-ab97-5befd3609dd7-kube-api-access-npzzc\") pod \"auto-csr-approver-29591280-h6r65\" (UID: \"8abd742f-e504-47d0-ab97-5befd3609dd7\") " pod="openshift-infra/auto-csr-approver-29591280-h6r65" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.555045 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591280-h6r65" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.769001 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591280-h6r65"] Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.898342 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.950540 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz"] Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.997255 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f195-5c23-43d2-abba-e4c825b4450e-utilities\") pod \"4aa8f195-5c23-43d2-abba-e4c825b4450e\" (UID: \"4aa8f195-5c23-43d2-abba-e4c825b4450e\") " Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.997452 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdbgc\" (UniqueName: \"kubernetes.io/projected/4aa8f195-5c23-43d2-abba-e4c825b4450e-kube-api-access-cdbgc\") pod \"4aa8f195-5c23-43d2-abba-e4c825b4450e\" (UID: \"4aa8f195-5c23-43d2-abba-e4c825b4450e\") " Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.997475 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f195-5c23-43d2-abba-e4c825b4450e-catalog-content\") pod \"4aa8f195-5c23-43d2-abba-e4c825b4450e\" (UID: \"4aa8f195-5c23-43d2-abba-e4c825b4450e\") " Apr 06 12:00:00 crc kubenswrapper[4790]: I0406 12:00:00.999523 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa8f195-5c23-43d2-abba-e4c825b4450e-utilities" (OuterVolumeSpecName: "utilities") pod "4aa8f195-5c23-43d2-abba-e4c825b4450e" (UID: "4aa8f195-5c23-43d2-abba-e4c825b4450e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.009097 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aa8f195-5c23-43d2-abba-e4c825b4450e-kube-api-access-cdbgc" (OuterVolumeSpecName: "kube-api-access-cdbgc") pod "4aa8f195-5c23-43d2-abba-e4c825b4450e" (UID: "4aa8f195-5c23-43d2-abba-e4c825b4450e"). InnerVolumeSpecName "kube-api-access-cdbgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.098559 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdbgc\" (UniqueName: \"kubernetes.io/projected/4aa8f195-5c23-43d2-abba-e4c825b4450e-kube-api-access-cdbgc\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.098604 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f195-5c23-43d2-abba-e4c825b4450e-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.103141 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" event={"ID":"d509c077-8950-44b6-99fb-7d4ebf81f4da","Type":"ContainerStarted","Data":"739c5afefd94bd8265cde40d1657834acfef08087a4c2dc7150d139d8d1785b0"} Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.106270 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591280-h6r65" event={"ID":"8abd742f-e504-47d0-ab97-5befd3609dd7","Type":"ContainerStarted","Data":"5fa93c3b9a69007fbaca83ddd43d83b305166828d57d1f9071acc6e8c634c44e"} Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.108548 4790 generic.go:334] "Generic (PLEG): container finished" podID="4aa8f195-5c23-43d2-abba-e4c825b4450e" containerID="6be4f4208b1f5e788c2ad995e3b4437ea13e85b429fc25e63d7f64e2526e29e9" exitCode=0 Apr 06 
12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.108876 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gr22k" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.108891 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr22k" event={"ID":"4aa8f195-5c23-43d2-abba-e4c825b4450e","Type":"ContainerDied","Data":"6be4f4208b1f5e788c2ad995e3b4437ea13e85b429fc25e63d7f64e2526e29e9"} Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.108939 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gr22k" event={"ID":"4aa8f195-5c23-43d2-abba-e4c825b4450e","Type":"ContainerDied","Data":"90253820fa37c973dfb4a2d5e58563587a90f2dfb2258dbcaf6bcd1653f9abbc"} Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.108960 4790 scope.go:117] "RemoveContainer" containerID="6be4f4208b1f5e788c2ad995e3b4437ea13e85b429fc25e63d7f64e2526e29e9" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.153708 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa8f195-5c23-43d2-abba-e4c825b4450e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4aa8f195-5c23-43d2-abba-e4c825b4450e" (UID: "4aa8f195-5c23-43d2-abba-e4c825b4450e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.158592 4790 scope.go:117] "RemoveContainer" containerID="a1d02f2b3fc50e988c01ec065baa3761ee7e08cc5a46bd137586e13d05ce2d23" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.175474 4790 scope.go:117] "RemoveContainer" containerID="f60e0e45c4d785f21ce9d86907fff0803dad4dd58e77ab5ce9e5fb33d847aa5a" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.199645 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f195-5c23-43d2-abba-e4c825b4450e-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.205656 4790 scope.go:117] "RemoveContainer" containerID="6be4f4208b1f5e788c2ad995e3b4437ea13e85b429fc25e63d7f64e2526e29e9" Apr 06 12:00:01 crc kubenswrapper[4790]: E0406 12:00:01.206389 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6be4f4208b1f5e788c2ad995e3b4437ea13e85b429fc25e63d7f64e2526e29e9\": container with ID starting with 6be4f4208b1f5e788c2ad995e3b4437ea13e85b429fc25e63d7f64e2526e29e9 not found: ID does not exist" containerID="6be4f4208b1f5e788c2ad995e3b4437ea13e85b429fc25e63d7f64e2526e29e9" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.206425 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6be4f4208b1f5e788c2ad995e3b4437ea13e85b429fc25e63d7f64e2526e29e9"} err="failed to get container status \"6be4f4208b1f5e788c2ad995e3b4437ea13e85b429fc25e63d7f64e2526e29e9\": rpc error: code = NotFound desc = could not find container \"6be4f4208b1f5e788c2ad995e3b4437ea13e85b429fc25e63d7f64e2526e29e9\": container with ID starting with 6be4f4208b1f5e788c2ad995e3b4437ea13e85b429fc25e63d7f64e2526e29e9 not found: ID does not exist" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.206449 4790 
scope.go:117] "RemoveContainer" containerID="a1d02f2b3fc50e988c01ec065baa3761ee7e08cc5a46bd137586e13d05ce2d23" Apr 06 12:00:01 crc kubenswrapper[4790]: E0406 12:00:01.206742 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1d02f2b3fc50e988c01ec065baa3761ee7e08cc5a46bd137586e13d05ce2d23\": container with ID starting with a1d02f2b3fc50e988c01ec065baa3761ee7e08cc5a46bd137586e13d05ce2d23 not found: ID does not exist" containerID="a1d02f2b3fc50e988c01ec065baa3761ee7e08cc5a46bd137586e13d05ce2d23" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.206768 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1d02f2b3fc50e988c01ec065baa3761ee7e08cc5a46bd137586e13d05ce2d23"} err="failed to get container status \"a1d02f2b3fc50e988c01ec065baa3761ee7e08cc5a46bd137586e13d05ce2d23\": rpc error: code = NotFound desc = could not find container \"a1d02f2b3fc50e988c01ec065baa3761ee7e08cc5a46bd137586e13d05ce2d23\": container with ID starting with a1d02f2b3fc50e988c01ec065baa3761ee7e08cc5a46bd137586e13d05ce2d23 not found: ID does not exist" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.206783 4790 scope.go:117] "RemoveContainer" containerID="f60e0e45c4d785f21ce9d86907fff0803dad4dd58e77ab5ce9e5fb33d847aa5a" Apr 06 12:00:01 crc kubenswrapper[4790]: E0406 12:00:01.207179 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f60e0e45c4d785f21ce9d86907fff0803dad4dd58e77ab5ce9e5fb33d847aa5a\": container with ID starting with f60e0e45c4d785f21ce9d86907fff0803dad4dd58e77ab5ce9e5fb33d847aa5a not found: ID does not exist" containerID="f60e0e45c4d785f21ce9d86907fff0803dad4dd58e77ab5ce9e5fb33d847aa5a" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.207199 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f60e0e45c4d785f21ce9d86907fff0803dad4dd58e77ab5ce9e5fb33d847aa5a"} err="failed to get container status \"f60e0e45c4d785f21ce9d86907fff0803dad4dd58e77ab5ce9e5fb33d847aa5a\": rpc error: code = NotFound desc = could not find container \"f60e0e45c4d785f21ce9d86907fff0803dad4dd58e77ab5ce9e5fb33d847aa5a\": container with ID starting with f60e0e45c4d785f21ce9d86907fff0803dad4dd58e77ab5ce9e5fb33d847aa5a not found: ID does not exist" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.442681 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cc46b5549-6nr5f"] Apr 06 12:00:01 crc kubenswrapper[4790]: E0406 12:00:01.442952 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa8f195-5c23-43d2-abba-e4c825b4450e" containerName="registry-server" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.442965 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa8f195-5c23-43d2-abba-e4c825b4450e" containerName="registry-server" Apr 06 12:00:01 crc kubenswrapper[4790]: E0406 12:00:01.442978 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa8f195-5c23-43d2-abba-e4c825b4450e" containerName="extract-content" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.442986 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa8f195-5c23-43d2-abba-e4c825b4450e" containerName="extract-content" Apr 06 12:00:01 crc kubenswrapper[4790]: E0406 12:00:01.442996 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa8f195-5c23-43d2-abba-e4c825b4450e" containerName="extract-utilities" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.443002 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa8f195-5c23-43d2-abba-e4c825b4450e" containerName="extract-utilities" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.443097 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4aa8f195-5c23-43d2-abba-e4c825b4450e" containerName="registry-server" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.443474 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.447021 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.447147 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.448959 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.449005 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.448966 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.451146 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.456175 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.457016 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cc46b5549-6nr5f"] Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.504353 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-224kk\" (UniqueName: 
\"kubernetes.io/projected/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-kube-api-access-224kk\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.504403 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-client-ca\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.504649 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-config\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.504805 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-proxy-ca-bundles\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.506955 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-serving-cert\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 
12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.522335 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gr22k"] Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.525803 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gr22k"] Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.607598 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-config\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.607639 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-proxy-ca-bundles\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.607665 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-serving-cert\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.607716 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-224kk\" (UniqueName: \"kubernetes.io/projected/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-kube-api-access-224kk\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " 
pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.608499 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-client-ca\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.612760 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.612783 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.618220 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.619138 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.620164 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-config\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.620810 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-proxy-ca-bundles\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " 
pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.621004 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-client-ca\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.630896 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.640073 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-serving-cert\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.641604 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.653261 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-224kk\" (UniqueName: \"kubernetes.io/projected/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-kube-api-access-224kk\") pod \"controller-manager-6cc46b5549-6nr5f\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.684559 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23255321-dfe3-4e65-977f-25ebfafba56b" path="/var/lib/kubelet/pods/23255321-dfe3-4e65-977f-25ebfafba56b/volumes" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.685548 
4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aa8f195-5c23-43d2-abba-e4c825b4450e" path="/var/lib/kubelet/pods/4aa8f195-5c23-43d2-abba-e4c825b4450e/volumes" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.822513 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.831148 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.957985 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6db6cf4595-5zmct" Apr 06 12:00:01 crc kubenswrapper[4790]: I0406 12:00:01.961697 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6db6cf4595-5zmct" Apr 06 12:00:02 crc kubenswrapper[4790]: I0406 12:00:02.021478 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dkg9c"] Apr 06 12:00:02 crc kubenswrapper[4790]: I0406 12:00:02.028098 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cc46b5549-6nr5f"] Apr 06 12:00:02 crc kubenswrapper[4790]: W0406 12:00:02.042115 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod964c6e60_51e3_487b_91e8_b3ecc21ccdd8.slice/crio-9dc3d9446a40ba535293d7ef79aab50b7f568ba66481ead7fae0537436f7c5d8 WatchSource:0}: Error finding container 9dc3d9446a40ba535293d7ef79aab50b7f568ba66481ead7fae0537436f7c5d8: Status 404 returned error can't find the container with id 9dc3d9446a40ba535293d7ef79aab50b7f568ba66481ead7fae0537436f7c5d8 Apr 06 12:00:02 crc kubenswrapper[4790]: I0406 12:00:02.132115 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" event={"ID":"964c6e60-51e3-487b-91e8-b3ecc21ccdd8","Type":"ContainerStarted","Data":"9dc3d9446a40ba535293d7ef79aab50b7f568ba66481ead7fae0537436f7c5d8"} Apr 06 12:00:02 crc kubenswrapper[4790]: I0406 12:00:02.136940 4790 generic.go:334] "Generic (PLEG): container finished" podID="d509c077-8950-44b6-99fb-7d4ebf81f4da" containerID="8f3fec1b92ce16a9d4b3ddfa7a76e1315fd31ad32d1639449661a27f263cdaf5" exitCode=0 Apr 06 12:00:02 crc kubenswrapper[4790]: I0406 12:00:02.138332 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" event={"ID":"d509c077-8950-44b6-99fb-7d4ebf81f4da","Type":"ContainerDied","Data":"8f3fec1b92ce16a9d4b3ddfa7a76e1315fd31ad32d1639449661a27f263cdaf5"} Apr 06 12:00:02 crc kubenswrapper[4790]: I0406 12:00:02.292895 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-96trx" Apr 06 12:00:02 crc kubenswrapper[4790]: I0406 12:00:02.292955 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-96trx" Apr 06 12:00:02 crc kubenswrapper[4790]: I0406 12:00:02.335561 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-96trx" Apr 06 12:00:02 crc kubenswrapper[4790]: I0406 12:00:02.536522 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-msclk" Apr 06 12:00:02 crc kubenswrapper[4790]: I0406 12:00:02.536593 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-msclk" Apr 06 12:00:02 crc kubenswrapper[4790]: I0406 12:00:02.576572 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-msclk" Apr 06 12:00:03 crc 
kubenswrapper[4790]: I0406 12:00:03.058931 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.059017 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.121009 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.147978 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" event={"ID":"964c6e60-51e3-487b-91e8-b3ecc21ccdd8","Type":"ContainerStarted","Data":"d7369a0ea7c9fdde8eba69f5d42cb2cd66c949ce479a9e6cc89e71c9c592ee5c"} Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.176046 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" podStartSLOduration=6.175943013 podStartE2EDuration="6.175943013s" podCreationTimestamp="2026-04-06 11:59:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:00:03.172622056 +0000 UTC m=+182.160364952" watchObservedRunningTime="2026-04-06 12:00:03.175943013 +0000 UTC m=+182.163685889" Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.198999 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-96trx" Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.207240 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-msclk" Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.207595 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.504231 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.658173 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d509c077-8950-44b6-99fb-7d4ebf81f4da-config-volume\") pod \"d509c077-8950-44b6-99fb-7d4ebf81f4da\" (UID: \"d509c077-8950-44b6-99fb-7d4ebf81f4da\") " Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.658526 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nvfs\" (UniqueName: \"kubernetes.io/projected/d509c077-8950-44b6-99fb-7d4ebf81f4da-kube-api-access-9nvfs\") pod \"d509c077-8950-44b6-99fb-7d4ebf81f4da\" (UID: \"d509c077-8950-44b6-99fb-7d4ebf81f4da\") " Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.659281 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d509c077-8950-44b6-99fb-7d4ebf81f4da-config-volume" (OuterVolumeSpecName: "config-volume") pod "d509c077-8950-44b6-99fb-7d4ebf81f4da" (UID: "d509c077-8950-44b6-99fb-7d4ebf81f4da"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.659796 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d509c077-8950-44b6-99fb-7d4ebf81f4da-secret-volume\") pod \"d509c077-8950-44b6-99fb-7d4ebf81f4da\" (UID: \"d509c077-8950-44b6-99fb-7d4ebf81f4da\") " Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.660080 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d509c077-8950-44b6-99fb-7d4ebf81f4da-config-volume\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.665074 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d509c077-8950-44b6-99fb-7d4ebf81f4da-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d509c077-8950-44b6-99fb-7d4ebf81f4da" (UID: "d509c077-8950-44b6-99fb-7d4ebf81f4da"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.665193 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d509c077-8950-44b6-99fb-7d4ebf81f4da-kube-api-access-9nvfs" (OuterVolumeSpecName: "kube-api-access-9nvfs") pod "d509c077-8950-44b6-99fb-7d4ebf81f4da" (UID: "d509c077-8950-44b6-99fb-7d4ebf81f4da"). InnerVolumeSpecName "kube-api-access-9nvfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.762973 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nvfs\" (UniqueName: \"kubernetes.io/projected/d509c077-8950-44b6-99fb-7d4ebf81f4da-kube-api-access-9nvfs\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:03 crc kubenswrapper[4790]: I0406 12:00:03.763022 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d509c077-8950-44b6-99fb-7d4ebf81f4da-secret-volume\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:04 crc kubenswrapper[4790]: I0406 12:00:04.155522 4790 generic.go:334] "Generic (PLEG): container finished" podID="4029e155-0c45-49cb-a25b-ddb1f768a88f" containerID="574bee7f89a434a796c3dc9fc8f439107867110b7ceb45c4918ca3fd11bee774" exitCode=0 Apr 06 12:00:04 crc kubenswrapper[4790]: I0406 12:00:04.155598 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" event={"ID":"4029e155-0c45-49cb-a25b-ddb1f768a88f","Type":"ContainerDied","Data":"574bee7f89a434a796c3dc9fc8f439107867110b7ceb45c4918ca3fd11bee774"} Apr 06 12:00:04 crc kubenswrapper[4790]: I0406 12:00:04.156130 4790 scope.go:117] "RemoveContainer" containerID="574bee7f89a434a796c3dc9fc8f439107867110b7ceb45c4918ca3fd11bee774" Apr 06 12:00:04 crc kubenswrapper[4790]: I0406 12:00:04.165993 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" Apr 06 12:00:04 crc kubenswrapper[4790]: I0406 12:00:04.166067 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz" event={"ID":"d509c077-8950-44b6-99fb-7d4ebf81f4da","Type":"ContainerDied","Data":"739c5afefd94bd8265cde40d1657834acfef08087a4c2dc7150d139d8d1785b0"} Apr 06 12:00:04 crc kubenswrapper[4790]: I0406 12:00:04.166139 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="739c5afefd94bd8265cde40d1657834acfef08087a4c2dc7150d139d8d1785b0" Apr 06 12:00:04 crc kubenswrapper[4790]: I0406 12:00:04.166868 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:04 crc kubenswrapper[4790]: I0406 12:00:04.172276 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:04 crc kubenswrapper[4790]: I0406 12:00:04.311533 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-42rgx" Apr 06 12:00:04 crc kubenswrapper[4790]: I0406 12:00:04.357472 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-42rgx" Apr 06 12:00:04 crc kubenswrapper[4790]: I0406 12:00:04.679895 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 12:00:04 crc kubenswrapper[4790]: I0406 12:00:04.679960 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 12:00:04 crc kubenswrapper[4790]: I0406 12:00:04.733887 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 12:00:05 crc kubenswrapper[4790]: I0406 12:00:05.176973 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hrvkd" event={"ID":"4029e155-0c45-49cb-a25b-ddb1f768a88f","Type":"ContainerStarted","Data":"a6c8ceddfbaa426ec9c67e8ef7d6b72472ddc6a8c71dc26bb1874dd8053a498d"} Apr 06 12:00:05 crc kubenswrapper[4790]: I0406 12:00:05.223343 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 12:00:06 crc kubenswrapper[4790]: I0406 12:00:06.438517 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hfrm"] Apr 06 12:00:06 crc kubenswrapper[4790]: I0406 12:00:06.439409 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2hfrm" podUID="c176f8e8-2902-4da6-b779-d0426b68e715" containerName="registry-server" containerID="cri-o://30936552da913e44060fd6e1ab97d3f0ed81a63d76b5d55cecd30d6d6569df4a" gracePeriod=2 Apr 06 12:00:07 crc kubenswrapper[4790]: I0406 12:00:07.196012 4790 generic.go:334] "Generic (PLEG): container finished" podID="c176f8e8-2902-4da6-b779-d0426b68e715" containerID="30936552da913e44060fd6e1ab97d3f0ed81a63d76b5d55cecd30d6d6569df4a" exitCode=0 Apr 06 12:00:07 crc kubenswrapper[4790]: I0406 12:00:07.196061 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hfrm" event={"ID":"c176f8e8-2902-4da6-b779-d0426b68e715","Type":"ContainerDied","Data":"30936552da913e44060fd6e1ab97d3f0ed81a63d76b5d55cecd30d6d6569df4a"} Apr 06 12:00:07 crc kubenswrapper[4790]: I0406 12:00:07.197950 4790 generic.go:334] "Generic (PLEG): container finished" podID="84f1a57a-9a68-4f5a-a0df-3c2bda96d02f" containerID="977375ef20f324452de6a467a015308bc2d8832c3550b837bf2b5b8afff55644" exitCode=0 Apr 06 12:00:07 crc 
kubenswrapper[4790]: I0406 12:00:07.197996 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" event={"ID":"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f","Type":"ContainerDied","Data":"977375ef20f324452de6a467a015308bc2d8832c3550b837bf2b5b8afff55644"} Apr 06 12:00:07 crc kubenswrapper[4790]: I0406 12:00:07.199596 4790 scope.go:117] "RemoveContainer" containerID="977375ef20f324452de6a467a015308bc2d8832c3550b837bf2b5b8afff55644" Apr 06 12:00:07 crc kubenswrapper[4790]: I0406 12:00:07.307938 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 12:00:07 crc kubenswrapper[4790]: I0406 12:00:07.418624 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c176f8e8-2902-4da6-b779-d0426b68e715-utilities\") pod \"c176f8e8-2902-4da6-b779-d0426b68e715\" (UID: \"c176f8e8-2902-4da6-b779-d0426b68e715\") " Apr 06 12:00:07 crc kubenswrapper[4790]: I0406 12:00:07.418731 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c176f8e8-2902-4da6-b779-d0426b68e715-catalog-content\") pod \"c176f8e8-2902-4da6-b779-d0426b68e715\" (UID: \"c176f8e8-2902-4da6-b779-d0426b68e715\") " Apr 06 12:00:07 crc kubenswrapper[4790]: I0406 12:00:07.418780 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn94d\" (UniqueName: \"kubernetes.io/projected/c176f8e8-2902-4da6-b779-d0426b68e715-kube-api-access-tn94d\") pod \"c176f8e8-2902-4da6-b779-d0426b68e715\" (UID: \"c176f8e8-2902-4da6-b779-d0426b68e715\") " Apr 06 12:00:07 crc kubenswrapper[4790]: I0406 12:00:07.420532 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c176f8e8-2902-4da6-b779-d0426b68e715-utilities" 
(OuterVolumeSpecName: "utilities") pod "c176f8e8-2902-4da6-b779-d0426b68e715" (UID: "c176f8e8-2902-4da6-b779-d0426b68e715"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:00:07 crc kubenswrapper[4790]: I0406 12:00:07.437086 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c176f8e8-2902-4da6-b779-d0426b68e715-kube-api-access-tn94d" (OuterVolumeSpecName: "kube-api-access-tn94d") pod "c176f8e8-2902-4da6-b779-d0426b68e715" (UID: "c176f8e8-2902-4da6-b779-d0426b68e715"). InnerVolumeSpecName "kube-api-access-tn94d". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:07 crc kubenswrapper[4790]: I0406 12:00:07.474790 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c176f8e8-2902-4da6-b779-d0426b68e715-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c176f8e8-2902-4da6-b779-d0426b68e715" (UID: "c176f8e8-2902-4da6-b779-d0426b68e715"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:00:07 crc kubenswrapper[4790]: I0406 12:00:07.519929 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c176f8e8-2902-4da6-b779-d0426b68e715-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:07 crc kubenswrapper[4790]: I0406 12:00:07.519970 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn94d\" (UniqueName: \"kubernetes.io/projected/c176f8e8-2902-4da6-b779-d0426b68e715-kube-api-access-tn94d\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:07 crc kubenswrapper[4790]: I0406 12:00:07.519985 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c176f8e8-2902-4da6-b779-d0426b68e715-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:08 crc kubenswrapper[4790]: I0406 12:00:08.208250 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hfrm" event={"ID":"c176f8e8-2902-4da6-b779-d0426b68e715","Type":"ContainerDied","Data":"e89d54719360f164ec11aa844316310401b8f2f9b3e0b2151a4d19ea448cc585"} Apr 06 12:00:08 crc kubenswrapper[4790]: I0406 12:00:08.208316 4790 scope.go:117] "RemoveContainer" containerID="30936552da913e44060fd6e1ab97d3f0ed81a63d76b5d55cecd30d6d6569df4a" Apr 06 12:00:08 crc kubenswrapper[4790]: I0406 12:00:08.208376 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hfrm" Apr 06 12:00:08 crc kubenswrapper[4790]: I0406 12:00:08.211182 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r5lgx" event={"ID":"84f1a57a-9a68-4f5a-a0df-3c2bda96d02f","Type":"ContainerStarted","Data":"bb599691f5f4ad4dc0d3760d041ec5f37073f4b23a4180ff12905e7182bd6a6b"} Apr 06 12:00:08 crc kubenswrapper[4790]: I0406 12:00:08.252478 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hfrm"] Apr 06 12:00:08 crc kubenswrapper[4790]: I0406 12:00:08.259944 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2hfrm"] Apr 06 12:00:08 crc kubenswrapper[4790]: I0406 12:00:08.291919 4790 scope.go:117] "RemoveContainer" containerID="efce8367f2ef1bb168d96d65d272f7151d97df980e3e8498c5ee23e49984effb" Apr 06 12:00:08 crc kubenswrapper[4790]: I0406 12:00:08.308282 4790 scope.go:117] "RemoveContainer" containerID="504ee680158d73e888a8ee328def1f97e4a6eae392fdbf5184f050b6c825c79c" Apr 06 12:00:08 crc kubenswrapper[4790]: I0406 12:00:08.643494 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-klccw"] Apr 06 12:00:08 crc kubenswrapper[4790]: I0406 12:00:08.643779 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-klccw" podUID="cc03077c-04a5-4562-b371-78270cf891ac" containerName="registry-server" containerID="cri-o://497044373d3098272dad636ef0a7854e07b3ab134103f31e771654fb93c955b8" gracePeriod=2 Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.144750 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.221709 4790 generic.go:334] "Generic (PLEG): container finished" podID="cc03077c-04a5-4562-b371-78270cf891ac" containerID="497044373d3098272dad636ef0a7854e07b3ab134103f31e771654fb93c955b8" exitCode=0 Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.221808 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klccw" event={"ID":"cc03077c-04a5-4562-b371-78270cf891ac","Type":"ContainerDied","Data":"497044373d3098272dad636ef0a7854e07b3ab134103f31e771654fb93c955b8"} Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.221875 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klccw" event={"ID":"cc03077c-04a5-4562-b371-78270cf891ac","Type":"ContainerDied","Data":"8ed513715a10badd4d02bee2e84210ec64bea6d4de5821b97ad4940fd7184d38"} Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.221874 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klccw" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.221903 4790 scope.go:117] "RemoveContainer" containerID="497044373d3098272dad636ef0a7854e07b3ab134103f31e771654fb93c955b8" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.243072 4790 scope.go:117] "RemoveContainer" containerID="69d9c282377cbd2ef313c7961ec338afc3c912e1b0d55f43a91a39544bd66e62" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.246386 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03077c-04a5-4562-b371-78270cf891ac-utilities\") pod \"cc03077c-04a5-4562-b371-78270cf891ac\" (UID: \"cc03077c-04a5-4562-b371-78270cf891ac\") " Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.246462 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03077c-04a5-4562-b371-78270cf891ac-catalog-content\") pod \"cc03077c-04a5-4562-b371-78270cf891ac\" (UID: \"cc03077c-04a5-4562-b371-78270cf891ac\") " Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.246499 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spzc9\" (UniqueName: \"kubernetes.io/projected/cc03077c-04a5-4562-b371-78270cf891ac-kube-api-access-spzc9\") pod \"cc03077c-04a5-4562-b371-78270cf891ac\" (UID: \"cc03077c-04a5-4562-b371-78270cf891ac\") " Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.248336 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc03077c-04a5-4562-b371-78270cf891ac-utilities" (OuterVolumeSpecName: "utilities") pod "cc03077c-04a5-4562-b371-78270cf891ac" (UID: "cc03077c-04a5-4562-b371-78270cf891ac"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.252102 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc03077c-04a5-4562-b371-78270cf891ac-kube-api-access-spzc9" (OuterVolumeSpecName: "kube-api-access-spzc9") pod "cc03077c-04a5-4562-b371-78270cf891ac" (UID: "cc03077c-04a5-4562-b371-78270cf891ac"). InnerVolumeSpecName "kube-api-access-spzc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.260403 4790 scope.go:117] "RemoveContainer" containerID="0a5cc8b0074fd6a7c1bc22e4f580bcdb07df2c4f9420be51fc323f06597cbee0" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.275683 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc03077c-04a5-4562-b371-78270cf891ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc03077c-04a5-4562-b371-78270cf891ac" (UID: "cc03077c-04a5-4562-b371-78270cf891ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.278675 4790 scope.go:117] "RemoveContainer" containerID="497044373d3098272dad636ef0a7854e07b3ab134103f31e771654fb93c955b8" Apr 06 12:00:09 crc kubenswrapper[4790]: E0406 12:00:09.279106 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"497044373d3098272dad636ef0a7854e07b3ab134103f31e771654fb93c955b8\": container with ID starting with 497044373d3098272dad636ef0a7854e07b3ab134103f31e771654fb93c955b8 not found: ID does not exist" containerID="497044373d3098272dad636ef0a7854e07b3ab134103f31e771654fb93c955b8" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.279142 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"497044373d3098272dad636ef0a7854e07b3ab134103f31e771654fb93c955b8"} err="failed to get container status \"497044373d3098272dad636ef0a7854e07b3ab134103f31e771654fb93c955b8\": rpc error: code = NotFound desc = could not find container \"497044373d3098272dad636ef0a7854e07b3ab134103f31e771654fb93c955b8\": container with ID starting with 497044373d3098272dad636ef0a7854e07b3ab134103f31e771654fb93c955b8 not found: ID does not exist" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.279165 4790 scope.go:117] "RemoveContainer" containerID="69d9c282377cbd2ef313c7961ec338afc3c912e1b0d55f43a91a39544bd66e62" Apr 06 12:00:09 crc kubenswrapper[4790]: E0406 12:00:09.279420 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d9c282377cbd2ef313c7961ec338afc3c912e1b0d55f43a91a39544bd66e62\": container with ID starting with 69d9c282377cbd2ef313c7961ec338afc3c912e1b0d55f43a91a39544bd66e62 not found: ID does not exist" containerID="69d9c282377cbd2ef313c7961ec338afc3c912e1b0d55f43a91a39544bd66e62" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.279449 
4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d9c282377cbd2ef313c7961ec338afc3c912e1b0d55f43a91a39544bd66e62"} err="failed to get container status \"69d9c282377cbd2ef313c7961ec338afc3c912e1b0d55f43a91a39544bd66e62\": rpc error: code = NotFound desc = could not find container \"69d9c282377cbd2ef313c7961ec338afc3c912e1b0d55f43a91a39544bd66e62\": container with ID starting with 69d9c282377cbd2ef313c7961ec338afc3c912e1b0d55f43a91a39544bd66e62 not found: ID does not exist" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.279465 4790 scope.go:117] "RemoveContainer" containerID="0a5cc8b0074fd6a7c1bc22e4f580bcdb07df2c4f9420be51fc323f06597cbee0" Apr 06 12:00:09 crc kubenswrapper[4790]: E0406 12:00:09.279727 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a5cc8b0074fd6a7c1bc22e4f580bcdb07df2c4f9420be51fc323f06597cbee0\": container with ID starting with 0a5cc8b0074fd6a7c1bc22e4f580bcdb07df2c4f9420be51fc323f06597cbee0 not found: ID does not exist" containerID="0a5cc8b0074fd6a7c1bc22e4f580bcdb07df2c4f9420be51fc323f06597cbee0" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.279752 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5cc8b0074fd6a7c1bc22e4f580bcdb07df2c4f9420be51fc323f06597cbee0"} err="failed to get container status \"0a5cc8b0074fd6a7c1bc22e4f580bcdb07df2c4f9420be51fc323f06597cbee0\": rpc error: code = NotFound desc = could not find container \"0a5cc8b0074fd6a7c1bc22e4f580bcdb07df2c4f9420be51fc323f06597cbee0\": container with ID starting with 0a5cc8b0074fd6a7c1bc22e4f580bcdb07df2c4f9420be51fc323f06597cbee0 not found: ID does not exist" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.348309 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03077c-04a5-4562-b371-78270cf891ac-utilities\") on node 
\"crc\" DevicePath \"\"" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.348349 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03077c-04a5-4562-b371-78270cf891ac-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.348373 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spzc9\" (UniqueName: \"kubernetes.io/projected/cc03077c-04a5-4562-b371-78270cf891ac-kube-api-access-spzc9\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.552277 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-klccw"] Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.560192 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-klccw"] Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.684449 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c176f8e8-2902-4da6-b779-d0426b68e715" path="/var/lib/kubelet/pods/c176f8e8-2902-4da6-b779-d0426b68e715/volumes" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.686730 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc03077c-04a5-4562-b371-78270cf891ac" path="/var/lib/kubelet/pods/cc03077c-04a5-4562-b371-78270cf891ac/volumes" Apr 06 12:00:09 crc kubenswrapper[4790]: I0406 12:00:09.726878 4790 ???:1] "http: TLS handshake error from 192.168.126.11:33636: no serving certificate available for the kubelet" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.246869 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-10-crc_5ed2aa0a-1e35-4436-b184-a680dcfd14df/installer/0.log" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.247314 4790 generic.go:334] "Generic (PLEG): container finished" podID="5ed2aa0a-1e35-4436-b184-a680dcfd14df" 
containerID="246286413f6b7571200bf664027f62be790ab64bf5259adbfe65f009630c0538" exitCode=1 Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.247345 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-10-crc" event={"ID":"5ed2aa0a-1e35-4436-b184-a680dcfd14df","Type":"ContainerDied","Data":"246286413f6b7571200bf664027f62be790ab64bf5259adbfe65f009630c0538"} Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.368425 4790 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.368730 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" containerID="cri-o://5ff3cd33d151386ab7f09a28ee0f37001f983d35e8389644ab342b12e21638f8" gracePeriod=30 Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.368899 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-recovery-controller" containerID="cri-o://43286ddd309fd46a3d6c104d273a7737af676a9fc12d58c5b2de11f9bad526be" gracePeriod=30 Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.368962 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-cert-syncer" containerID="cri-o://30056923772f422514c90950e54b7bcd14f25a556777a669012dbbb7dbcb5b6d" gracePeriod=30 Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.371830 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 06 12:00:12 crc kubenswrapper[4790]: E0406 12:00:12.372237 4790 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-recovery-controller" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372255 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-recovery-controller" Apr 06 12:00:12 crc kubenswrapper[4790]: E0406 12:00:12.372276 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c176f8e8-2902-4da6-b779-d0426b68e715" containerName="extract-utilities" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372284 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c176f8e8-2902-4da6-b779-d0426b68e715" containerName="extract-utilities" Apr 06 12:00:12 crc kubenswrapper[4790]: E0406 12:00:12.372297 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c176f8e8-2902-4da6-b779-d0426b68e715" containerName="registry-server" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372304 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c176f8e8-2902-4da6-b779-d0426b68e715" containerName="registry-server" Apr 06 12:00:12 crc kubenswrapper[4790]: E0406 12:00:12.372316 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc03077c-04a5-4562-b371-78270cf891ac" containerName="extract-content" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372323 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc03077c-04a5-4562-b371-78270cf891ac" containerName="extract-content" Apr 06 12:00:12 crc kubenswrapper[4790]: E0406 12:00:12.372333 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d509c077-8950-44b6-99fb-7d4ebf81f4da" containerName="collect-profiles" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372341 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d509c077-8950-44b6-99fb-7d4ebf81f4da" containerName="collect-profiles" Apr 06 12:00:12 crc kubenswrapper[4790]: E0406 
12:00:12.372352 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372360 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" Apr 06 12:00:12 crc kubenswrapper[4790]: E0406 12:00:12.372369 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="wait-for-host-port" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372377 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="wait-for-host-port" Apr 06 12:00:12 crc kubenswrapper[4790]: E0406 12:00:12.372386 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc03077c-04a5-4562-b371-78270cf891ac" containerName="registry-server" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372393 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc03077c-04a5-4562-b371-78270cf891ac" containerName="registry-server" Apr 06 12:00:12 crc kubenswrapper[4790]: E0406 12:00:12.372407 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c176f8e8-2902-4da6-b779-d0426b68e715" containerName="extract-content" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372415 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c176f8e8-2902-4da6-b779-d0426b68e715" containerName="extract-content" Apr 06 12:00:12 crc kubenswrapper[4790]: E0406 12:00:12.372427 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-cert-syncer" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372434 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-cert-syncer" Apr 06 12:00:12 crc kubenswrapper[4790]: E0406 
12:00:12.372447 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc03077c-04a5-4562-b371-78270cf891ac" containerName="extract-utilities" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372456 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc03077c-04a5-4562-b371-78270cf891ac" containerName="extract-utilities" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372581 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372592 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d509c077-8950-44b6-99fb-7d4ebf81f4da" containerName="collect-profiles" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372604 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc03077c-04a5-4562-b371-78270cf891ac" containerName="registry-server" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372620 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c176f8e8-2902-4da6-b779-d0426b68e715" containerName="registry-server" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372631 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-cert-syncer" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.372642 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler-recovery-controller" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.498851 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.498916 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.561940 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-10-crc_5ed2aa0a-1e35-4436-b184-a680dcfd14df/installer/0.log" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.562026 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-10-crc" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.583027 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" oldPodUID="3dcd261975c3d6b9a6ad6367fd4facd3" podUID="815516d0756bb9282f4d0a28cef72670" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.590359 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-crc_3dcd261975c3d6b9a6ad6367fd4facd3/kube-scheduler-cert-syncer/0.log" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.592905 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.597280 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" oldPodUID="3dcd261975c3d6b9a6ad6367fd4facd3" podUID="815516d0756bb9282f4d0a28cef72670" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.600274 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.600327 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.600446 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.600524 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.701595 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"3dcd261975c3d6b9a6ad6367fd4facd3\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.701652 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ed2aa0a-1e35-4436-b184-a680dcfd14df-kubelet-dir\") pod \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\" (UID: \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\") " Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.701668 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"3dcd261975c3d6b9a6ad6367fd4facd3\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.701697 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5ed2aa0a-1e35-4436-b184-a680dcfd14df-var-lock\") pod \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\" (UID: \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\") " Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.701733 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ed2aa0a-1e35-4436-b184-a680dcfd14df-kube-api-access\") pod \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\" (UID: \"5ed2aa0a-1e35-4436-b184-a680dcfd14df\") " Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.701760 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "3dcd261975c3d6b9a6ad6367fd4facd3" (UID: "3dcd261975c3d6b9a6ad6367fd4facd3"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.701867 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ed2aa0a-1e35-4436-b184-a680dcfd14df-var-lock" (OuterVolumeSpecName: "var-lock") pod "5ed2aa0a-1e35-4436-b184-a680dcfd14df" (UID: "5ed2aa0a-1e35-4436-b184-a680dcfd14df"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.701871 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ed2aa0a-1e35-4436-b184-a680dcfd14df-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5ed2aa0a-1e35-4436-b184-a680dcfd14df" (UID: "5ed2aa0a-1e35-4436-b184-a680dcfd14df"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.701940 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "3dcd261975c3d6b9a6ad6367fd4facd3" (UID: "3dcd261975c3d6b9a6ad6367fd4facd3"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.702628 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.702699 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ed2aa0a-1e35-4436-b184-a680dcfd14df-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.702723 4790 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.702745 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5ed2aa0a-1e35-4436-b184-a680dcfd14df-var-lock\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.708243 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed2aa0a-1e35-4436-b184-a680dcfd14df-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5ed2aa0a-1e35-4436-b184-a680dcfd14df" (UID: "5ed2aa0a-1e35-4436-b184-a680dcfd14df"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:12 crc kubenswrapper[4790]: I0406 12:00:12.804332 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ed2aa0a-1e35-4436-b184-a680dcfd14df-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.253628 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-crc_3dcd261975c3d6b9a6ad6367fd4facd3/kube-scheduler-cert-syncer/0.log" Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.254814 4790 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="43286ddd309fd46a3d6c104d273a7737af676a9fc12d58c5b2de11f9bad526be" exitCode=0 Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.254974 4790 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="30056923772f422514c90950e54b7bcd14f25a556777a669012dbbb7dbcb5b6d" exitCode=2 Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.255076 4790 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="5ff3cd33d151386ab7f09a28ee0f37001f983d35e8389644ab342b12e21638f8" exitCode=0 Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.254902 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.255214 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0625bba6ea54e575b2c6e00451476436b8108b21930185b2bbc4eff000d1a28e" Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.256658 4790 generic.go:334] "Generic (PLEG): container finished" podID="1d3bc18a-ca66-44f2-9667-48dd85b638fe" containerID="e9d3bbf46daddb764961894bf1bef4406a86ed3ae92c0b61a6505ec1cf2dbd96" exitCode=0 Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.256683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" event={"ID":"1d3bc18a-ca66-44f2-9667-48dd85b638fe","Type":"ContainerDied","Data":"e9d3bbf46daddb764961894bf1bef4406a86ed3ae92c0b61a6505ec1cf2dbd96"} Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.257290 4790 scope.go:117] "RemoveContainer" containerID="e9d3bbf46daddb764961894bf1bef4406a86ed3ae92c0b61a6505ec1cf2dbd96" Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.258621 4790 generic.go:334] "Generic (PLEG): container finished" podID="e458c724-3942-4d11-80fb-e42973fb2b28" containerID="132764e02cc0c9e76e96cf7b32af3d88039f53361c8d69592efaedf29a5b3eb3" exitCode=0 Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.258678 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-crc" event={"ID":"e458c724-3942-4d11-80fb-e42973fb2b28","Type":"ContainerDied","Data":"132764e02cc0c9e76e96cf7b32af3d88039f53361c8d69592efaedf29a5b3eb3"} Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.267513 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" oldPodUID="3dcd261975c3d6b9a6ad6367fd4facd3" podUID="815516d0756bb9282f4d0a28cef72670" Apr 06 12:00:13 crc kubenswrapper[4790]: 
I0406 12:00:13.269259 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-10-crc_5ed2aa0a-1e35-4436-b184-a680dcfd14df/installer/0.log" Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.269571 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-10-crc" event={"ID":"5ed2aa0a-1e35-4436-b184-a680dcfd14df","Type":"ContainerDied","Data":"0e47180f70daa4179ab73b3c5104c73853df89da894829db9df44a5a5198a67b"} Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.269696 4790 scope.go:117] "RemoveContainer" containerID="246286413f6b7571200bf664027f62be790ab64bf5259adbfe65f009630c0538" Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.270008 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-10-crc" Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.340058 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" oldPodUID="3dcd261975c3d6b9a6ad6367fd4facd3" podUID="815516d0756bb9282f4d0a28cef72670" Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.342786 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-10-crc"] Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.347473 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-10-crc"] Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.685139 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" path="/var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/volumes" Apr 06 12:00:13 crc kubenswrapper[4790]: I0406 12:00:13.686576 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed2aa0a-1e35-4436-b184-a680dcfd14df" 
path="/var/lib/kubelet/pods/5ed2aa0a-1e35-4436-b184-a680dcfd14df/volumes" Apr 06 12:00:14 crc kubenswrapper[4790]: I0406 12:00:14.268510 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj"] Apr 06 12:00:14 crc kubenswrapper[4790]: I0406 12:00:14.280620 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jqzrz" event={"ID":"1d3bc18a-ca66-44f2-9667-48dd85b638fe","Type":"ContainerStarted","Data":"c779e60c42f5257677c201f81a832eb7b8604111c4d1efb3315ca0df3cf77dcd"} Apr 06 12:00:14 crc kubenswrapper[4790]: I0406 12:00:14.675568 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-7-crc" Apr 06 12:00:14 crc kubenswrapper[4790]: I0406 12:00:14.836637 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e458c724-3942-4d11-80fb-e42973fb2b28-kubelet-dir\") pod \"e458c724-3942-4d11-80fb-e42973fb2b28\" (UID: \"e458c724-3942-4d11-80fb-e42973fb2b28\") " Apr 06 12:00:14 crc kubenswrapper[4790]: I0406 12:00:14.836694 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e458c724-3942-4d11-80fb-e42973fb2b28-var-lock\") pod \"e458c724-3942-4d11-80fb-e42973fb2b28\" (UID: \"e458c724-3942-4d11-80fb-e42973fb2b28\") " Apr 06 12:00:14 crc kubenswrapper[4790]: I0406 12:00:14.836729 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e458c724-3942-4d11-80fb-e42973fb2b28-kube-api-access\") pod \"e458c724-3942-4d11-80fb-e42973fb2b28\" (UID: \"e458c724-3942-4d11-80fb-e42973fb2b28\") " Apr 06 12:00:14 crc kubenswrapper[4790]: I0406 12:00:14.836750 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/e458c724-3942-4d11-80fb-e42973fb2b28-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e458c724-3942-4d11-80fb-e42973fb2b28" (UID: "e458c724-3942-4d11-80fb-e42973fb2b28"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:00:14 crc kubenswrapper[4790]: I0406 12:00:14.836806 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e458c724-3942-4d11-80fb-e42973fb2b28-var-lock" (OuterVolumeSpecName: "var-lock") pod "e458c724-3942-4d11-80fb-e42973fb2b28" (UID: "e458c724-3942-4d11-80fb-e42973fb2b28"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:00:14 crc kubenswrapper[4790]: I0406 12:00:14.844146 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e458c724-3942-4d11-80fb-e42973fb2b28-var-lock\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:14 crc kubenswrapper[4790]: I0406 12:00:14.844195 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e458c724-3942-4d11-80fb-e42973fb2b28-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:14 crc kubenswrapper[4790]: I0406 12:00:14.853262 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e458c724-3942-4d11-80fb-e42973fb2b28-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e458c724-3942-4d11-80fb-e42973fb2b28" (UID: "e458c724-3942-4d11-80fb-e42973fb2b28"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:14 crc kubenswrapper[4790]: I0406 12:00:14.945803 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e458c724-3942-4d11-80fb-e42973fb2b28-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:15 crc kubenswrapper[4790]: I0406 12:00:15.295223 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-crc" event={"ID":"e458c724-3942-4d11-80fb-e42973fb2b28","Type":"ContainerDied","Data":"77cf7bb560bc98265a0c61f9bad5660afd5e8aebbbaebdc73b0701bd43354835"} Apr 06 12:00:15 crc kubenswrapper[4790]: I0406 12:00:15.295271 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77cf7bb560bc98265a0c61f9bad5660afd5e8aebbbaebdc73b0701bd43354835" Apr 06 12:00:15 crc kubenswrapper[4790]: I0406 12:00:15.295340 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-7-crc" Apr 06 12:00:17 crc kubenswrapper[4790]: E0406 12:00:17.013550 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:c9407e328b1d01d4d7928cf2eadff86ef6a12d8963a31654b8470ed13ef4801a: Get \"https://registry.redhat.io/v2/openshift4/ose-cli/blobs/sha256:c9407e328b1d01d4d7928cf2eadff86ef6a12d8963a31654b8470ed13ef4801a\": context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Apr 06 12:00:17 crc kubenswrapper[4790]: E0406 12:00:17.013873 4790 kuberuntime_manager.go:1274] "Unhandled Error" err=< Apr 06 12:00:17 crc kubenswrapper[4790]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Apr 06 12:00:17 crc 
kubenswrapper[4790]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-npzzc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29591280-h6r65_openshift-infra(8abd742f-e504-47d0-ab97-5befd3609dd7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:c9407e328b1d01d4d7928cf2eadff86ef6a12d8963a31654b8470ed13ef4801a: Get "https://registry.redhat.io/v2/openshift4/ose-cli/blobs/sha256:c9407e328b1d01d4d7928cf2eadff86ef6a12d8963a31654b8470ed13ef4801a": context canceled Apr 06 12:00:17 crc kubenswrapper[4790]: > logger="UnhandledError" Apr 06 12:00:17 crc kubenswrapper[4790]: E0406 12:00:17.015109 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:c9407e328b1d01d4d7928cf2eadff86ef6a12d8963a31654b8470ed13ef4801a: Get \\\"https://registry.redhat.io/v2/openshift4/ose-cli/blobs/sha256:c9407e328b1d01d4d7928cf2eadff86ef6a12d8963a31654b8470ed13ef4801a\\\": context canceled\"" pod="openshift-infra/auto-csr-approver-29591280-h6r65" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" Apr 06 12:00:17 crc kubenswrapper[4790]: E0406 12:00:17.308735 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29591280-h6r65" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.047221 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cc46b5549-6nr5f"] Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.047462 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" podUID="964c6e60-51e3-487b-91e8-b3ecc21ccdd8" containerName="controller-manager" containerID="cri-o://d7369a0ea7c9fdde8eba69f5d42cb2cd66c949ce479a9e6cc89e71c9c592ee5c" gracePeriod=30 Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.131479 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"] Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.131682 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" podUID="bb925f43-3504-488a-a2be-a8957067b40c" containerName="route-controller-manager" containerID="cri-o://661e3b45ced01bc8f6b958d333b4e25397e2de1a8726e4eb6c2ecff796cf84f3" gracePeriod=30 Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.314119 4790 generic.go:334] "Generic (PLEG): container finished" podID="bb925f43-3504-488a-a2be-a8957067b40c" containerID="661e3b45ced01bc8f6b958d333b4e25397e2de1a8726e4eb6c2ecff796cf84f3" exitCode=0 Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.314182 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" 
event={"ID":"bb925f43-3504-488a-a2be-a8957067b40c","Type":"ContainerDied","Data":"661e3b45ced01bc8f6b958d333b4e25397e2de1a8726e4eb6c2ecff796cf84f3"} Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.315393 4790 generic.go:334] "Generic (PLEG): container finished" podID="964c6e60-51e3-487b-91e8-b3ecc21ccdd8" containerID="d7369a0ea7c9fdde8eba69f5d42cb2cd66c949ce479a9e6cc89e71c9c592ee5c" exitCode=0 Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.315417 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" event={"ID":"964c6e60-51e3-487b-91e8-b3ecc21ccdd8","Type":"ContainerDied","Data":"d7369a0ea7c9fdde8eba69f5d42cb2cd66c949ce479a9e6cc89e71c9c592ee5c"} Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.600363 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.672382 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.695740 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb925f43-3504-488a-a2be-a8957067b40c-client-ca\") pod \"bb925f43-3504-488a-a2be-a8957067b40c\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.695790 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb925f43-3504-488a-a2be-a8957067b40c-serving-cert\") pod \"bb925f43-3504-488a-a2be-a8957067b40c\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.695881 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxrsz\" (UniqueName: \"kubernetes.io/projected/bb925f43-3504-488a-a2be-a8957067b40c-kube-api-access-xxrsz\") pod \"bb925f43-3504-488a-a2be-a8957067b40c\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.695924 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb925f43-3504-488a-a2be-a8957067b40c-config\") pod \"bb925f43-3504-488a-a2be-a8957067b40c\" (UID: \"bb925f43-3504-488a-a2be-a8957067b40c\") " Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.696697 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb925f43-3504-488a-a2be-a8957067b40c-client-ca" (OuterVolumeSpecName: "client-ca") pod "bb925f43-3504-488a-a2be-a8957067b40c" (UID: "bb925f43-3504-488a-a2be-a8957067b40c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.696712 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb925f43-3504-488a-a2be-a8957067b40c-config" (OuterVolumeSpecName: "config") pod "bb925f43-3504-488a-a2be-a8957067b40c" (UID: "bb925f43-3504-488a-a2be-a8957067b40c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.700821 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb925f43-3504-488a-a2be-a8957067b40c-kube-api-access-xxrsz" (OuterVolumeSpecName: "kube-api-access-xxrsz") pod "bb925f43-3504-488a-a2be-a8957067b40c" (UID: "bb925f43-3504-488a-a2be-a8957067b40c"). InnerVolumeSpecName "kube-api-access-xxrsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.701639 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb925f43-3504-488a-a2be-a8957067b40c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bb925f43-3504-488a-a2be-a8957067b40c" (UID: "bb925f43-3504-488a-a2be-a8957067b40c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.797033 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-client-ca\") pod \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.797090 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-serving-cert\") pod \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.797121 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-224kk\" (UniqueName: \"kubernetes.io/projected/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-kube-api-access-224kk\") pod \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.797158 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-config\") pod \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.797175 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-proxy-ca-bundles\") pod \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\" (UID: \"964c6e60-51e3-487b-91e8-b3ecc21ccdd8\") " Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.797435 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxrsz\" (UniqueName: 
\"kubernetes.io/projected/bb925f43-3504-488a-a2be-a8957067b40c-kube-api-access-xxrsz\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.797446 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb925f43-3504-488a-a2be-a8957067b40c-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.797455 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb925f43-3504-488a-a2be-a8957067b40c-client-ca\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.797465 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb925f43-3504-488a-a2be-a8957067b40c-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.798309 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "964c6e60-51e3-487b-91e8-b3ecc21ccdd8" (UID: "964c6e60-51e3-487b-91e8-b3ecc21ccdd8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.798373 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-config" (OuterVolumeSpecName: "config") pod "964c6e60-51e3-487b-91e8-b3ecc21ccdd8" (UID: "964c6e60-51e3-487b-91e8-b3ecc21ccdd8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.798580 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-client-ca" (OuterVolumeSpecName: "client-ca") pod "964c6e60-51e3-487b-91e8-b3ecc21ccdd8" (UID: "964c6e60-51e3-487b-91e8-b3ecc21ccdd8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.800432 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "964c6e60-51e3-487b-91e8-b3ecc21ccdd8" (UID: "964c6e60-51e3-487b-91e8-b3ecc21ccdd8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.801076 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-kube-api-access-224kk" (OuterVolumeSpecName: "kube-api-access-224kk") pod "964c6e60-51e3-487b-91e8-b3ecc21ccdd8" (UID: "964c6e60-51e3-487b-91e8-b3ecc21ccdd8"). InnerVolumeSpecName "kube-api-access-224kk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.899300 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-client-ca\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.899334 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.899344 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-224kk\" (UniqueName: \"kubernetes.io/projected/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-kube-api-access-224kk\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.899355 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:18 crc kubenswrapper[4790]: I0406 12:00:18.899364 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/964c6e60-51e3-487b-91e8-b3ecc21ccdd8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:19 crc kubenswrapper[4790]: I0406 12:00:19.332895 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" event={"ID":"964c6e60-51e3-487b-91e8-b3ecc21ccdd8","Type":"ContainerDied","Data":"9dc3d9446a40ba535293d7ef79aab50b7f568ba66481ead7fae0537436f7c5d8"} Apr 06 12:00:19 crc kubenswrapper[4790]: I0406 12:00:19.332949 4790 scope.go:117] "RemoveContainer" containerID="d7369a0ea7c9fdde8eba69f5d42cb2cd66c949ce479a9e6cc89e71c9c592ee5c" Apr 06 12:00:19 crc kubenswrapper[4790]: I0406 12:00:19.333058 4790 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc46b5549-6nr5f" Apr 06 12:00:19 crc kubenswrapper[4790]: I0406 12:00:19.341116 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" event={"ID":"bb925f43-3504-488a-a2be-a8957067b40c","Type":"ContainerDied","Data":"0e7848b3b04da3ec6ed84e791c044cab1046f9e6750a7ae35dee11e6fcf4d688"} Apr 06 12:00:19 crc kubenswrapper[4790]: I0406 12:00:19.341181 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc" Apr 06 12:00:19 crc kubenswrapper[4790]: I0406 12:00:19.365357 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cc46b5549-6nr5f"] Apr 06 12:00:19 crc kubenswrapper[4790]: I0406 12:00:19.369637 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6cc46b5549-6nr5f"] Apr 06 12:00:19 crc kubenswrapper[4790]: I0406 12:00:19.374255 4790 scope.go:117] "RemoveContainer" containerID="661e3b45ced01bc8f6b958d333b4e25397e2de1a8726e4eb6c2ecff796cf84f3" Apr 06 12:00:19 crc kubenswrapper[4790]: I0406 12:00:19.383536 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"] Apr 06 12:00:19 crc kubenswrapper[4790]: I0406 12:00:19.386358 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59fc5646d-7nzmc"] Apr 06 12:00:19 crc kubenswrapper[4790]: I0406 12:00:19.684800 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="964c6e60-51e3-487b-91e8-b3ecc21ccdd8" path="/var/lib/kubelet/pods/964c6e60-51e3-487b-91e8-b3ecc21ccdd8/volumes" Apr 06 12:00:19 crc kubenswrapper[4790]: I0406 12:00:19.685427 4790 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="bb925f43-3504-488a-a2be-a8957067b40c" path="/var/lib/kubelet/pods/bb925f43-3504-488a-a2be-a8957067b40c/volumes" Apr 06 12:00:21 crc kubenswrapper[4790]: I0406 12:00:21.974680 4790 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-prb62 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Apr 06 12:00:21 crc kubenswrapper[4790]: I0406 12:00:21.975163 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Apr 06 12:00:21 crc kubenswrapper[4790]: I0406 12:00:21.974756 4790 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-prb62 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Apr 06 12:00:21 crc kubenswrapper[4790]: I0406 12:00:21.975315 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Apr 06 12:00:22 crc kubenswrapper[4790]: I0406 12:00:22.368592 4790 generic.go:334] "Generic (PLEG): container finished" podID="0360c312-3ecf-42c9-9af9-470c231eefbd" containerID="9677322186c8c5cb15b094b9ba08a52db11ff46b1b90896f9e5ae01e60443c62" exitCode=0 Apr 06 12:00:22 crc 
kubenswrapper[4790]: I0406 12:00:22.368673 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" event={"ID":"0360c312-3ecf-42c9-9af9-470c231eefbd","Type":"ContainerDied","Data":"9677322186c8c5cb15b094b9ba08a52db11ff46b1b90896f9e5ae01e60443c62"} Apr 06 12:00:22 crc kubenswrapper[4790]: I0406 12:00:22.369718 4790 scope.go:117] "RemoveContainer" containerID="9677322186c8c5cb15b094b9ba08a52db11ff46b1b90896f9e5ae01e60443c62" Apr 06 12:00:23 crc kubenswrapper[4790]: I0406 12:00:23.377876 4790 generic.go:334] "Generic (PLEG): container finished" podID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" containerID="1a02312babffbb9df18a9b40a384fba8d3e07f09e1e96ba9026d32410ce19dc8" exitCode=0 Apr 06 12:00:23 crc kubenswrapper[4790]: I0406 12:00:23.377945 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" event={"ID":"bfec50d0-fe3d-45f4-afdb-cba2fc415878","Type":"ContainerDied","Data":"1a02312babffbb9df18a9b40a384fba8d3e07f09e1e96ba9026d32410ce19dc8"} Apr 06 12:00:23 crc kubenswrapper[4790]: I0406 12:00:23.378652 4790 scope.go:117] "RemoveContainer" containerID="1a02312babffbb9df18a9b40a384fba8d3e07f09e1e96ba9026d32410ce19dc8" Apr 06 12:00:23 crc kubenswrapper[4790]: I0406 12:00:23.383212 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" event={"ID":"0360c312-3ecf-42c9-9af9-470c231eefbd","Type":"ContainerStarted","Data":"979c219abba86d73464d91e900013b1b8abbce3c6236c2ca2cd19ca2a28b7a46"} Apr 06 12:00:23 crc kubenswrapper[4790]: I0406 12:00:23.383506 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" Apr 06 12:00:23 crc kubenswrapper[4790]: E0406 12:00:23.563186 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial 
failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7b20da9_8cdd_4614_9c0a_9db7287856cd.slice/crio-conmon-1725ced72ea689dcc32f7f6df6cd63fd6ee20d43ab503dbe4f7f9a3e4ca02325.scope\": RecentStats: unable to find data in memory cache]" Apr 06 12:00:23 crc kubenswrapper[4790]: I0406 12:00:23.675390 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:00:23 crc kubenswrapper[4790]: I0406 12:00:23.687825 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="8e76c218-49ac-4a20-b6bb-11e8e95d20cb" Apr 06 12:00:23 crc kubenswrapper[4790]: I0406 12:00:23.687894 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="8e76c218-49ac-4a20-b6bb-11e8e95d20cb" Apr 06 12:00:23 crc kubenswrapper[4790]: I0406 12:00:23.695522 4790 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:00:23 crc kubenswrapper[4790]: I0406 12:00:23.705741 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 06 12:00:23 crc kubenswrapper[4790]: I0406 12:00:23.707525 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:00:23 crc kubenswrapper[4790]: I0406 12:00:23.709977 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 06 12:00:23 crc kubenswrapper[4790]: I0406 12:00:23.712191 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.279233 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.279699 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb925f43-3504-488a-a2be-a8957067b40c" containerName="route-controller-manager" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.279711 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb925f43-3504-488a-a2be-a8957067b40c" containerName="route-controller-manager" Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.279726 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e458c724-3942-4d11-80fb-e42973fb2b28" containerName="installer" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.279733 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e458c724-3942-4d11-80fb-e42973fb2b28" containerName="installer" Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.279746 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964c6e60-51e3-487b-91e8-b3ecc21ccdd8" containerName="controller-manager" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.279753 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="964c6e60-51e3-487b-91e8-b3ecc21ccdd8" containerName="controller-manager" Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.279766 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed2aa0a-1e35-4436-b184-a680dcfd14df" 
containerName="installer" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.279771 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed2aa0a-1e35-4436-b184-a680dcfd14df" containerName="installer" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.279881 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e458c724-3942-4d11-80fb-e42973fb2b28" containerName="installer" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.279894 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed2aa0a-1e35-4436-b184-a680dcfd14df" containerName="installer" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.279904 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb925f43-3504-488a-a2be-a8957067b40c" containerName="route-controller-manager" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.279911 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="964c6e60-51e3-487b-91e8-b3ecc21ccdd8" containerName="controller-manager" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.280219 4790 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.280446 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://24bc576f9fb36d6605babe769455de5838c74aab14a28cc391211e7ad6432e31" gracePeriod=15 Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.280486 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.280600 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c0daceb37a2d566e46d5b328928d7d33deb2ab21a100e5c8772652ac033af556" gracePeriod=15 Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.280600 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://e9aad00153e278b6f37e60029f6be3d297b5319c288bb6ebf91d3037d8f15382" gracePeriod=15 Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.280884 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://9ae6bd024abaa7c07bcbe388da96282ad0bde13566906b9a230225cd06623547" gracePeriod=15 Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.280922 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://cdf7a3d1c5589a9e64ef1fd65cf6ed2acb8b0516211493e3b3f165baeefbb34b" gracePeriod=15 Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.285501 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.285802 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.285816 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.285839 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.285846 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.285855 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.285861 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.285871 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.285877 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.285891 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.285897 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.285906 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.285913 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.285922 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.285927 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.285934 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.285940 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.285950 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.285955 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.286057 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.286067 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.286076 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.286082 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.286090 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.286099 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.286105 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.286111 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.286121 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.286214 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.286221 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.326906 4790 kubelet.go:1929] "Failed creating a mirror pod for" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.381779 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.384261 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.384439 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.384548 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.384666 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.384785 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.384927 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.385128 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.392457 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" event={"ID":"bfec50d0-fe3d-45f4-afdb-cba2fc415878","Type":"ContainerStarted","Data":"74948ad3441a5d945d4797a95f5941e7eee9011fd6d3dc7a106ba64e9becb3ff"} Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.394120 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.394581 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.395647 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"815516d0756bb9282f4d0a28cef72670","Type":"ContainerStarted","Data":"504c30baf03eb53a493e674554abcc5229db2f06840a0877d379361429a5cde2"} Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.395678 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"815516d0756bb9282f4d0a28cef72670","Type":"ContainerStarted","Data":"7d9079c674d04974dfecf9717c58fab50170503deb7b33c8e7d8e10d80c10e38"} Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.395746 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.396043 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.396526 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.399576 4790 generic.go:334] "Generic (PLEG): container finished" podID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" containerID="1725ced72ea689dcc32f7f6df6cd63fd6ee20d43ab503dbe4f7f9a3e4ca02325" exitCode=0 Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.399872 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" event={"ID":"f7b20da9-8cdd-4614-9c0a-9db7287856cd","Type":"ContainerDied","Data":"1725ced72ea689dcc32f7f6df6cd63fd6ee20d43ab503dbe4f7f9a3e4ca02325"} Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.400485 4790 scope.go:117] "RemoveContainer" containerID="1725ced72ea689dcc32f7f6df6cd63fd6ee20d43ab503dbe4f7f9a3e4ca02325" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.400666 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.401617 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.402372 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.402926 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.405018 4790 generic.go:334] "Generic (PLEG): container finished" podID="3851471a-4968-40c6-9be1-9ba072ddf741" containerID="3d7b8c4ff25646700d78ac43f3a8ddeb965e16df9c7ff06db3c54b2afc53b930" exitCode=0 Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.405245 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" event={"ID":"3851471a-4968-40c6-9be1-9ba072ddf741","Type":"ContainerDied","Data":"3d7b8c4ff25646700d78ac43f3a8ddeb965e16df9c7ff06db3c54b2afc53b930"} Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.405950 4790 scope.go:117] "RemoveContainer" containerID="3d7b8c4ff25646700d78ac43f3a8ddeb965e16df9c7ff06db3c54b2afc53b930" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.406426 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.409948 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.411069 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.411438 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.414979 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection 
refused" Apr 06 12:00:24 crc kubenswrapper[4790]: E0406 12:00:24.421390 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/events/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq.18a3c2af693c68aa\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{openshift-kube-scheduler-operator-5fdd9b5758-s2cfq.18a3c2af693c68aa openshift-kube-scheduler-operator 27568 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler-operator,Name:openshift-kube-scheduler-operator-5fdd9b5758-s2cfq,UID:f7b20da9-8cdd-4614-9c0a-9db7287856cd,APIVersion:v1,ResourceVersion:27169,FieldPath:spec.containers{kube-scheduler-operator-container},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:58:56 +0000 UTC,LastTimestamp:2026-04-06 12:00:24.419609478 +0000 UTC m=+203.407352354,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.486844 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.486948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.487063 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.487162 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.487196 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.487272 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.487315 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.487364 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.488241 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.488479 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.488526 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.488895 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:24 crc 
kubenswrapper[4790]: I0406 12:00:24.489189 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.489217 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.489456 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.489500 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: I0406 12:00:24.628284 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:24 crc kubenswrapper[4790]: W0406 12:00:24.650347 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-5a1d46b1630e1d3c42e7d916eaef1b3aac668cdb9679d4f6da7ec0b6fb76a17d WatchSource:0}: Error finding container 5a1d46b1630e1d3c42e7d916eaef1b3aac668cdb9679d4f6da7ec0b6fb76a17d: Status 404 returned error can't find the container with id 5a1d46b1630e1d3c42e7d916eaef1b3aac668cdb9679d4f6da7ec0b6fb76a17d Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.412855 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" event={"ID":"f7b20da9-8cdd-4614-9c0a-9db7287856cd","Type":"ContainerStarted","Data":"0c759347231b9d0d3fd03bdcbfcdf3ef8db30f41b34a2244d09ab10d2042cc5b"} Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.413891 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.414144 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.414332 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.414578 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.414714 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" event={"ID":"3851471a-4968-40c6-9be1-9ba072ddf741","Type":"ContainerStarted","Data":"2b7a8ea9b6c5a4a9fba55035c4ec5353a49f058524d883e7a74b5926654d46ed"} Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.414937 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.415244 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.415467 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.415669 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.415893 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.416116 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.416437 4790 generic.go:334] "Generic (PLEG): container finished" podID="804d6679-7288-40ec-853e-345cf118c657" containerID="b1ee3c31cfe2be7da725700eb5fb3726425e8a36157526c1852526ee9aa3e340" exitCode=0 Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.416486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"804d6679-7288-40ec-853e-345cf118c657","Type":"ContainerDied","Data":"b1ee3c31cfe2be7da725700eb5fb3726425e8a36157526c1852526ee9aa3e340"} Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.416755 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.416966 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.417172 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.417388 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.417625 4790 
status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.417796 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.417953 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3b08142b4e69f4f0b6d28980fcaeb0481f5a5edbb98efd67a056ceeaa56cc824"} Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.417986 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5a1d46b1630e1d3c42e7d916eaef1b3aac668cdb9679d4f6da7ec0b6fb76a17d"} Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.418288 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: E0406 12:00:25.418412 4790 kubelet.go:1929] "Failed creating a mirror pod for" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.418518 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.418780 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.418990 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.419196 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.419402 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.419935 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.421274 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.421938 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e9aad00153e278b6f37e60029f6be3d297b5319c288bb6ebf91d3037d8f15382" exitCode=0 Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.421957 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cdf7a3d1c5589a9e64ef1fd65cf6ed2acb8b0516211493e3b3f165baeefbb34b" exitCode=0 Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.421968 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c0daceb37a2d566e46d5b328928d7d33deb2ab21a100e5c8772652ac033af556" exitCode=0 Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.421979 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9ae6bd024abaa7c07bcbe388da96282ad0bde13566906b9a230225cd06623547" exitCode=2 Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.422104 4790 scope.go:117] "RemoveContainer" containerID="e6224db2cc7254075caa631d8da7467f0dcf0483a05a6236d99fa0b07232cd4b" Apr 06 12:00:25 crc 
kubenswrapper[4790]: I0406 12:00:25.446114 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-965d55b94-4rql6" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" containerName="registry" containerID="cri-o://d499341a229d56bd42e554a1261dc97a0d5a86619d930a4afb08a86000e5cd5f" gracePeriod=30 Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.882702 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-965d55b94-4rql6" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.883650 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.883818 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.884142 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.884729 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" 
pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.885349 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:25 crc kubenswrapper[4790]: I0406 12:00:25.885645 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.005161 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1aed6119-fbb5-4557-94ee-7c3e86fc3002-registry-certificates\") pod \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.005201 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmvtd\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-kube-api-access-jmvtd\") pod \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.005237 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/1aed6119-fbb5-4557-94ee-7c3e86fc3002-trusted-ca\") pod \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.005314 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1aed6119-fbb5-4557-94ee-7c3e86fc3002-ca-trust-extracted\") pod \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.005331 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-bound-sa-token\") pod \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.005382 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-registry-tls\") pod \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.005485 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.005504 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1aed6119-fbb5-4557-94ee-7c3e86fc3002-installation-pull-secrets\") pod \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\" (UID: \"1aed6119-fbb5-4557-94ee-7c3e86fc3002\") " Apr 
06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.006163 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aed6119-fbb5-4557-94ee-7c3e86fc3002-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1aed6119-fbb5-4557-94ee-7c3e86fc3002" (UID: "1aed6119-fbb5-4557-94ee-7c3e86fc3002"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.006183 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aed6119-fbb5-4557-94ee-7c3e86fc3002-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1aed6119-fbb5-4557-94ee-7c3e86fc3002" (UID: "1aed6119-fbb5-4557-94ee-7c3e86fc3002"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.011295 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-kube-api-access-jmvtd" (OuterVolumeSpecName: "kube-api-access-jmvtd") pod "1aed6119-fbb5-4557-94ee-7c3e86fc3002" (UID: "1aed6119-fbb5-4557-94ee-7c3e86fc3002"). InnerVolumeSpecName "kube-api-access-jmvtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.013116 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aed6119-fbb5-4557-94ee-7c3e86fc3002-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1aed6119-fbb5-4557-94ee-7c3e86fc3002" (UID: "1aed6119-fbb5-4557-94ee-7c3e86fc3002"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.014919 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1aed6119-fbb5-4557-94ee-7c3e86fc3002" (UID: "1aed6119-fbb5-4557-94ee-7c3e86fc3002"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.014985 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1aed6119-fbb5-4557-94ee-7c3e86fc3002" (UID: "1aed6119-fbb5-4557-94ee-7c3e86fc3002"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.020534 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "1aed6119-fbb5-4557-94ee-7c3e86fc3002" (UID: "1aed6119-fbb5-4557-94ee-7c3e86fc3002"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.027502 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aed6119-fbb5-4557-94ee-7c3e86fc3002-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1aed6119-fbb5-4557-94ee-7c3e86fc3002" (UID: "1aed6119-fbb5-4557-94ee-7c3e86fc3002"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.107402 4790 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1aed6119-fbb5-4557-94ee-7c3e86fc3002-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.107443 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-bound-sa-token\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.107457 4790 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-registry-tls\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.107470 4790 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1aed6119-fbb5-4557-94ee-7c3e86fc3002-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.107485 4790 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1aed6119-fbb5-4557-94ee-7c3e86fc3002-registry-certificates\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.107496 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmvtd\" (UniqueName: \"kubernetes.io/projected/1aed6119-fbb5-4557-94ee-7c3e86fc3002-kube-api-access-jmvtd\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.107506 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1aed6119-fbb5-4557-94ee-7c3e86fc3002-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:26 crc 
kubenswrapper[4790]: I0406 12:00:26.431934 4790 generic.go:334] "Generic (PLEG): container finished" podID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" containerID="d499341a229d56bd42e554a1261dc97a0d5a86619d930a4afb08a86000e5cd5f" exitCode=0 Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.432012 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-965d55b94-4rql6" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.432000 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-965d55b94-4rql6" event={"ID":"1aed6119-fbb5-4557-94ee-7c3e86fc3002","Type":"ContainerDied","Data":"d499341a229d56bd42e554a1261dc97a0d5a86619d930a4afb08a86000e5cd5f"} Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.432671 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-965d55b94-4rql6" event={"ID":"1aed6119-fbb5-4557-94ee-7c3e86fc3002","Type":"ContainerDied","Data":"a8fcb01c12d9b7c8b5fb5f5259c1b25f9b6afdd1d9349e0d5df4465bc99b37d3"} Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.432709 4790 scope.go:117] "RemoveContainer" containerID="d499341a229d56bd42e554a1261dc97a0d5a86619d930a4afb08a86000e5cd5f" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.433182 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.433344 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.433488 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.433649 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.433902 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.434310 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.436919 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.440688 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-vh9v9_f0566848-e645-46bd-8e5e-fddcde1248ba/console-operator/0.log" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.440748 4790 generic.go:334] "Generic (PLEG): container finished" podID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerID="9bed722ea06b6b94cd6e52b7cc3fad9cba84bbca1ac99e9be694c0475736ca82" exitCode=1 Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.441038 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" event={"ID":"f0566848-e645-46bd-8e5e-fddcde1248ba","Type":"ContainerDied","Data":"9bed722ea06b6b94cd6e52b7cc3fad9cba84bbca1ac99e9be694c0475736ca82"} Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.441728 4790 scope.go:117] "RemoveContainer" containerID="9bed722ea06b6b94cd6e52b7cc3fad9cba84bbca1ac99e9be694c0475736ca82" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.444462 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.444877 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc 
kubenswrapper[4790]: I0406 12:00:26.445946 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.446305 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.446714 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.447582 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.447996 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.462940 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.463119 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.463254 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.463404 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.463538 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.463672 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.463815 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.482360 4790 scope.go:117] "RemoveContainer" containerID="d499341a229d56bd42e554a1261dc97a0d5a86619d930a4afb08a86000e5cd5f" Apr 06 12:00:26 crc kubenswrapper[4790]: E0406 12:00:26.491286 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d499341a229d56bd42e554a1261dc97a0d5a86619d930a4afb08a86000e5cd5f\": container with ID starting with d499341a229d56bd42e554a1261dc97a0d5a86619d930a4afb08a86000e5cd5f not found: ID does not exist" containerID="d499341a229d56bd42e554a1261dc97a0d5a86619d930a4afb08a86000e5cd5f" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.491326 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d499341a229d56bd42e554a1261dc97a0d5a86619d930a4afb08a86000e5cd5f"} err="failed to get container status \"d499341a229d56bd42e554a1261dc97a0d5a86619d930a4afb08a86000e5cd5f\": rpc error: code = NotFound desc = could not find container \"d499341a229d56bd42e554a1261dc97a0d5a86619d930a4afb08a86000e5cd5f\": container with ID starting with d499341a229d56bd42e554a1261dc97a0d5a86619d930a4afb08a86000e5cd5f not found: ID does not exist" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.632128 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.633250 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.633731 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.633972 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.634172 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.634372 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.634606 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.634772 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.634985 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.635150 4790 status_manager.go:851] "Failed 
to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.655963 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.656814 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.657098 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.657292 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.657479 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.657653 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.657844 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.658018 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.658198 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.717342 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.717413 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.717423 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/804d6679-7288-40ec-853e-345cf118c657-kubelet-dir\") pod \"804d6679-7288-40ec-853e-345cf118c657\" (UID: \"804d6679-7288-40ec-853e-345cf118c657\") " Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.717452 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/804d6679-7288-40ec-853e-345cf118c657-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "804d6679-7288-40ec-853e-345cf118c657" (UID: "804d6679-7288-40ec-853e-345cf118c657"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.717496 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/804d6679-7288-40ec-853e-345cf118c657-var-lock\") pod \"804d6679-7288-40ec-853e-345cf118c657\" (UID: \"804d6679-7288-40ec-853e-345cf118c657\") " Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.717516 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.717540 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/804d6679-7288-40ec-853e-345cf118c657-kube-api-access\") pod \"804d6679-7288-40ec-853e-345cf118c657\" (UID: \"804d6679-7288-40ec-853e-345cf118c657\") " Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.717591 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/804d6679-7288-40ec-853e-345cf118c657-var-lock" (OuterVolumeSpecName: "var-lock") pod "804d6679-7288-40ec-853e-345cf118c657" (UID: "804d6679-7288-40ec-853e-345cf118c657"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.717606 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.717707 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.718281 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.719054 4790 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.719467 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.719485 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/804d6679-7288-40ec-853e-345cf118c657-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.719495 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/804d6679-7288-40ec-853e-345cf118c657-var-lock\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.719504 4790 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.724075 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804d6679-7288-40ec-853e-345cf118c657-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "804d6679-7288-40ec-853e-345cf118c657" (UID: "804d6679-7288-40ec-853e-345cf118c657"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:26 crc kubenswrapper[4790]: I0406 12:00:26.821033 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/804d6679-7288-40ec-853e-345cf118c657-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.063537 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-dkg9c" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" containerName="console" containerID="cri-o://11b5938ae906f39a81f798060020ac636b9cb07526c4328d7f11f2eff7143e30" gracePeriod=15 Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.452298 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"804d6679-7288-40ec-853e-345cf118c657","Type":"ContainerDied","Data":"3a362273248972e6cd4b92c064f0186c29146fe85b5e664f4fa3a63fbaa77988"} Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.452593 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a362273248972e6cd4b92c064f0186c29146fe85b5e664f4fa3a63fbaa77988" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.452664 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.457016 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dkg9c_18df9b4d-88b9-46d0-adf8-90072301374e/console/0.log" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.457131 4790 generic.go:334] "Generic (PLEG): container finished" podID="18df9b4d-88b9-46d0-adf8-90072301374e" containerID="11b5938ae906f39a81f798060020ac636b9cb07526c4328d7f11f2eff7143e30" exitCode=2 Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.457197 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dkg9c" event={"ID":"18df9b4d-88b9-46d0-adf8-90072301374e","Type":"ContainerDied","Data":"11b5938ae906f39a81f798060020ac636b9cb07526c4328d7f11f2eff7143e30"} Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.462674 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.463671 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="24bc576f9fb36d6605babe769455de5838c74aab14a28cc391211e7ad6432e31" exitCode=0 Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.463758 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.463771 4790 scope.go:117] "RemoveContainer" containerID="e9aad00153e278b6f37e60029f6be3d297b5319c288bb6ebf91d3037d8f15382" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.466217 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-vh9v9_f0566848-e645-46bd-8e5e-fddcde1248ba/console-operator/0.log" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.466309 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" event={"ID":"f0566848-e645-46bd-8e5e-fddcde1248ba","Type":"ContainerStarted","Data":"440b5692b9ce8088053c067135ef67be6e8abc4cda3368ab4255830624b0698d"} Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.466692 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.467086 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.467615 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.468003 4790 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.468230 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.469120 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.470305 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.470575 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.470912 4790 
status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.471501 4790 generic.go:334] "Generic (PLEG): container finished" podID="815516d0756bb9282f4d0a28cef72670" containerID="504c30baf03eb53a493e674554abcc5229db2f06840a0877d379361429a5cde2" exitCode=0 Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.471577 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"815516d0756bb9282f4d0a28cef72670","Type":"ContainerDied","Data":"504c30baf03eb53a493e674554abcc5229db2f06840a0877d379361429a5cde2"} Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.472612 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.474037 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.474510 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.474578 4790 generic.go:334] "Generic (PLEG): container finished" podID="743439e2-53c8-4cda-b960-2448c1fb2941" containerID="592ebd557120949c2f5a37455001497e7b579a32a7bfcce6df3f88469210b4c7" exitCode=0 Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.474603 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" event={"ID":"743439e2-53c8-4cda-b960-2448c1fb2941","Type":"ContainerDied","Data":"592ebd557120949c2f5a37455001497e7b579a32a7bfcce6df3f88469210b4c7"} Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.475248 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.475584 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.475711 4790 scope.go:117] "RemoveContainer" containerID="592ebd557120949c2f5a37455001497e7b579a32a7bfcce6df3f88469210b4c7" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.475873 4790 status_manager.go:851] "Failed to get status 
for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.476246 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.476901 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.478566 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.479371 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.479646 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.480024 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.482056 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.483225 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.483559 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.484697 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.485543 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.541443 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dkg9c_18df9b4d-88b9-46d0-adf8-90072301374e/console/0.log" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.541510 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.542039 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.542428 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.542891 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.543121 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.543312 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.543565 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.543815 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.544052 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.544251 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.544492 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.547056 4790 scope.go:117] "RemoveContainer" containerID="cdf7a3d1c5589a9e64ef1fd65cf6ed2acb8b0516211493e3b3f165baeefbb34b" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.564854 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.565045 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.565192 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.565327 4790 status_manager.go:851] "Failed to get 
status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.565504 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.565638 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.565776 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.565940 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.566081 4790 status_manager.go:851] "Failed to 
get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.566223 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.566401 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.566559 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.566708 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 
38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.566885 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.567042 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.567188 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.567458 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.567605 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.567859 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.568018 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.573162 4790 scope.go:117] "RemoveContainer" containerID="c0daceb37a2d566e46d5b328928d7d33deb2ab21a100e5c8772652ac033af556" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.594558 4790 scope.go:117] "RemoveContainer" containerID="9ae6bd024abaa7c07bcbe388da96282ad0bde13566906b9a230225cd06623547" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.613339 4790 scope.go:117] "RemoveContainer" containerID="24bc576f9fb36d6605babe769455de5838c74aab14a28cc391211e7ad6432e31" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.633189 4790 scope.go:117] "RemoveContainer" containerID="64c2df7050d41fffa7b054328c75ade50ab4344ad18f064b86bc7764768b82c5" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.634132 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-service-ca\") pod 
\"18df9b4d-88b9-46d0-adf8-90072301374e\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.634197 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18df9b4d-88b9-46d0-adf8-90072301374e-console-serving-cert\") pod \"18df9b4d-88b9-46d0-adf8-90072301374e\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.634269 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88lnc\" (UniqueName: \"kubernetes.io/projected/18df9b4d-88b9-46d0-adf8-90072301374e-kube-api-access-88lnc\") pod \"18df9b4d-88b9-46d0-adf8-90072301374e\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.634294 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-oauth-serving-cert\") pod \"18df9b4d-88b9-46d0-adf8-90072301374e\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.634309 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-console-config\") pod \"18df9b4d-88b9-46d0-adf8-90072301374e\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.634947 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-trusted-ca-bundle\") pod \"18df9b4d-88b9-46d0-adf8-90072301374e\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.635022 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18df9b4d-88b9-46d0-adf8-90072301374e-console-oauth-config\") pod \"18df9b4d-88b9-46d0-adf8-90072301374e\" (UID: \"18df9b4d-88b9-46d0-adf8-90072301374e\") " Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.635136 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "18df9b4d-88b9-46d0-adf8-90072301374e" (UID: "18df9b4d-88b9-46d0-adf8-90072301374e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.635165 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-service-ca" (OuterVolumeSpecName: "service-ca") pod "18df9b4d-88b9-46d0-adf8-90072301374e" (UID: "18df9b4d-88b9-46d0-adf8-90072301374e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.635266 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-console-config" (OuterVolumeSpecName: "console-config") pod "18df9b4d-88b9-46d0-adf8-90072301374e" (UID: "18df9b4d-88b9-46d0-adf8-90072301374e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.635567 4790 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.635587 4790 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-console-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.635597 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-service-ca\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.636058 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "18df9b4d-88b9-46d0-adf8-90072301374e" (UID: "18df9b4d-88b9-46d0-adf8-90072301374e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.643915 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18df9b4d-88b9-46d0-adf8-90072301374e-kube-api-access-88lnc" (OuterVolumeSpecName: "kube-api-access-88lnc") pod "18df9b4d-88b9-46d0-adf8-90072301374e" (UID: "18df9b4d-88b9-46d0-adf8-90072301374e"). InnerVolumeSpecName "kube-api-access-88lnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.644038 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18df9b4d-88b9-46d0-adf8-90072301374e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "18df9b4d-88b9-46d0-adf8-90072301374e" (UID: "18df9b4d-88b9-46d0-adf8-90072301374e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.644080 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18df9b4d-88b9-46d0-adf8-90072301374e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "18df9b4d-88b9-46d0-adf8-90072301374e" (UID: "18df9b4d-88b9-46d0-adf8-90072301374e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.659630 4790 scope.go:117] "RemoveContainer" containerID="e9aad00153e278b6f37e60029f6be3d297b5319c288bb6ebf91d3037d8f15382" Apr 06 12:00:27 crc kubenswrapper[4790]: E0406 12:00:27.660049 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9aad00153e278b6f37e60029f6be3d297b5319c288bb6ebf91d3037d8f15382\": container with ID starting with e9aad00153e278b6f37e60029f6be3d297b5319c288bb6ebf91d3037d8f15382 not found: ID does not exist" containerID="e9aad00153e278b6f37e60029f6be3d297b5319c288bb6ebf91d3037d8f15382" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.660076 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9aad00153e278b6f37e60029f6be3d297b5319c288bb6ebf91d3037d8f15382"} err="failed to get container status \"e9aad00153e278b6f37e60029f6be3d297b5319c288bb6ebf91d3037d8f15382\": rpc error: code = NotFound desc = could not find 
container \"e9aad00153e278b6f37e60029f6be3d297b5319c288bb6ebf91d3037d8f15382\": container with ID starting with e9aad00153e278b6f37e60029f6be3d297b5319c288bb6ebf91d3037d8f15382 not found: ID does not exist" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.660099 4790 scope.go:117] "RemoveContainer" containerID="cdf7a3d1c5589a9e64ef1fd65cf6ed2acb8b0516211493e3b3f165baeefbb34b" Apr 06 12:00:27 crc kubenswrapper[4790]: E0406 12:00:27.660516 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdf7a3d1c5589a9e64ef1fd65cf6ed2acb8b0516211493e3b3f165baeefbb34b\": container with ID starting with cdf7a3d1c5589a9e64ef1fd65cf6ed2acb8b0516211493e3b3f165baeefbb34b not found: ID does not exist" containerID="cdf7a3d1c5589a9e64ef1fd65cf6ed2acb8b0516211493e3b3f165baeefbb34b" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.660539 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdf7a3d1c5589a9e64ef1fd65cf6ed2acb8b0516211493e3b3f165baeefbb34b"} err="failed to get container status \"cdf7a3d1c5589a9e64ef1fd65cf6ed2acb8b0516211493e3b3f165baeefbb34b\": rpc error: code = NotFound desc = could not find container \"cdf7a3d1c5589a9e64ef1fd65cf6ed2acb8b0516211493e3b3f165baeefbb34b\": container with ID starting with cdf7a3d1c5589a9e64ef1fd65cf6ed2acb8b0516211493e3b3f165baeefbb34b not found: ID does not exist" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.660552 4790 scope.go:117] "RemoveContainer" containerID="c0daceb37a2d566e46d5b328928d7d33deb2ab21a100e5c8772652ac033af556" Apr 06 12:00:27 crc kubenswrapper[4790]: E0406 12:00:27.660747 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0daceb37a2d566e46d5b328928d7d33deb2ab21a100e5c8772652ac033af556\": container with ID starting with c0daceb37a2d566e46d5b328928d7d33deb2ab21a100e5c8772652ac033af556 not found: ID does 
not exist" containerID="c0daceb37a2d566e46d5b328928d7d33deb2ab21a100e5c8772652ac033af556" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.660770 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0daceb37a2d566e46d5b328928d7d33deb2ab21a100e5c8772652ac033af556"} err="failed to get container status \"c0daceb37a2d566e46d5b328928d7d33deb2ab21a100e5c8772652ac033af556\": rpc error: code = NotFound desc = could not find container \"c0daceb37a2d566e46d5b328928d7d33deb2ab21a100e5c8772652ac033af556\": container with ID starting with c0daceb37a2d566e46d5b328928d7d33deb2ab21a100e5c8772652ac033af556 not found: ID does not exist" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.660782 4790 scope.go:117] "RemoveContainer" containerID="9ae6bd024abaa7c07bcbe388da96282ad0bde13566906b9a230225cd06623547" Apr 06 12:00:27 crc kubenswrapper[4790]: E0406 12:00:27.661221 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ae6bd024abaa7c07bcbe388da96282ad0bde13566906b9a230225cd06623547\": container with ID starting with 9ae6bd024abaa7c07bcbe388da96282ad0bde13566906b9a230225cd06623547 not found: ID does not exist" containerID="9ae6bd024abaa7c07bcbe388da96282ad0bde13566906b9a230225cd06623547" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.661246 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ae6bd024abaa7c07bcbe388da96282ad0bde13566906b9a230225cd06623547"} err="failed to get container status \"9ae6bd024abaa7c07bcbe388da96282ad0bde13566906b9a230225cd06623547\": rpc error: code = NotFound desc = could not find container \"9ae6bd024abaa7c07bcbe388da96282ad0bde13566906b9a230225cd06623547\": container with ID starting with 9ae6bd024abaa7c07bcbe388da96282ad0bde13566906b9a230225cd06623547 not found: ID does not exist" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.661265 4790 
scope.go:117] "RemoveContainer" containerID="24bc576f9fb36d6605babe769455de5838c74aab14a28cc391211e7ad6432e31" Apr 06 12:00:27 crc kubenswrapper[4790]: E0406 12:00:27.661591 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24bc576f9fb36d6605babe769455de5838c74aab14a28cc391211e7ad6432e31\": container with ID starting with 24bc576f9fb36d6605babe769455de5838c74aab14a28cc391211e7ad6432e31 not found: ID does not exist" containerID="24bc576f9fb36d6605babe769455de5838c74aab14a28cc391211e7ad6432e31" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.661621 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24bc576f9fb36d6605babe769455de5838c74aab14a28cc391211e7ad6432e31"} err="failed to get container status \"24bc576f9fb36d6605babe769455de5838c74aab14a28cc391211e7ad6432e31\": rpc error: code = NotFound desc = could not find container \"24bc576f9fb36d6605babe769455de5838c74aab14a28cc391211e7ad6432e31\": container with ID starting with 24bc576f9fb36d6605babe769455de5838c74aab14a28cc391211e7ad6432e31 not found: ID does not exist" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.661638 4790 scope.go:117] "RemoveContainer" containerID="64c2df7050d41fffa7b054328c75ade50ab4344ad18f064b86bc7764768b82c5" Apr 06 12:00:27 crc kubenswrapper[4790]: E0406 12:00:27.661931 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64c2df7050d41fffa7b054328c75ade50ab4344ad18f064b86bc7764768b82c5\": container with ID starting with 64c2df7050d41fffa7b054328c75ade50ab4344ad18f064b86bc7764768b82c5 not found: ID does not exist" containerID="64c2df7050d41fffa7b054328c75ade50ab4344ad18f064b86bc7764768b82c5" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.661957 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"64c2df7050d41fffa7b054328c75ade50ab4344ad18f064b86bc7764768b82c5"} err="failed to get container status \"64c2df7050d41fffa7b054328c75ade50ab4344ad18f064b86bc7764768b82c5\": rpc error: code = NotFound desc = could not find container \"64c2df7050d41fffa7b054328c75ade50ab4344ad18f064b86bc7764768b82c5\": container with ID starting with 64c2df7050d41fffa7b054328c75ade50ab4344ad18f064b86bc7764768b82c5 not found: ID does not exist" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.684599 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.737514 4790 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/18df9b4d-88b9-46d0-adf8-90072301374e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.737610 4790 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/18df9b4d-88b9-46d0-adf8-90072301374e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.737626 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88lnc\" (UniqueName: \"kubernetes.io/projected/18df9b4d-88b9-46d0-adf8-90072301374e-kube-api-access-88lnc\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.737640 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18df9b4d-88b9-46d0-adf8-90072301374e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.977602 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.978068 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.978429 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.978978 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.979283 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.979547 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.979849 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.980116 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.980389 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc kubenswrapper[4790]: I0406 12:00:27.980681 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:27 crc 
kubenswrapper[4790]: I0406 12:00:27.980980 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.466894 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-vh9v9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.466969 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.483596 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dkg9c_18df9b4d-88b9-46d0-adf8-90072301374e/console/0.log" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.483705 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dkg9c" event={"ID":"18df9b4d-88b9-46d0-adf8-90072301374e","Type":"ContainerDied","Data":"8ed42ba749ae266af6dfa9c8f417d83551d52c2746fe68c07a9503f6d99caa91"} Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.483724 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-dkg9c" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.483754 4790 scope.go:117] "RemoveContainer" containerID="11b5938ae906f39a81f798060020ac636b9cb07526c4328d7f11f2eff7143e30" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.484432 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.484761 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.485405 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.485695 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.486219 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.486693 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.487234 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"815516d0756bb9282f4d0a28cef72670","Type":"ContainerStarted","Data":"efb99e3c83fa8b086e34268cf2ed4f95342b9e349fd28152e399c332bc18f5ff"} Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.487290 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"815516d0756bb9282f4d0a28cef72670","Type":"ContainerStarted","Data":"2a0bda650d85589544b66809493d1f094798ba4128af4572a641b923de6a1128"} Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.487305 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"815516d0756bb9282f4d0a28cef72670","Type":"ContainerStarted","Data":"642a8714da2ce927c59666c6a55e69c883f9283af786c866dcf1219b7c295bf8"} Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.487668 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.487861 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.488193 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.488561 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.488812 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.489069 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.489533 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.489898 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.490155 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.490406 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 
crc kubenswrapper[4790]: I0406 12:00:28.490871 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.491179 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" event={"ID":"743439e2-53c8-4cda-b960-2448c1fb2941","Type":"ContainerStarted","Data":"9419e23c42622ac01c5d40d7d22106d4a2f220c9146af998bf59f99a38ce3394"} Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.491752 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.492178 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.492784 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 
38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.493199 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.493652 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.494040 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.494536 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.495038 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.495416 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.495892 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.496209 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.496564 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.497141 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:28 crc kubenswrapper[4790]: I0406 12:00:28.497616 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.491753 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-vh9v9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.493202 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.501774 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" event={"ID":"6e5a800d-f198-4dbe-8c7b-a84e6c130041","Type":"ContainerDied","Data":"021ccd9cef3460f403934bc81099fd3d5f5ddd2d4a754e3ae24e44c78818559c"} Apr 06 
12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.502192 4790 generic.go:334] "Generic (PLEG): container finished" podID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" containerID="021ccd9cef3460f403934bc81099fd3d5f5ddd2d4a754e3ae24e44c78818559c" exitCode=0 Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.502658 4790 scope.go:117] "RemoveContainer" containerID="021ccd9cef3460f403934bc81099fd3d5f5ddd2d4a754e3ae24e44c78818559c" Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.502956 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.503438 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.503951 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.504341 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.504608 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.505041 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.505660 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.506076 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.506419 4790 status_manager.go:851] "Failed 
to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.506881 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:29 crc kubenswrapper[4790]: I0406 12:00:29.507231 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.515668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" event={"ID":"6e5a800d-f198-4dbe-8c7b-a84e6c130041","Type":"ContainerStarted","Data":"a9c4e35d6fb059bc517ecd4f48f3b737191605d45f4eba7cc2a65d7536aa8852"} Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.517402 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc 
kubenswrapper[4790]: I0406 12:00:30.517971 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.518490 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.519213 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.519748 4790 generic.go:334] "Generic (PLEG): container finished" podID="eb424d82-17ec-4515-903c-96156334ca08" containerID="a6590d7613bda71a20ff4999cb28a7b04b38d65e19773b49346f5c4b457fccc8" exitCode=0 Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.519825 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 
06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.519878 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" event={"ID":"eb424d82-17ec-4515-903c-96156334ca08","Type":"ContainerDied","Data":"a6590d7613bda71a20ff4999cb28a7b04b38d65e19773b49346f5c4b457fccc8"} Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.520551 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.521274 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.521919 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.522131 4790 scope.go:117] "RemoveContainer" containerID="a6590d7613bda71a20ff4999cb28a7b04b38d65e19773b49346f5c4b457fccc8" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.523465 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.526337 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.526957 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.527754 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.528406 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc 
kubenswrapper[4790]: I0406 12:00:30.528922 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.529609 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.529847 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.530159 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.530558 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.530794 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.531198 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.531525 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.531852 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:30 crc kubenswrapper[4790]: I0406 12:00:30.532250 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="eb424d82-17ec-4515-903c-96156334ca08" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-b67b599dd-pwmq6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.533380 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" event={"ID":"eb424d82-17ec-4515-903c-96156334ca08","Type":"ContainerStarted","Data":"1013b72c940929f2925bda0d66e264edd3a45cdf5620d4b1c5f07d0e3734cd01"} Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.534997 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.536170 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.536657 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.537371 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.538022 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.538442 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.539039 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.539658 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.540191 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.540765 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.541459 4790 status_manager.go:851] "Failed to get status for pod" podUID="eb424d82-17ec-4515-903c-96156334ca08" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-b67b599dd-pwmq6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.542033 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.681165 4790 status_manager.go:851] "Failed to get status for pod" podUID="eb424d82-17ec-4515-903c-96156334ca08" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-b67b599dd-pwmq6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.681977 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.683675 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.684233 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.684734 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.685413 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.686051 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.690300 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.691142 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.691567 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.691912 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.692263 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.692737 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 
38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.693214 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.693702 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.694329 4790 status_manager.go:851] "Failed to get status for pod" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" pod="openshift-infra/auto-csr-approver-29591280-h6r65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29591280-h6r65\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.695071 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.695662 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.696237 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.696798 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.697465 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.698113 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.698774 4790 status_manager.go:851] "Failed to get status for 
pod" podUID="eb424d82-17ec-4515-903c-96156334ca08" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-b67b599dd-pwmq6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.699420 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:31 crc kubenswrapper[4790]: I0406 12:00:31.699987 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.544095 4790 generic.go:334] "Generic (PLEG): container finished" podID="4d787bca-06ac-4e8b-9797-6b25ebbcc706" containerID="f6b6519b3d576b511eb396de9952044242ecd524164b0455623a7bc0b6f3a576" exitCode=0 Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.544222 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" event={"ID":"4d787bca-06ac-4e8b-9797-6b25ebbcc706","Type":"ContainerDied","Data":"f6b6519b3d576b511eb396de9952044242ecd524164b0455623a7bc0b6f3a576"} Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.544883 4790 scope.go:117] "RemoveContainer" 
containerID="f6b6519b3d576b511eb396de9952044242ecd524164b0455623a7bc0b6f3a576" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.545709 4790 status_manager.go:851] "Failed to get status for pod" podUID="eb424d82-17ec-4515-903c-96156334ca08" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-b67b599dd-pwmq6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.546194 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.546406 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.546595 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: 
I0406 12:00:32.546865 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.547174 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.547325 4790 status_manager.go:851] "Failed to get status for pod" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" pod="openshift-infra/auto-csr-approver-29591280-h6r65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29591280-h6r65\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.547461 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.547658 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: 
connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.547838 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.547985 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.549002 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.549389 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.550048 4790 generic.go:334] "Generic (PLEG): container finished" podID="fec884c6-276b-4230-a41c-3375cbc2104b" containerID="8dcb032fb28181c105a1909b65757b322371120de591936c9a842adaae4cf508" exitCode=0 Apr 06 12:00:32 
crc kubenswrapper[4790]: I0406 12:00:32.550067 4790 status_manager.go:851] "Failed to get status for pod" podUID="4d787bca-06ac-4e8b-9797-6b25ebbcc706" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/pods/openshift-apiserver-operator-796bbdcf4f-ct8gb\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.550117 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" event={"ID":"fec884c6-276b-4230-a41c-3375cbc2104b","Type":"ContainerDied","Data":"8dcb032fb28181c105a1909b65757b322371120de591936c9a842adaae4cf508"} Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.550448 4790 scope.go:117] "RemoveContainer" containerID="8dcb032fb28181c105a1909b65757b322371120de591936c9a842adaae4cf508" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.550766 4790 status_manager.go:851] "Failed to get status for pod" podUID="fec884c6-276b-4230-a41c-3375cbc2104b" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-78b949d7b-rnq82\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.551070 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.551478 4790 status_manager.go:851] 
"Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.551785 4790 status_manager.go:851] "Failed to get status for pod" podUID="4d787bca-06ac-4e8b-9797-6b25ebbcc706" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/pods/openshift-apiserver-operator-796bbdcf4f-ct8gb\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.552325 4790 status_manager.go:851] "Failed to get status for pod" podUID="eb424d82-17ec-4515-903c-96156334ca08" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-b67b599dd-pwmq6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.552715 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.553089 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" 
pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.553559 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.553882 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.554168 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.554572 4790 status_manager.go:851] "Failed to get status for pod" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" pod="openshift-infra/auto-csr-approver-29591280-h6r65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29591280-h6r65\": dial tcp 38.102.83.146:6443: connect: connection 
refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.554886 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.555167 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.555869 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:32 crc kubenswrapper[4790]: I0406 12:00:32.556214 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: E0406 12:00:33.116716 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/events/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq.18a3c2af693c68aa\": dial tcp 38.102.83.146:6443: connect: connection refused" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-operator-5fdd9b5758-s2cfq.18a3c2af693c68aa openshift-kube-scheduler-operator 27568 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler-operator,Name:openshift-kube-scheduler-operator-5fdd9b5758-s2cfq,UID:f7b20da9-8cdd-4614-9c0a-9db7287856cd,APIVersion:v1,ResourceVersion:27169,FieldPath:spec.containers{kube-scheduler-operator-container},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 11:58:56 +0000 UTC,LastTimestamp:2026-04-06 12:00:24.419609478 +0000 UTC m=+203.407352354,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.557608 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" event={"ID":"4d787bca-06ac-4e8b-9797-6b25ebbcc706","Type":"ContainerStarted","Data":"76cbd6897462a4de9ada25c93a8031e5ac8aa105a7e457f4d31acad2f59f00e5"} Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.558239 4790 status_manager.go:851] "Failed to get status for pod" podUID="eb424d82-17ec-4515-903c-96156334ca08" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-b67b599dd-pwmq6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.558473 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.558730 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.558918 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.559167 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.559377 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": 
dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.559613 4790 status_manager.go:851] "Failed to get status for pod" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" pod="openshift-infra/auto-csr-approver-29591280-h6r65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29591280-h6r65\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.559787 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.559922 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" event={"ID":"fec884c6-276b-4230-a41c-3375cbc2104b","Type":"ContainerStarted","Data":"2c266cb742c88a644af59ebc20bd72c6875453ba55650a36c227ba4f0f237d61"} Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.559953 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.560142 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 
38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.560365 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.560596 4790 status_manager.go:851] "Failed to get status for pod" podUID="fec884c6-276b-4230-a41c-3375cbc2104b" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-78b949d7b-rnq82\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.560790 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.560963 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.561097 4790 status_manager.go:851] "Failed to get status for pod" podUID="4d787bca-06ac-4e8b-9797-6b25ebbcc706" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/pods/openshift-apiserver-operator-796bbdcf4f-ct8gb\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.561334 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.561487 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.561639 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.561786 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": 
dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.561953 4790 status_manager.go:851] "Failed to get status for pod" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" pod="openshift-infra/auto-csr-approver-29591280-h6r65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29591280-h6r65\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.562097 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.562238 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.562380 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.562515 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial 
tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.562656 4790 status_manager.go:851] "Failed to get status for pod" podUID="fec884c6-276b-4230-a41c-3375cbc2104b" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-78b949d7b-rnq82\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.562797 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.563004 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.563186 4790 status_manager.go:851] "Failed to get status for pod" podUID="4d787bca-06ac-4e8b-9797-6b25ebbcc706" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/pods/openshift-apiserver-operator-796bbdcf4f-ct8gb\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.563368 4790 status_manager.go:851] 
"Failed to get status for pod" podUID="eb424d82-17ec-4515-903c-96156334ca08" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-b67b599dd-pwmq6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.563538 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: E0406 12:00:33.704983 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: E0406 12:00:33.705507 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: E0406 12:00:33.705957 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: E0406 12:00:33.706243 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: E0406 12:00:33.706507 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:33 crc kubenswrapper[4790]: I0406 12:00:33.706539 4790 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Apr 06 12:00:33 crc kubenswrapper[4790]: E0406 12:00:33.706793 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="200ms" Apr 06 12:00:33 crc kubenswrapper[4790]: E0406 12:00:33.907898 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="400ms" Apr 06 12:00:34 crc kubenswrapper[4790]: E0406 12:00:34.308868 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="800ms" Apr 06 12:00:35 crc kubenswrapper[4790]: E0406 12:00:35.109961 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="1.6s" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.575006 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="8abd742f-e504-47d0-ab97-5befd3609dd7" containerID="243ef026b58efc0cd530c2acc633b25e73f359da87ba4d52c43e99750f1767e1" exitCode=0 Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.575079 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591280-h6r65" event={"ID":"8abd742f-e504-47d0-ab97-5befd3609dd7","Type":"ContainerDied","Data":"243ef026b58efc0cd530c2acc633b25e73f359da87ba4d52c43e99750f1767e1"} Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.576646 4790 status_manager.go:851] "Failed to get status for pod" podUID="eb424d82-17ec-4515-903c-96156334ca08" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-b67b599dd-pwmq6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.577618 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.578117 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.578916 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.579585 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.580036 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.580697 4790 status_manager.go:851] "Failed to get status for pod" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" pod="openshift-infra/auto-csr-approver-29591280-h6r65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29591280-h6r65\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.581328 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: 
connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.582101 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.582938 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.583465 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.584066 4790 status_manager.go:851] "Failed to get status for pod" podUID="fec884c6-276b-4230-a41c-3375cbc2104b" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-78b949d7b-rnq82\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.584872 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.585481 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.586019 4790 status_manager.go:851] "Failed to get status for pod" podUID="4d787bca-06ac-4e8b-9797-6b25ebbcc706" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/pods/openshift-apiserver-operator-796bbdcf4f-ct8gb\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.926130 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-vh9v9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.926142 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-vh9v9 container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.926315 4790 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 06 12:00:35 crc kubenswrapper[4790]: I0406 12:00:35.926211 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.675120 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.676479 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.676701 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.676939 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.677186 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.677379 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.677568 4790 status_manager.go:851] "Failed to get status for pod" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" pod="openshift-infra/auto-csr-approver-29591280-h6r65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29591280-h6r65\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.677771 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 
38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.677996 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.678185 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.678375 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.678583 4790 status_manager.go:851] "Failed to get status for pod" podUID="fec884c6-276b-4230-a41c-3375cbc2104b" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-78b949d7b-rnq82\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.679150 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.679368 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.679566 4790 status_manager.go:851] "Failed to get status for pod" podUID="4d787bca-06ac-4e8b-9797-6b25ebbcc706" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/pods/openshift-apiserver-operator-796bbdcf4f-ct8gb\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.679788 4790 status_manager.go:851] "Failed to get status for pod" podUID="eb424d82-17ec-4515-903c-96156334ca08" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-b67b599dd-pwmq6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.699768 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="48336993-0485-4d37-a68f-3b4f5a81bb3c" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.699805 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="48336993-0485-4d37-a68f-3b4f5a81bb3c" Apr 06 12:00:36 crc kubenswrapper[4790]: E0406 12:00:36.700472 4790 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.701240 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:36 crc kubenswrapper[4790]: E0406 12:00:36.711738 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="3.2s" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.868619 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591280-h6r65" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.869239 4790 status_manager.go:851] "Failed to get status for pod" podUID="eb424d82-17ec-4515-903c-96156334ca08" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-b67b599dd-pwmq6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.869767 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.870119 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.870430 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: 
I0406 12:00:36.870730 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.871120 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.871387 4790 status_manager.go:851] "Failed to get status for pod" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" pod="openshift-infra/auto-csr-approver-29591280-h6r65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29591280-h6r65\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.871645 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.871994 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: 
connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.872256 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.872542 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.872892 4790 status_manager.go:851] "Failed to get status for pod" podUID="fec884c6-276b-4230-a41c-3375cbc2104b" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-78b949d7b-rnq82\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.873267 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.873592 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.873981 4790 status_manager.go:851] "Failed to get status for pod" podUID="4d787bca-06ac-4e8b-9797-6b25ebbcc706" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/pods/openshift-apiserver-operator-796bbdcf4f-ct8gb\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.977040 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npzzc\" (UniqueName: \"kubernetes.io/projected/8abd742f-e504-47d0-ab97-5befd3609dd7-kube-api-access-npzzc\") pod \"8abd742f-e504-47d0-ab97-5befd3609dd7\" (UID: \"8abd742f-e504-47d0-ab97-5befd3609dd7\") " Apr 06 12:00:36 crc kubenswrapper[4790]: I0406 12:00:36.983588 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8abd742f-e504-47d0-ab97-5befd3609dd7-kube-api-access-npzzc" (OuterVolumeSpecName: "kube-api-access-npzzc") pod "8abd742f-e504-47d0-ab97-5befd3609dd7" (UID: "8abd742f-e504-47d0-ab97-5befd3609dd7"). InnerVolumeSpecName "kube-api-access-npzzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.079426 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npzzc\" (UniqueName: \"kubernetes.io/projected/8abd742f-e504-47d0-ab97-5befd3609dd7-kube-api-access-npzzc\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.590609 4790 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="4bb5bf5a1d012cf69128c241f0b7ac4c53a84c899fbe67eef2e14f9ee70cc949" exitCode=0 Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.590669 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"4bb5bf5a1d012cf69128c241f0b7ac4c53a84c899fbe67eef2e14f9ee70cc949"} Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.590694 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cd3e5d099480df3fb2a5c855f01435904bdb252409f1a2c23d2fc6e673fbb7e4"} Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.590980 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="48336993-0485-4d37-a68f-3b4f5a81bb3c" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.590991 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="48336993-0485-4d37-a68f-3b4f5a81bb3c" Apr 06 12:00:37 crc kubenswrapper[4790]: E0406 12:00:37.591352 4790 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:37 crc 
kubenswrapper[4790]: I0406 12:00:37.591372 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.591646 4790 status_manager.go:851] "Failed to get status for pod" podUID="fec884c6-276b-4230-a41c-3375cbc2104b" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-78b949d7b-rnq82\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.592098 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.592330 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591280-h6r65" event={"ID":"8abd742f-e504-47d0-ab97-5befd3609dd7","Type":"ContainerDied","Data":"5fa93c3b9a69007fbaca83ddd43d83b305166828d57d1f9071acc6e8c634c44e"} Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.592356 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fa93c3b9a69007fbaca83ddd43d83b305166828d57d1f9071acc6e8c634c44e" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.592375 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591280-h6r65" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.592482 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.592735 4790 status_manager.go:851] "Failed to get status for pod" podUID="4d787bca-06ac-4e8b-9797-6b25ebbcc706" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/pods/openshift-apiserver-operator-796bbdcf4f-ct8gb\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.593330 4790 status_manager.go:851] "Failed to get status for pod" podUID="eb424d82-17ec-4515-903c-96156334ca08" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-b67b599dd-pwmq6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.593654 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc 
kubenswrapper[4790]: I0406 12:00:37.594009 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.594347 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.594568 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.594823 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.595109 4790 status_manager.go:851] "Failed to get status for pod" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" pod="openshift-infra/auto-csr-approver-29591280-h6r65" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29591280-h6r65\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.595304 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.595453 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.595595 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.612679 4790 status_manager.go:851] "Failed to get status for pod" podUID="815516d0756bb9282f4d0a28cef72670" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.613056 4790 status_manager.go:851] "Failed to get status for pod" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" 
pod="openshift-infra/auto-csr-approver-29591280-h6r65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29591280-h6r65\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.613499 4790 status_manager.go:851] "Failed to get status for pod" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" pod="openshift-console/console-f9d7485db-dkg9c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/pods/console-f9d7485db-dkg9c\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.613750 4790 status_manager.go:851] "Failed to get status for pod" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/pods/etcd-operator-b45778765-499ss\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.613971 4790 status_manager.go:851] "Failed to get status for pod" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" pod="openshift-image-registry/image-registry-965d55b94-4rql6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-965d55b94-4rql6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.614411 4790 status_manager.go:851] "Failed to get status for pod" podUID="804d6679-7288-40ec-853e-345cf118c657" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.614703 4790 status_manager.go:851] "Failed to get status for pod" podUID="fec884c6-276b-4230-a41c-3375cbc2104b" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/pods/kube-controller-manager-operator-78b949d7b-rnq82\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.615027 4790 status_manager.go:851] "Failed to get status for pod" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/pods/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.615221 4790 status_manager.go:851] "Failed to get status for pod" podUID="0360c312-3ecf-42c9-9af9-470c231eefbd" pod="openshift-config-operator/openshift-config-operator-7777fb866f-prb62" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/pods/openshift-config-operator-7777fb866f-prb62\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.615455 4790 status_manager.go:851] "Failed to get status for pod" podUID="4d787bca-06ac-4e8b-9797-6b25ebbcc706" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/pods/openshift-apiserver-operator-796bbdcf4f-ct8gb\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.615693 4790 status_manager.go:851] "Failed to get status for pod" podUID="eb424d82-17ec-4515-903c-96156334ca08" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/pods/kube-storage-version-migrator-operator-b67b599dd-pwmq6\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.615969 4790 status_manager.go:851] "Failed to get status for pod" podUID="6e5a800d-f198-4dbe-8c7b-a84e6c130041" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fbq9z" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-fbq9z\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.616205 4790 status_manager.go:851] "Failed to get status for pod" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-vh9v9\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.616454 4790 status_manager.go:851] "Failed to get status for pod" podUID="bfec50d0-fe3d-45f4-afdb-cba2fc415878" pod="openshift-authentication-operator/authentication-operator-69f744f599-742t2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/pods/authentication-operator-69f744f599-742t2\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:00:37 crc kubenswrapper[4790]: I0406 12:00:37.616681 4790 status_manager.go:851] "Failed to get status for pod" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-jfjdg\": dial tcp 
38.102.83.146:6443: connect: connection refused" Apr 06 12:00:38 crc kubenswrapper[4790]: I0406 12:00:38.602800 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"72097d15cb6b5e1ea2fad57c97da8a7e6d323e18e91d4fd7510a1079fa0547bf"} Apr 06 12:00:38 crc kubenswrapper[4790]: I0406 12:00:38.603086 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ab62fdd9ef63898d74c00ff98bb7a1521cf23554a5760fb3668a8c668dd8e63e"} Apr 06 12:00:38 crc kubenswrapper[4790]: I0406 12:00:38.603096 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b80445a340d55190eaf37d3778fdc65546c20c27f5cc0473b8ea3e62e3620475"} Apr 06 12:00:38 crc kubenswrapper[4790]: I0406 12:00:38.603105 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9bf808231b351719960201cff0cc9cc6be5210fbaa4aca8d35fe2df13030489f"} Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.310699 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" podUID="0d86c525-b5ed-49c3-b95c-1e1075add472" containerName="oauth-openshift" containerID="cri-o://598b5477d8da6e675e051c6ca43cdfd29ae2ed6a27248c3c294dadc2b1925491" gracePeriod=15 Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.611627 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 
12:00:39.612206 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.612255 4790 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="14a5d345b8c42edbe390c0b15dd0f1df9a3b8bb13468ce448c5685d9ca370f6f" exitCode=1 Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.612316 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"14a5d345b8c42edbe390c0b15dd0f1df9a3b8bb13468ce448c5685d9ca370f6f"} Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.612683 4790 scope.go:117] "RemoveContainer" containerID="14a5d345b8c42edbe390c0b15dd0f1df9a3b8bb13468ce448c5685d9ca370f6f" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.615638 4790 generic.go:334] "Generic (PLEG): container finished" podID="0d86c525-b5ed-49c3-b95c-1e1075add472" containerID="598b5477d8da6e675e051c6ca43cdfd29ae2ed6a27248c3c294dadc2b1925491" exitCode=0 Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.615759 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" event={"ID":"0d86c525-b5ed-49c3-b95c-1e1075add472","Type":"ContainerDied","Data":"598b5477d8da6e675e051c6ca43cdfd29ae2ed6a27248c3c294dadc2b1925491"} Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.619811 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b07ed20e755c825a1870a04e581a3001b9c08cd391be14e022afd67adb68886b"} Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.620173 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.620258 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="48336993-0485-4d37-a68f-3b4f5a81bb3c" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.620285 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="48336993-0485-4d37-a68f-3b4f5a81bb3c" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.706492 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.753083 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.753151 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.811264 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-service-ca\") pod \"0d86c525-b5ed-49c3-b95c-1e1075add472\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.811316 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-audit-policies\") pod \"0d86c525-b5ed-49c3-b95c-1e1075add472\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.811346 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nq8g\" (UniqueName: \"kubernetes.io/projected/0d86c525-b5ed-49c3-b95c-1e1075add472-kube-api-access-4nq8g\") pod \"0d86c525-b5ed-49c3-b95c-1e1075add472\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.811364 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-login\") pod \"0d86c525-b5ed-49c3-b95c-1e1075add472\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.811383 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-serving-cert\") pod \"0d86c525-b5ed-49c3-b95c-1e1075add472\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.811409 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-provider-selection\") pod \"0d86c525-b5ed-49c3-b95c-1e1075add472\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.811431 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0d86c525-b5ed-49c3-b95c-1e1075add472-audit-dir\") pod 
\"0d86c525-b5ed-49c3-b95c-1e1075add472\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.811446 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-ocp-branding-template\") pod \"0d86c525-b5ed-49c3-b95c-1e1075add472\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.811468 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-error\") pod \"0d86c525-b5ed-49c3-b95c-1e1075add472\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.811503 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-session\") pod \"0d86c525-b5ed-49c3-b95c-1e1075add472\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.811525 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-cliconfig\") pod \"0d86c525-b5ed-49c3-b95c-1e1075add472\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.811543 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-router-certs\") pod \"0d86c525-b5ed-49c3-b95c-1e1075add472\" (UID: 
\"0d86c525-b5ed-49c3-b95c-1e1075add472\") " Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.811569 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-idp-0-file-data\") pod \"0d86c525-b5ed-49c3-b95c-1e1075add472\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.811596 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-trusted-ca-bundle\") pod \"0d86c525-b5ed-49c3-b95c-1e1075add472\" (UID: \"0d86c525-b5ed-49c3-b95c-1e1075add472\") " Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.812056 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d86c525-b5ed-49c3-b95c-1e1075add472-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "0d86c525-b5ed-49c3-b95c-1e1075add472" (UID: "0d86c525-b5ed-49c3-b95c-1e1075add472"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.812246 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "0d86c525-b5ed-49c3-b95c-1e1075add472" (UID: "0d86c525-b5ed-49c3-b95c-1e1075add472"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.812340 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "0d86c525-b5ed-49c3-b95c-1e1075add472" (UID: "0d86c525-b5ed-49c3-b95c-1e1075add472"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.812486 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "0d86c525-b5ed-49c3-b95c-1e1075add472" (UID: "0d86c525-b5ed-49c3-b95c-1e1075add472"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.823474 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "0d86c525-b5ed-49c3-b95c-1e1075add472" (UID: "0d86c525-b5ed-49c3-b95c-1e1075add472"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.832415 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "0d86c525-b5ed-49c3-b95c-1e1075add472" (UID: "0d86c525-b5ed-49c3-b95c-1e1075add472"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.834020 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "0d86c525-b5ed-49c3-b95c-1e1075add472" (UID: "0d86c525-b5ed-49c3-b95c-1e1075add472"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.834057 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "0d86c525-b5ed-49c3-b95c-1e1075add472" (UID: "0d86c525-b5ed-49c3-b95c-1e1075add472"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.834086 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d86c525-b5ed-49c3-b95c-1e1075add472-kube-api-access-4nq8g" (OuterVolumeSpecName: "kube-api-access-4nq8g") pod "0d86c525-b5ed-49c3-b95c-1e1075add472" (UID: "0d86c525-b5ed-49c3-b95c-1e1075add472"). InnerVolumeSpecName "kube-api-access-4nq8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.834115 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "0d86c525-b5ed-49c3-b95c-1e1075add472" (UID: "0d86c525-b5ed-49c3-b95c-1e1075add472"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.834147 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "0d86c525-b5ed-49c3-b95c-1e1075add472" (UID: "0d86c525-b5ed-49c3-b95c-1e1075add472"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.834160 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "0d86c525-b5ed-49c3-b95c-1e1075add472" (UID: "0d86c525-b5ed-49c3-b95c-1e1075add472"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.834186 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "0d86c525-b5ed-49c3-b95c-1e1075add472" (UID: "0d86c525-b5ed-49c3-b95c-1e1075add472"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.846266 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "0d86c525-b5ed-49c3-b95c-1e1075add472" (UID: "0d86c525-b5ed-49c3-b95c-1e1075add472"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.913324 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.913361 4790 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-audit-policies\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.913375 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nq8g\" (UniqueName: \"kubernetes.io/projected/0d86c525-b5ed-49c3-b95c-1e1075add472-kube-api-access-4nq8g\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.913386 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.913395 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.913409 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.913420 4790 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/0d86c525-b5ed-49c3-b95c-1e1075add472-audit-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.913429 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.913440 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.913448 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.913456 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.913465 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.913474 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:39 crc kubenswrapper[4790]: I0406 12:00:39.913482 4790 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d86c525-b5ed-49c3-b95c-1e1075add472-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:40 crc kubenswrapper[4790]: I0406 12:00:40.628756 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" event={"ID":"0d86c525-b5ed-49c3-b95c-1e1075add472","Type":"ContainerDied","Data":"82ca1975d88d479676cb5e2e199f21e80a606050429b8506b8e617eb514271fe"} Apr 06 12:00:40 crc kubenswrapper[4790]: I0406 12:00:40.628789 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj" Apr 06 12:00:40 crc kubenswrapper[4790]: I0406 12:00:40.628816 4790 scope.go:117] "RemoveContainer" containerID="598b5477d8da6e675e051c6ca43cdfd29ae2ed6a27248c3c294dadc2b1925491" Apr 06 12:00:40 crc kubenswrapper[4790]: I0406 12:00:40.635681 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Apr 06 12:00:40 crc kubenswrapper[4790]: I0406 12:00:40.636993 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Apr 06 12:00:40 crc kubenswrapper[4790]: I0406 12:00:40.637100 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bd12f079c26d6f40b634895dba92c03b64c88b317101f7db1ade5f2ba9e5db5c"} Apr 06 12:00:41 crc kubenswrapper[4790]: I0406 12:00:41.702182 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 
12:00:41 crc kubenswrapper[4790]: I0406 12:00:41.702644 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:41 crc kubenswrapper[4790]: I0406 12:00:41.712451 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:42 crc kubenswrapper[4790]: I0406 12:00:42.402790 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:00:42 crc kubenswrapper[4790]: I0406 12:00:42.403040 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Apr 06 12:00:42 crc kubenswrapper[4790]: I0406 12:00:42.403090 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Apr 06 12:00:43 crc kubenswrapper[4790]: I0406 12:00:43.904612 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:00:44 crc kubenswrapper[4790]: I0406 12:00:44.628552 4790 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:44 crc kubenswrapper[4790]: I0406 12:00:44.663756 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="48336993-0485-4d37-a68f-3b4f5a81bb3c" Apr 06 12:00:44 crc kubenswrapper[4790]: I0406 12:00:44.663784 4790 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="48336993-0485-4d37-a68f-3b4f5a81bb3c" Apr 06 12:00:44 crc kubenswrapper[4790]: I0406 12:00:44.677493 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:00:44 crc kubenswrapper[4790]: I0406 12:00:44.952439 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b06037d2-730b-4594-9409-1b823fb6034f" Apr 06 12:00:45 crc kubenswrapper[4790]: I0406 12:00:45.671325 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="48336993-0485-4d37-a68f-3b4f5a81bb3c" Apr 06 12:00:45 crc kubenswrapper[4790]: I0406 12:00:45.671360 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="48336993-0485-4d37-a68f-3b4f5a81bb3c" Apr 06 12:00:45 crc kubenswrapper[4790]: I0406 12:00:45.679605 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b06037d2-730b-4594-9409-1b823fb6034f" Apr 06 12:00:45 crc kubenswrapper[4790]: I0406 12:00:45.926050 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-vh9v9 container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 06 12:00:45 crc kubenswrapper[4790]: I0406 12:00:45.926135 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerName="console-operator" 
probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 06 12:00:45 crc kubenswrapper[4790]: I0406 12:00:45.926275 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-vh9v9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 06 12:00:45 crc kubenswrapper[4790]: I0406 12:00:45.926341 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 06 12:00:46 crc kubenswrapper[4790]: I0406 12:00:46.681880 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-11-crc_a6523047-6809-4e97-8d1e-4c33d08aa1d6/installer/0.log" Apr 06 12:00:46 crc kubenswrapper[4790]: I0406 12:00:46.682240 4790 generic.go:334] "Generic (PLEG): container finished" podID="a6523047-6809-4e97-8d1e-4c33d08aa1d6" containerID="c4d7c6173eca80a218789d788426e3cbe8a4dd292a3cea6e1dd8ee24cb8e5437" exitCode=1 Apr 06 12:00:46 crc kubenswrapper[4790]: I0406 12:00:46.682281 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-crc" event={"ID":"a6523047-6809-4e97-8d1e-4c33d08aa1d6","Type":"ContainerDied","Data":"c4d7c6173eca80a218789d788426e3cbe8a4dd292a3cea6e1dd8ee24cb8e5437"} Apr 06 12:00:47 crc kubenswrapper[4790]: I0406 12:00:47.948981 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_installer-11-crc_a6523047-6809-4e97-8d1e-4c33d08aa1d6/installer/0.log" Apr 06 12:00:47 crc kubenswrapper[4790]: I0406 12:00:47.949378 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-11-crc" Apr 06 12:00:48 crc kubenswrapper[4790]: I0406 12:00:48.024823 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a6523047-6809-4e97-8d1e-4c33d08aa1d6-var-lock\") pod \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\" (UID: \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\") " Apr 06 12:00:48 crc kubenswrapper[4790]: I0406 12:00:48.025056 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6523047-6809-4e97-8d1e-4c33d08aa1d6-kubelet-dir\") pod \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\" (UID: \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\") " Apr 06 12:00:48 crc kubenswrapper[4790]: I0406 12:00:48.025045 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6523047-6809-4e97-8d1e-4c33d08aa1d6-var-lock" (OuterVolumeSpecName: "var-lock") pod "a6523047-6809-4e97-8d1e-4c33d08aa1d6" (UID: "a6523047-6809-4e97-8d1e-4c33d08aa1d6"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:00:48 crc kubenswrapper[4790]: I0406 12:00:48.025182 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6523047-6809-4e97-8d1e-4c33d08aa1d6-kube-api-access\") pod \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\" (UID: \"a6523047-6809-4e97-8d1e-4c33d08aa1d6\") " Apr 06 12:00:48 crc kubenswrapper[4790]: I0406 12:00:48.025139 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6523047-6809-4e97-8d1e-4c33d08aa1d6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a6523047-6809-4e97-8d1e-4c33d08aa1d6" (UID: "a6523047-6809-4e97-8d1e-4c33d08aa1d6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:00:48 crc kubenswrapper[4790]: I0406 12:00:48.025587 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a6523047-6809-4e97-8d1e-4c33d08aa1d6-var-lock\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:48 crc kubenswrapper[4790]: I0406 12:00:48.025621 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6523047-6809-4e97-8d1e-4c33d08aa1d6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:48 crc kubenswrapper[4790]: I0406 12:00:48.033611 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6523047-6809-4e97-8d1e-4c33d08aa1d6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a6523047-6809-4e97-8d1e-4c33d08aa1d6" (UID: "a6523047-6809-4e97-8d1e-4c33d08aa1d6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:00:48 crc kubenswrapper[4790]: I0406 12:00:48.127781 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6523047-6809-4e97-8d1e-4c33d08aa1d6-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 12:00:48 crc kubenswrapper[4790]: I0406 12:00:48.711046 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-11-crc_a6523047-6809-4e97-8d1e-4c33d08aa1d6/installer/0.log" Apr 06 12:00:48 crc kubenswrapper[4790]: I0406 12:00:48.711143 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-crc" event={"ID":"a6523047-6809-4e97-8d1e-4c33d08aa1d6","Type":"ContainerDied","Data":"a3e1991573c5a66c3c025e87aee6067970e716b85bbb4e8ec82d66ffd98842ad"} Apr 06 12:00:48 crc kubenswrapper[4790]: I0406 12:00:48.711184 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3e1991573c5a66c3c025e87aee6067970e716b85bbb4e8ec82d66ffd98842ad" Apr 06 12:00:48 crc kubenswrapper[4790]: I0406 12:00:48.711240 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-11-crc" Apr 06 12:00:52 crc kubenswrapper[4790]: I0406 12:00:52.403994 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Apr 06 12:00:52 crc kubenswrapper[4790]: I0406 12:00:52.404367 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Apr 06 12:00:54 crc kubenswrapper[4790]: I0406 12:00:54.358537 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.083926 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.204324 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.262458 4790 patch_prober.go:28] interesting pod/etcd-operator-b45778765-499ss container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": read tcp 10.217.0.2:41102->10.217.0.15:8443: read: connection reset by peer" start-of-body= Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.263861 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" 
podUID="3851471a-4968-40c6-9be1-9ba072ddf741" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": read tcp 10.217.0.2:41102->10.217.0.15:8443: read: connection reset by peer" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.761262 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.775194 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5fdd9b5758-s2cfq_f7b20da9-8cdd-4614-9c0a-9db7287856cd/kube-scheduler-operator-container/1.log" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.775776 4790 generic.go:334] "Generic (PLEG): container finished" podID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" containerID="0c759347231b9d0d3fd03bdcbfcdf3ef8db30f41b34a2244d09ab10d2042cc5b" exitCode=255 Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.775876 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" event={"ID":"f7b20da9-8cdd-4614-9c0a-9db7287856cd","Type":"ContainerDied","Data":"0c759347231b9d0d3fd03bdcbfcdf3ef8db30f41b34a2244d09ab10d2042cc5b"} Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.775914 4790 scope.go:117] "RemoveContainer" containerID="1725ced72ea689dcc32f7f6df6cd63fd6ee20d43ab503dbe4f7f9a3e4ca02325" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.776884 4790 scope.go:117] "RemoveContainer" containerID="0c759347231b9d0d3fd03bdcbfcdf3ef8db30f41b34a2244d09ab10d2042cc5b" Apr 06 12:00:55 crc kubenswrapper[4790]: E0406 12:00:55.777209 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-scheduler-operator-container 
pod=openshift-kube-scheduler-operator-5fdd9b5758-s2cfq_openshift-kube-scheduler-operator(f7b20da9-8cdd-4614-9c0a-9db7287856cd)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" podUID="f7b20da9-8cdd-4614-9c0a-9db7287856cd" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.778439 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-b45778765-499ss_3851471a-4968-40c6-9be1-9ba072ddf741/etcd-operator/1.log" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.779570 4790 generic.go:334] "Generic (PLEG): container finished" podID="3851471a-4968-40c6-9be1-9ba072ddf741" containerID="2b7a8ea9b6c5a4a9fba55035c4ec5353a49f058524d883e7a74b5926654d46ed" exitCode=255 Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.779634 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" event={"ID":"3851471a-4968-40c6-9be1-9ba072ddf741","Type":"ContainerDied","Data":"2b7a8ea9b6c5a4a9fba55035c4ec5353a49f058524d883e7a74b5926654d46ed"} Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.780430 4790 scope.go:117] "RemoveContainer" containerID="2b7a8ea9b6c5a4a9fba55035c4ec5353a49f058524d883e7a74b5926654d46ed" Apr 06 12:00:55 crc kubenswrapper[4790]: E0406 12:00:55.780916 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=etcd-operator pod=etcd-operator-b45778765-499ss_openshift-etcd-operator(3851471a-4968-40c6-9be1-9ba072ddf741)\"" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" podUID="3851471a-4968-40c6-9be1-9ba072ddf741" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.830914 4790 scope.go:117] "RemoveContainer" containerID="3d7b8c4ff25646700d78ac43f3a8ddeb965e16df9c7ff06db3c54b2afc53b930" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.926168 4790 patch_prober.go:28] 
interesting pod/console-operator-58897d9998-vh9v9 container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.926290 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.926382 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.926187 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-vh9v9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.926577 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.927444 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console-operator" 
containerStatusID={"Type":"cri-o","ID":"440b5692b9ce8088053c067135ef67be6e8abc4cda3368ab4255830624b0698d"} pod="openshift-console-operator/console-operator-58897d9998-vh9v9" containerMessage="Container console-operator failed liveness probe, will be restarted" Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.927525 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerName="console-operator" containerID="cri-o://440b5692b9ce8088053c067135ef67be6e8abc4cda3368ab4255830624b0698d" gracePeriod=30 Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.949160 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-vh9v9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/readyz\": read tcp 10.217.0.2:42066->10.217.0.41:8443: read: connection reset by peer" start-of-body= Apr 06 12:00:55 crc kubenswrapper[4790]: I0406 12:00:55.949260 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/readyz\": read tcp 10.217.0.2:42066->10.217.0.41:8443: read: connection reset by peer" Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.074395 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.739506 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.787339 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-vh9v9_f0566848-e645-46bd-8e5e-fddcde1248ba/console-operator/1.log" Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.788299 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-vh9v9_f0566848-e645-46bd-8e5e-fddcde1248ba/console-operator/0.log" Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.788351 4790 generic.go:334] "Generic (PLEG): container finished" podID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerID="440b5692b9ce8088053c067135ef67be6e8abc4cda3368ab4255830624b0698d" exitCode=255 Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.788420 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" event={"ID":"f0566848-e645-46bd-8e5e-fddcde1248ba","Type":"ContainerDied","Data":"440b5692b9ce8088053c067135ef67be6e8abc4cda3368ab4255830624b0698d"} Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.788452 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" event={"ID":"f0566848-e645-46bd-8e5e-fddcde1248ba","Type":"ContainerStarted","Data":"7340045eda312239fbb141b583b2a48f9b4b99a2e74da1bbe730bd0c2d6f569b"} Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.788473 4790 scope.go:117] "RemoveContainer" containerID="9bed722ea06b6b94cd6e52b7cc3fad9cba84bbca1ac99e9be694c0475736ca82" Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.789136 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.790635 4790 generic.go:334] "Generic (PLEG): container finished" podID="37a5e44f-9a88-4405-be8a-b645485e7312" containerID="43a95c77926cab3be2c438d60d35e9e0d65ca564c72ca3f04a44860058d3548a" exitCode=0 Apr 06 12:00:56 crc kubenswrapper[4790]: 
I0406 12:00:56.790683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerDied","Data":"43a95c77926cab3be2c438d60d35e9e0d65ca564c72ca3f04a44860058d3548a"} Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.790941 4790 scope.go:117] "RemoveContainer" containerID="43a95c77926cab3be2c438d60d35e9e0d65ca564c72ca3f04a44860058d3548a" Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.792958 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5fdd9b5758-s2cfq_f7b20da9-8cdd-4614-9c0a-9db7287856cd/kube-scheduler-operator-container/1.log" Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.794921 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-b45778765-499ss_3851471a-4968-40c6-9be1-9ba072ddf741/etcd-operator/1.log" Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.822507 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Apr 06 12:00:56 crc kubenswrapper[4790]: I0406 12:00:56.988664 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Apr 06 12:00:57 crc kubenswrapper[4790]: I0406 12:00:57.030141 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Apr 06 12:00:57 crc kubenswrapper[4790]: I0406 12:00:57.180769 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Apr 06 12:00:57 crc kubenswrapper[4790]: I0406 12:00:57.199949 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Apr 06 12:00:57 crc kubenswrapper[4790]: I0406 
12:00:57.539034 4790 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Apr 06 12:00:57 crc kubenswrapper[4790]: I0406 12:00:57.549847 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Apr 06 12:00:57 crc kubenswrapper[4790]: I0406 12:00:57.697072 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Apr 06 12:00:57 crc kubenswrapper[4790]: I0406 12:00:57.788848 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-vh9v9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 06 12:00:57 crc kubenswrapper[4790]: I0406 12:00:57.788899 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 06 12:00:57 crc kubenswrapper[4790]: I0406 12:00:57.802192 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-vh9v9_f0566848-e645-46bd-8e5e-fddcde1248ba/console-operator/1.log" Apr 06 12:00:57 crc kubenswrapper[4790]: I0406 12:00:57.804034 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bef467b8664efc68d86078171a0276156fdaf7d137e5b1c0373b4c4879b34337"} Apr 06 12:00:57 crc kubenswrapper[4790]: I0406 12:00:57.881413 4790 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Apr 06 12:00:57 crc kubenswrapper[4790]: I0406 12:00:57.921574 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.205230 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.269127 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.317515 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.404119 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.477866 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.592718 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.716247 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.718518 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.737523 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Apr 06 12:00:58 crc 
kubenswrapper[4790]: I0406 12:00:58.803335 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-vh9v9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.803401 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" podUID="f0566848-e645-46bd-8e5e-fddcde1248ba" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.812181 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-756b6f6bc6-jfjdg_743439e2-53c8-4cda-b960-2448c1fb2941/openshift-controller-manager-operator/1.log" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.812703 4790 generic.go:334] "Generic (PLEG): container finished" podID="743439e2-53c8-4cda-b960-2448c1fb2941" containerID="9419e23c42622ac01c5d40d7d22106d4a2f220c9146af998bf59f99a38ce3394" exitCode=255 Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.812758 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" event={"ID":"743439e2-53c8-4cda-b960-2448c1fb2941","Type":"ContainerDied","Data":"9419e23c42622ac01c5d40d7d22106d4a2f220c9146af998bf59f99a38ce3394"} Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.812805 4790 scope.go:117] "RemoveContainer" containerID="592ebd557120949c2f5a37455001497e7b579a32a7bfcce6df3f88469210b4c7" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.813617 4790 
scope.go:117] "RemoveContainer" containerID="9419e23c42622ac01c5d40d7d22106d4a2f220c9146af998bf59f99a38ce3394" Apr 06 12:00:58 crc kubenswrapper[4790]: E0406 12:00:58.814484 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-756b6f6bc6-jfjdg_openshift-controller-manager-operator(743439e2-53c8-4cda-b960-2448c1fb2941)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" podUID="743439e2-53c8-4cda-b960-2448c1fb2941" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.851237 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.864069 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.909867 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Apr 06 12:00:58 crc kubenswrapper[4790]: I0406 12:00:58.994253 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.089021 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.147391 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.171263 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Apr 06 12:00:59 crc 
kubenswrapper[4790]: I0406 12:00:59.185909 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.429948 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.541295 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.556685 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.621681 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.645794 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.653983 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.678587 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.687859 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.865713 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.865818 4790 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.865932 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.866135 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.866423 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.869031 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-756b6f6bc6-jfjdg_743439e2-53c8-4cda-b960-2448c1fb2941/openshift-controller-manager-operator/1.log" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.883863 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Apr 06 12:00:59 crc kubenswrapper[4790]: I0406 12:00:59.903387 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.017576 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.033213 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.092306 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.102422 4790 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.114801 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.124547 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.196394 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.204284 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.268056 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.350903 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.366250 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.551723 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.617956 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.689447 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.700322 4790 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.720963 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.862171 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.891856 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.929752 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Apr 06 12:01:00 crc kubenswrapper[4790]: I0406 12:01:00.959731 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.010427 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.037628 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.122623 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.253187 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.281484 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Apr 06 12:01:01 crc 
kubenswrapper[4790]: I0406 12:01:01.339520 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.344916 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.576083 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.654409 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.819632 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.837371 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.881305 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-b67b599dd-pwmq6_eb424d82-17ec-4515-903c-96156334ca08/kube-storage-version-migrator-operator/1.log" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.882331 4790 generic.go:334] "Generic (PLEG): container finished" podID="eb424d82-17ec-4515-903c-96156334ca08" containerID="1013b72c940929f2925bda0d66e264edd3a45cdf5620d4b1c5f07d0e3734cd01" exitCode=255 Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.882399 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" 
event={"ID":"eb424d82-17ec-4515-903c-96156334ca08","Type":"ContainerDied","Data":"1013b72c940929f2925bda0d66e264edd3a45cdf5620d4b1c5f07d0e3734cd01"} Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.882452 4790 scope.go:117] "RemoveContainer" containerID="a6590d7613bda71a20ff4999cb28a7b04b38d65e19773b49346f5c4b457fccc8" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.882969 4790 scope.go:117] "RemoveContainer" containerID="1013b72c940929f2925bda0d66e264edd3a45cdf5620d4b1c5f07d0e3734cd01" Apr 06 12:01:01 crc kubenswrapper[4790]: E0406 12:01:01.883207 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-b67b599dd-pwmq6_openshift-kube-storage-version-migrator-operator(eb424d82-17ec-4515-903c-96156334ca08)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" podUID="eb424d82-17ec-4515-903c-96156334ca08" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.935511 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Apr 06 12:01:01 crc kubenswrapper[4790]: I0406 12:01:01.970680 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.051129 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.154222 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.179391 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-config" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.195261 4790 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.206200 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.256157 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.383615 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.388537 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.403552 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.403598 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.403642 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 
12:01:02.404128 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"bd12f079c26d6f40b634895dba92c03b64c88b317101f7db1ade5f2ba9e5db5c"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.404228 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://bd12f079c26d6f40b634895dba92c03b64c88b317101f7db1ade5f2ba9e5db5c" gracePeriod=30 Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.490778 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.493360 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.496838 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.506580 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.522744 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.532847 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.586631 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.644079 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.676163 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.794343 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.837054 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.843319 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.882757 4790 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.889966 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-b67b599dd-pwmq6_eb424d82-17ec-4515-903c-96156334ca08/kube-storage-version-migrator-operator/1.log" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.900160 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.911908 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 12:01:02.931729 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Apr 06 12:01:02 crc kubenswrapper[4790]: I0406 
12:01:02.974812 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.024943 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.040985 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.046792 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.047525 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.076499 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.179967 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.386598 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.480628 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.510663 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.596173 4790 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.644508 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.836227 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.897295 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-796bbdcf4f-ct8gb_4d787bca-06ac-4e8b-9797-6b25ebbcc706/openshift-apiserver-operator/1.log" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.897994 4790 generic.go:334] "Generic (PLEG): container finished" podID="4d787bca-06ac-4e8b-9797-6b25ebbcc706" containerID="76cbd6897462a4de9ada25c93a8031e5ac8aa105a7e457f4d31acad2f59f00e5" exitCode=255 Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.898081 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" event={"ID":"4d787bca-06ac-4e8b-9797-6b25ebbcc706","Type":"ContainerDied","Data":"76cbd6897462a4de9ada25c93a8031e5ac8aa105a7e457f4d31acad2f59f00e5"} Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.898142 4790 scope.go:117] "RemoveContainer" containerID="f6b6519b3d576b511eb396de9952044242ecd524164b0455623a7bc0b6f3a576" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.898771 4790 scope.go:117] "RemoveContainer" containerID="76cbd6897462a4de9ada25c93a8031e5ac8aa105a7e457f4d31acad2f59f00e5" Apr 06 12:01:03 crc kubenswrapper[4790]: E0406 12:01:03.899145 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=openshift-apiserver-operator pod=openshift-apiserver-operator-796bbdcf4f-ct8gb_openshift-apiserver-operator(4d787bca-06ac-4e8b-9797-6b25ebbcc706)\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" podUID="4d787bca-06ac-4e8b-9797-6b25ebbcc706" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.899923 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-78b949d7b-rnq82_fec884c6-276b-4230-a41c-3375cbc2104b/kube-controller-manager-operator/1.log" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.900545 4790 generic.go:334] "Generic (PLEG): container finished" podID="fec884c6-276b-4230-a41c-3375cbc2104b" containerID="2c266cb742c88a644af59ebc20bd72c6875453ba55650a36c227ba4f0f237d61" exitCode=255 Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.900577 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" event={"ID":"fec884c6-276b-4230-a41c-3375cbc2104b","Type":"ContainerDied","Data":"2c266cb742c88a644af59ebc20bd72c6875453ba55650a36c227ba4f0f237d61"} Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.901056 4790 scope.go:117] "RemoveContainer" containerID="2c266cb742c88a644af59ebc20bd72c6875453ba55650a36c227ba4f0f237d61" Apr 06 12:01:03 crc kubenswrapper[4790]: E0406 12:01:03.901272 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-78b949d7b-rnq82_openshift-kube-controller-manager-operator(fec884c6-276b-4230-a41c-3375cbc2104b)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" podUID="fec884c6-276b-4230-a41c-3375cbc2104b" Apr 06 12:01:03 crc kubenswrapper[4790]: 
I0406 12:01:03.942055 4790 scope.go:117] "RemoveContainer" containerID="8dcb032fb28181c105a1909b65757b322371120de591936c9a842adaae4cf508" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.944338 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.946906 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Apr 06 12:01:03 crc kubenswrapper[4790]: I0406 12:01:03.997127 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.140052 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.238114 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.304972 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.328772 4790 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.334228 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.463078 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.555238 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.566796 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.630148 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.646570 4790 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.647733 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.648865 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.649527 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=41.649512636 podStartE2EDuration="41.649512636s" podCreationTimestamp="2026-04-06 12:00:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:00:44.799639446 +0000 UTC m=+223.787382312" watchObservedRunningTime="2026-04-06 12:01:04.649512636 +0000 UTC m=+243.637255502"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.650915 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-965d55b94-4rql6","openshift-authentication/oauth-openshift-6dbc6cf4d9-5zqlj","openshift-console/console-f9d7485db-dkg9c","openshift-kube-apiserver/kube-apiserver-crc"]
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.650969 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-965cd865f-g94wp","openshift-kube-apiserver/kube-apiserver-crc","openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v","openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn"]
Apr 06 12:01:04 crc kubenswrapper[4790]: E0406 12:01:04.651133 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804d6679-7288-40ec-853e-345cf118c657" containerName="installer"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651152 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="804d6679-7288-40ec-853e-345cf118c657" containerName="installer"
Apr 06 12:01:04 crc kubenswrapper[4790]: E0406 12:01:04.651168 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" containerName="registry"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651174 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" containerName="registry"
Apr 06 12:01:04 crc kubenswrapper[4790]: E0406 12:01:04.651183 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d86c525-b5ed-49c3-b95c-1e1075add472" containerName="oauth-openshift"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651188 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d86c525-b5ed-49c3-b95c-1e1075add472" containerName="oauth-openshift"
Apr 06 12:01:04 crc kubenswrapper[4790]: E0406 12:01:04.651199 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" containerName="oc"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651205 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" containerName="oc"
Apr 06 12:01:04 crc kubenswrapper[4790]: E0406 12:01:04.651214 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6523047-6809-4e97-8d1e-4c33d08aa1d6" containerName="installer"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651221 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6523047-6809-4e97-8d1e-4c33d08aa1d6" containerName="installer"
Apr 06 12:01:04 crc kubenswrapper[4790]: E0406 12:01:04.651230 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" containerName="console"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651236 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" containerName="console"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651233 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="48336993-0485-4d37-a68f-3b4f5a81bb3c"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651253 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="48336993-0485-4d37-a68f-3b4f5a81bb3c"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651329 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" containerName="oc"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651341 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" containerName="console"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651349 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6523047-6809-4e97-8d1e-4c33d08aa1d6" containerName="installer"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651358 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="804d6679-7288-40ec-853e-345cf118c657" containerName="installer"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651370 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" containerName="registry"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651378 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d86c525-b5ed-49c3-b95c-1e1075add472" containerName="oauth-openshift"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.651766 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.652042 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.653305 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.658188 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.658519 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.658622 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.658859 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.658950 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.659080 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.659295 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.659332 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.660269 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.660398 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.660425 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.660610 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.660726 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.660848 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.661261 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.661429 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.661586 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.661806 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.662205 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.662377 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.662522 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.662690 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.662773 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.663264 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.663955 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.664115 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.673683 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.678146 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.682745 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.685301 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.687398 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.740342 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.740323231 podStartE2EDuration="20.740323231s" podCreationTimestamp="2026-04-06 12:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:01:04.725062392 +0000 UTC m=+243.712805258" watchObservedRunningTime="2026-04-06 12:01:04.740323231 +0000 UTC m=+243.728066097"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.744355 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.797757 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.797814 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-serving-cert\") pod \"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.797866 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.797887 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.797909 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-session\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.797949 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.797979 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-client-ca\") pod \"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.797997 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-proxy-ca-bundles\") pod \"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798017 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798053 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd076cb-ff23-40c8-803a-f7572cffb823-config\") pod \"route-controller-manager-5f97f85695-wdgjn\" (UID: \"6dd076cb-ff23-40c8-803a-f7572cffb823\") " pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798074 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-user-template-error\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798096 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-config\") pod \"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798123 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqk68\" (UniqueName: \"kubernetes.io/projected/31577253-8f3c-4675-8cce-625901cffba5-kube-api-access-sqk68\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798143 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/31577253-8f3c-4675-8cce-625901cffba5-audit-policies\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798164 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-user-template-login\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798186 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mccvx\" (UniqueName: \"kubernetes.io/projected/6dd076cb-ff23-40c8-803a-f7572cffb823-kube-api-access-mccvx\") pod \"route-controller-manager-5f97f85695-wdgjn\" (UID: \"6dd076cb-ff23-40c8-803a-f7572cffb823\") " pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798231 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-service-ca\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798253 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-router-certs\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798306 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd076cb-ff23-40c8-803a-f7572cffb823-serving-cert\") pod \"route-controller-manager-5f97f85695-wdgjn\" (UID: \"6dd076cb-ff23-40c8-803a-f7572cffb823\") " pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798576 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xndll\" (UniqueName: \"kubernetes.io/projected/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-kube-api-access-xndll\") pod \"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798637 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/31577253-8f3c-4675-8cce-625901cffba5-audit-dir\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798777 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd076cb-ff23-40c8-803a-f7572cffb823-client-ca\") pod \"route-controller-manager-5f97f85695-wdgjn\" (UID: \"6dd076cb-ff23-40c8-803a-f7572cffb823\") " pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.798931 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.809749 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.840888 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.861137 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899387 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xndll\" (UniqueName: \"kubernetes.io/projected/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-kube-api-access-xndll\") pod \"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899428 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/31577253-8f3c-4675-8cce-625901cffba5-audit-dir\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899458 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd076cb-ff23-40c8-803a-f7572cffb823-client-ca\") pod \"route-controller-manager-5f97f85695-wdgjn\" (UID: \"6dd076cb-ff23-40c8-803a-f7572cffb823\") " pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899482 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899503 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899522 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-serving-cert\") pod \"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899537 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/31577253-8f3c-4675-8cce-625901cffba5-audit-dir\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899555 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899578 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-session\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899596 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899619 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-proxy-ca-bundles\") pod \"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899636 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-client-ca\") pod \"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899650 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899674 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd076cb-ff23-40c8-803a-f7572cffb823-config\") pod \"route-controller-manager-5f97f85695-wdgjn\" (UID: \"6dd076cb-ff23-40c8-803a-f7572cffb823\") " pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899688 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-config\") pod \"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899704 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-user-template-error\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899718 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqk68\" (UniqueName: \"kubernetes.io/projected/31577253-8f3c-4675-8cce-625901cffba5-kube-api-access-sqk68\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899732 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/31577253-8f3c-4675-8cce-625901cffba5-audit-policies\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899749 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-user-template-login\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899765 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mccvx\" (UniqueName: \"kubernetes.io/projected/6dd076cb-ff23-40c8-803a-f7572cffb823-kube-api-access-mccvx\") pod \"route-controller-manager-5f97f85695-wdgjn\" (UID: \"6dd076cb-ff23-40c8-803a-f7572cffb823\") " pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899802 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-service-ca\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.899818 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-router-certs\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.900075 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd076cb-ff23-40c8-803a-f7572cffb823-serving-cert\") pod \"route-controller-manager-5f97f85695-wdgjn\" (UID: \"6dd076cb-ff23-40c8-803a-f7572cffb823\") " pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.900424 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd076cb-ff23-40c8-803a-f7572cffb823-client-ca\") pod \"route-controller-manager-5f97f85695-wdgjn\" (UID: \"6dd076cb-ff23-40c8-803a-f7572cffb823\") " pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.900756 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/31577253-8f3c-4675-8cce-625901cffba5-audit-policies\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.900954 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-service-ca\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.901354 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-proxy-ca-bundles\") pod \"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.901701 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-config\") pod \"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.902090 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.902340 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.903343 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-client-ca\") pod \"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.905035 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd076cb-ff23-40c8-803a-f7572cffb823-config\") pod \"route-controller-manager-5f97f85695-wdgjn\" (UID: \"6dd076cb-ff23-40c8-803a-f7572cffb823\") " pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.905684 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.906123 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-router-certs\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.906722 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd076cb-ff23-40c8-803a-f7572cffb823-serving-cert\") pod \"route-controller-manager-5f97f85695-wdgjn\" (UID: \"6dd076cb-ff23-40c8-803a-f7572cffb823\") " pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.907591 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-serving-cert\") pod \"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.908295 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-796bbdcf4f-ct8gb_4d787bca-06ac-4e8b-9797-6b25ebbcc706/openshift-apiserver-operator/1.log"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.908486 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-user-template-error\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.909295 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.911020 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-78b949d7b-rnq82_fec884c6-276b-4230-a41c-3375cbc2104b/kube-controller-manager-operator/1.log"
Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.911218 4790
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-user-template-login\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.912951 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.914235 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-system-session\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.915538 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/31577253-8f3c-4675-8cce-625901cffba5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.918925 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xndll\" (UniqueName: \"kubernetes.io/projected/9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c-kube-api-access-xndll\") pod 
\"controller-manager-5c9dfbd65f-65l9v\" (UID: \"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c\") " pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.919221 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mccvx\" (UniqueName: \"kubernetes.io/projected/6dd076cb-ff23-40c8-803a-f7572cffb823-kube-api-access-mccvx\") pod \"route-controller-manager-5f97f85695-wdgjn\" (UID: \"6dd076cb-ff23-40c8-803a-f7572cffb823\") " pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.919340 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqk68\" (UniqueName: \"kubernetes.io/projected/31577253-8f3c-4675-8cce-625901cffba5-kube-api-access-sqk68\") pod \"oauth-openshift-965cd865f-g94wp\" (UID: \"31577253-8f3c-4675-8cce-625901cffba5\") " pod="openshift-authentication/oauth-openshift-965cd865f-g94wp" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.930041 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vh9v9" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.937379 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.959262 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.979641 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-965cd865f-g94wp" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.981428 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Apr 06 12:01:04 crc kubenswrapper[4790]: I0406 12:01:04.986013 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.000278 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.006949 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.202583 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.232495 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-965cd865f-g94wp"] Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.264442 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.273658 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v"] Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.307152 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn"] Apr 06 12:01:05 crc kubenswrapper[4790]: W0406 12:01:05.316317 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dd076cb_ff23_40c8_803a_f7572cffb823.slice/crio-89b7f465d6aaac9460b3f6de493d8cf6dceae9643ab0a1ba41858e706f6cf21f WatchSource:0}: Error finding container 89b7f465d6aaac9460b3f6de493d8cf6dceae9643ab0a1ba41858e706f6cf21f: Status 404 returned error can't find the container with id 89b7f465d6aaac9460b3f6de493d8cf6dceae9643ab0a1ba41858e706f6cf21f Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.394952 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.409359 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.503302 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.518301 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.682445 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d86c525-b5ed-49c3-b95c-1e1075add472" path="/var/lib/kubelet/pods/0d86c525-b5ed-49c3-b95c-1e1075add472/volumes" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.683162 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18df9b4d-88b9-46d0-adf8-90072301374e" path="/var/lib/kubelet/pods/18df9b4d-88b9-46d0-adf8-90072301374e/volumes" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.683775 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aed6119-fbb5-4557-94ee-7c3e86fc3002" path="/var/lib/kubelet/pods/1aed6119-fbb5-4557-94ee-7c3e86fc3002/volumes" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.747731 4790 
???:1] "http: TLS handshake error from 192.168.126.11:33204: no serving certificate available for the kubelet" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.782794 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.809057 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.815039 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.836604 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.880783 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.917681 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-965cd865f-g94wp" event={"ID":"31577253-8f3c-4675-8cce-625901cffba5","Type":"ContainerStarted","Data":"ae1095acd00bd0183f7753d98c735d0192705838b71e7dba076816b885500ebf"} Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.917726 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-965cd865f-g94wp" event={"ID":"31577253-8f3c-4675-8cce-625901cffba5","Type":"ContainerStarted","Data":"6afc64f5df76f0850f0ec551b279f39fb2c4a94fc17fbdf272a2847070113222"} Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.918939 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-965cd865f-g94wp" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.921470 4790 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v" event={"ID":"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c","Type":"ContainerStarted","Data":"5b90848ac26fe345c902dce17b5ae9fbf26f992c4afa57977903d64f18b9758d"} Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.921522 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v" event={"ID":"9a5fcb12-8983-4cb0-8bb6-9820d36a3e5c","Type":"ContainerStarted","Data":"f4e833049b4b43a9c171b54b6def87a490495aabc68954641e12c510054ff631"} Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.922615 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.928022 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn" event={"ID":"6dd076cb-ff23-40c8-803a-f7572cffb823","Type":"ContainerStarted","Data":"8c7c2e8009c1763759fb9696e4bab0181e179fb915b8f09a7435075b3575900b"} Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.928077 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn" event={"ID":"6dd076cb-ff23-40c8-803a-f7572cffb823","Type":"ContainerStarted","Data":"89b7f465d6aaac9460b3f6de493d8cf6dceae9643ab0a1ba41858e706f6cf21f"} Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.928103 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.930727 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.950504 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-965cd865f-g94wp" podStartSLOduration=51.95047752 podStartE2EDuration="51.95047752s" podCreationTimestamp="2026-04-06 12:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:01:05.947341314 +0000 UTC m=+244.935084180" watchObservedRunningTime="2026-04-06 12:01:05.95047752 +0000 UTC m=+244.938220386" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.959110 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-965cd865f-g94wp" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.961435 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn" podStartSLOduration=47.96142194 podStartE2EDuration="47.96142194s" podCreationTimestamp="2026-04-06 12:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:01:05.96068546 +0000 UTC m=+244.948428326" watchObservedRunningTime="2026-04-06 12:01:05.96142194 +0000 UTC m=+244.949164806" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.964569 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Apr 06 12:01:05 crc kubenswrapper[4790]: I0406 12:01:05.992196 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c9dfbd65f-65l9v" podStartSLOduration=47.992173596 podStartE2EDuration="47.992173596s" podCreationTimestamp="2026-04-06 12:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:01:05.986560631 +0000 
UTC m=+244.974303497" watchObservedRunningTime="2026-04-06 12:01:05.992173596 +0000 UTC m=+244.979916452" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.141571 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.161257 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.190534 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.245962 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.311357 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.364462 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.410488 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.445460 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5f97f85695-wdgjn" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.675702 4790 scope.go:117] "RemoveContainer" containerID="2b7a8ea9b6c5a4a9fba55035c4ec5353a49f058524d883e7a74b5926654d46ed" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.689111 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 
12:01:06.866194 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.891921 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.940126 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-b45778765-499ss_3851471a-4968-40c6-9be1-9ba072ddf741/etcd-operator/1.log" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.940227 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-499ss" event={"ID":"3851471a-4968-40c6-9be1-9ba072ddf741","Type":"ContainerStarted","Data":"bfd8744bf66a75aeff52fb49e084b6600134d2e84b24a99518ac12ed69bee53b"} Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.945875 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Apr 06 12:01:06 crc kubenswrapper[4790]: I0406 12:01:06.985869 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Apr 06 12:01:07 crc kubenswrapper[4790]: I0406 12:01:07.015585 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Apr 06 12:01:07 crc kubenswrapper[4790]: I0406 12:01:07.027044 4790 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 06 12:01:07 crc kubenswrapper[4790]: I0406 12:01:07.027478 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" 
containerID="cri-o://3b08142b4e69f4f0b6d28980fcaeb0481f5a5edbb98efd67a056ceeaa56cc824" gracePeriod=5 Apr 06 12:01:07 crc kubenswrapper[4790]: I0406 12:01:07.163709 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Apr 06 12:01:07 crc kubenswrapper[4790]: I0406 12:01:07.228947 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Apr 06 12:01:07 crc kubenswrapper[4790]: I0406 12:01:07.295805 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Apr 06 12:01:07 crc kubenswrapper[4790]: I0406 12:01:07.362342 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Apr 06 12:01:07 crc kubenswrapper[4790]: I0406 12:01:07.368163 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Apr 06 12:01:07 crc kubenswrapper[4790]: I0406 12:01:07.596115 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Apr 06 12:01:07 crc kubenswrapper[4790]: I0406 12:01:07.650297 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Apr 06 12:01:07 crc kubenswrapper[4790]: I0406 12:01:07.671812 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Apr 06 12:01:07 crc kubenswrapper[4790]: I0406 12:01:07.857872 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Apr 06 12:01:07 crc kubenswrapper[4790]: I0406 12:01:07.879885 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.328722 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.328973 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.329102 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.329356 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.329468 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.331946 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.332672 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.332873 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.333147 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.345593 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Apr 06 12:01:08 crc 
kubenswrapper[4790]: I0406 12:01:08.345721 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.345618 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.348520 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.351916 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.417721 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.448544 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.544410 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.670413 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.679105 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.799433 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Apr 06 12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.950150 4790 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Apr 06 
12:01:08 crc kubenswrapper[4790]: I0406 12:01:08.950422 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Apr 06 12:01:09 crc kubenswrapper[4790]: I0406 12:01:09.028215 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Apr 06 12:01:09 crc kubenswrapper[4790]: I0406 12:01:09.352285 4790 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Apr 06 12:01:09 crc kubenswrapper[4790]: I0406 12:01:09.396980 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Apr 06 12:01:09 crc kubenswrapper[4790]: I0406 12:01:09.415366 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Apr 06 12:01:09 crc kubenswrapper[4790]: I0406 12:01:09.680442 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Apr 06 12:01:09 crc kubenswrapper[4790]: I0406 12:01:09.754268 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:01:09 crc kubenswrapper[4790]: I0406 12:01:09.754400 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:01:09 crc kubenswrapper[4790]: I0406 12:01:09.781344 4790 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Apr 06 12:01:09 crc kubenswrapper[4790]: I0406 12:01:09.912291 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Apr 06 12:01:10 crc kubenswrapper[4790]: I0406 12:01:10.015265 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Apr 06 12:01:10 crc kubenswrapper[4790]: I0406 12:01:10.021560 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Apr 06 12:01:10 crc kubenswrapper[4790]: I0406 12:01:10.021655 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Apr 06 12:01:10 crc kubenswrapper[4790]: I0406 12:01:10.119275 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Apr 06 12:01:10 crc kubenswrapper[4790]: I0406 12:01:10.231253 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Apr 06 12:01:10 crc kubenswrapper[4790]: I0406 12:01:10.261061 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Apr 06 12:01:10 crc kubenswrapper[4790]: I0406 12:01:10.585959 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Apr 06 12:01:10 crc kubenswrapper[4790]: I0406 12:01:10.676012 4790 scope.go:117] "RemoveContainer" containerID="0c759347231b9d0d3fd03bdcbfcdf3ef8db30f41b34a2244d09ab10d2042cc5b" Apr 06 12:01:10 crc kubenswrapper[4790]: I0406 12:01:10.676630 4790 scope.go:117] "RemoveContainer" containerID="9419e23c42622ac01c5d40d7d22106d4a2f220c9146af998bf59f99a38ce3394" Apr 06 12:01:10 crc 
kubenswrapper[4790]: I0406 12:01:10.793612 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Apr 06 12:01:10 crc kubenswrapper[4790]: I0406 12:01:10.828199 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Apr 06 12:01:10 crc kubenswrapper[4790]: I0406 12:01:10.887136 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Apr 06 12:01:10 crc kubenswrapper[4790]: I0406 12:01:10.887151 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Apr 06 12:01:11 crc kubenswrapper[4790]: I0406 12:01:11.236994 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Apr 06 12:01:11 crc kubenswrapper[4790]: I0406 12:01:11.349600 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Apr 06 12:01:11 crc kubenswrapper[4790]: I0406 12:01:11.373726 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5fdd9b5758-s2cfq_f7b20da9-8cdd-4614-9c0a-9db7287856cd/kube-scheduler-operator-container/1.log" Apr 06 12:01:11 crc kubenswrapper[4790]: I0406 12:01:11.373868 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s2cfq" event={"ID":"f7b20da9-8cdd-4614-9c0a-9db7287856cd","Type":"ContainerStarted","Data":"2427edb39f4b46979419a1d598704a5825b060d3fe320a5c23aa1b588ac492e8"} Apr 06 12:01:11 crc kubenswrapper[4790]: I0406 12:01:11.375859 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-756b6f6bc6-jfjdg_743439e2-53c8-4cda-b960-2448c1fb2941/openshift-controller-manager-operator/1.log" Apr 06 12:01:11 crc kubenswrapper[4790]: I0406 12:01:11.375915 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jfjdg" event={"ID":"743439e2-53c8-4cda-b960-2448c1fb2941","Type":"ContainerStarted","Data":"ece88c10afbea5ee0ace821f552d628f63a719fd64db48bb931a9cd0a916546d"} Apr 06 12:01:11 crc kubenswrapper[4790]: I0406 12:01:11.420758 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Apr 06 12:01:11 crc kubenswrapper[4790]: I0406 12:01:11.691062 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Apr 06 12:01:11 crc kubenswrapper[4790]: I0406 12:01:11.752814 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.126385 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.221693 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.387692 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.387774 4790 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerID="3b08142b4e69f4f0b6d28980fcaeb0481f5a5edbb98efd67a056ceeaa56cc824" exitCode=137 Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.595744 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.595823 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.701512 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.701647 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.702959 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.703097 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.703154 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.703202 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.703307 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.703325 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.703402 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.705057 4790 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.706644 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.706675 4790 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.706701 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.713033 4790 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:01:12 crc kubenswrapper[4790]: I0406 12:01:12.808442 4790 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:01:13 crc kubenswrapper[4790]: I0406 12:01:13.399295 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Apr 06 12:01:13 crc kubenswrapper[4790]: I0406 12:01:13.399470 4790 scope.go:117] "RemoveContainer" containerID="3b08142b4e69f4f0b6d28980fcaeb0481f5a5edbb98efd67a056ceeaa56cc824" Apr 06 12:01:13 crc kubenswrapper[4790]: I0406 12:01:13.399599 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:01:13 crc kubenswrapper[4790]: I0406 12:01:13.676789 4790 scope.go:117] "RemoveContainer" containerID="1013b72c940929f2925bda0d66e264edd3a45cdf5620d4b1c5f07d0e3734cd01" Apr 06 12:01:13 crc kubenswrapper[4790]: I0406 12:01:13.700967 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Apr 06 12:01:13 crc kubenswrapper[4790]: I0406 12:01:13.713809 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:01:14 crc kubenswrapper[4790]: I0406 12:01:14.409199 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-b67b599dd-pwmq6_eb424d82-17ec-4515-903c-96156334ca08/kube-storage-version-migrator-operator/1.log" Apr 06 12:01:14 crc kubenswrapper[4790]: I0406 12:01:14.409569 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pwmq6" event={"ID":"eb424d82-17ec-4515-903c-96156334ca08","Type":"ContainerStarted","Data":"e1ec378709e65406d3b815bd84bbd98c043b25bc871f7ca61e7a180a671827ae"} Apr 06 12:01:14 crc kubenswrapper[4790]: I0406 12:01:14.676094 4790 scope.go:117] "RemoveContainer" containerID="76cbd6897462a4de9ada25c93a8031e5ac8aa105a7e457f4d31acad2f59f00e5" Apr 06 12:01:15 crc kubenswrapper[4790]: I0406 12:01:15.416893 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-796bbdcf4f-ct8gb_4d787bca-06ac-4e8b-9797-6b25ebbcc706/openshift-apiserver-operator/1.log" Apr 06 12:01:15 crc kubenswrapper[4790]: I0406 12:01:15.416970 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ct8gb" event={"ID":"4d787bca-06ac-4e8b-9797-6b25ebbcc706","Type":"ContainerStarted","Data":"f5173ce14405b1fd7c6a67d204d7b33ff0b6c4d91b1c38fbed1b17c352547c63"} Apr 06 12:01:17 crc kubenswrapper[4790]: I0406 12:01:17.675619 4790 scope.go:117] "RemoveContainer" containerID="2c266cb742c88a644af59ebc20bd72c6875453ba55650a36c227ba4f0f237d61" Apr 06 12:01:18 crc kubenswrapper[4790]: I0406 12:01:18.449747 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-78b949d7b-rnq82_fec884c6-276b-4230-a41c-3375cbc2104b/kube-controller-manager-operator/1.log" Apr 06 12:01:18 crc kubenswrapper[4790]: I0406 12:01:18.450528 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnq82" event={"ID":"fec884c6-276b-4230-a41c-3375cbc2104b","Type":"ContainerStarted","Data":"b499b65b71c5e662ed3337c068ffc496650fffbb13f4a3fae1c9cdba0dc05c6e"} Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.295306 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6pqjp"] Apr 06 12:01:30 crc kubenswrapper[4790]: E0406 12:01:30.296291 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.296304 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.296396 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.296757 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.298686 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.380567 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6pqjp\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.380627 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6pqjp\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.380704 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46fkp\" (UniqueName: \"kubernetes.io/projected/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-kube-api-access-46fkp\") pod \"cni-sysctl-allowlist-ds-6pqjp\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.380756 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-ready\") pod \"cni-sysctl-allowlist-ds-6pqjp\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.482351 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6pqjp\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.482415 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6pqjp\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.482491 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46fkp\" (UniqueName: \"kubernetes.io/projected/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-kube-api-access-46fkp\") pod \"cni-sysctl-allowlist-ds-6pqjp\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.482556 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-ready\") pod \"cni-sysctl-allowlist-ds-6pqjp\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.483035 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6pqjp\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.483100 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-ready\") pod \"cni-sysctl-allowlist-ds-6pqjp\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.483309 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6pqjp\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.501583 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46fkp\" (UniqueName: \"kubernetes.io/projected/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-kube-api-access-46fkp\") pod \"cni-sysctl-allowlist-ds-6pqjp\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:30 crc kubenswrapper[4790]: I0406 12:01:30.615699 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:31 crc kubenswrapper[4790]: I0406 12:01:31.561981 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" event={"ID":"2bf6b34e-ceaf-4882-b24f-31bf8deba16f","Type":"ContainerStarted","Data":"d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970"} Apr 06 12:01:31 crc kubenswrapper[4790]: I0406 12:01:31.562235 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" event={"ID":"2bf6b34e-ceaf-4882-b24f-31bf8deba16f","Type":"ContainerStarted","Data":"feb54089dba27a50ddeccea9ce511c49db3b48f55d8ca5ed987cdb5b55e75bcc"} Apr 06 12:01:31 crc kubenswrapper[4790]: I0406 12:01:31.562532 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:31 crc kubenswrapper[4790]: I0406 12:01:31.565761 4790 generic.go:334] "Generic (PLEG): container finished" podID="618ebfda-2b5c-4918-97d0-56a8b37dda29" containerID="9be9af31af26552cc7ce1fb89500ce58525327d0a854d30edfed0d4bab1371ba" exitCode=0 Apr 06 12:01:31 crc kubenswrapper[4790]: I0406 12:01:31.565816 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" event={"ID":"618ebfda-2b5c-4918-97d0-56a8b37dda29","Type":"ContainerDied","Data":"9be9af31af26552cc7ce1fb89500ce58525327d0a854d30edfed0d4bab1371ba"} Apr 06 12:01:31 crc kubenswrapper[4790]: I0406 12:01:31.566651 4790 scope.go:117] "RemoveContainer" containerID="9be9af31af26552cc7ce1fb89500ce58525327d0a854d30edfed0d4bab1371ba" Apr 06 12:01:31 crc kubenswrapper[4790]: I0406 12:01:31.589614 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" podStartSLOduration=1.58957666 podStartE2EDuration="1.58957666s" podCreationTimestamp="2026-04-06 12:01:30 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:01:31.582195304 +0000 UTC m=+270.569938210" watchObservedRunningTime="2026-04-06 12:01:31.58957666 +0000 UTC m=+270.577319526" Apr 06 12:01:31 crc kubenswrapper[4790]: I0406 12:01:31.599433 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:01:31 crc kubenswrapper[4790]: I0406 12:01:31.690556 4790 ???:1] "http: TLS handshake error from 192.168.126.11:45054: no serving certificate available for the kubelet" Apr 06 12:01:32 crc kubenswrapper[4790]: I0406 12:01:32.573624 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Apr 06 12:01:32 crc kubenswrapper[4790]: I0406 12:01:32.575551 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Apr 06 12:01:32 crc kubenswrapper[4790]: I0406 12:01:32.576200 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Apr 06 12:01:32 crc kubenswrapper[4790]: I0406 12:01:32.576251 4790 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="bd12f079c26d6f40b634895dba92c03b64c88b317101f7db1ade5f2ba9e5db5c" exitCode=137 Apr 06 12:01:32 crc kubenswrapper[4790]: I0406 12:01:32.576322 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"bd12f079c26d6f40b634895dba92c03b64c88b317101f7db1ade5f2ba9e5db5c"} Apr 06 12:01:32 
crc kubenswrapper[4790]: I0406 12:01:32.576356 4790 scope.go:117] "RemoveContainer" containerID="14a5d345b8c42edbe390c0b15dd0f1df9a3b8bb13468ce448c5685d9ca370f6f" Apr 06 12:01:32 crc kubenswrapper[4790]: I0406 12:01:32.581221 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" event={"ID":"618ebfda-2b5c-4918-97d0-56a8b37dda29","Type":"ContainerStarted","Data":"e9bf8a41d28e846deb678f6b1268ed8d68101d5b3168c7eb9d99429dc81ab25f"} Apr 06 12:01:32 crc kubenswrapper[4790]: I0406 12:01:32.581626 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 12:01:32 crc kubenswrapper[4790]: I0406 12:01:32.586086 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 12:01:33 crc kubenswrapper[4790]: I0406 12:01:33.590898 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Apr 06 12:01:33 crc kubenswrapper[4790]: I0406 12:01:33.592607 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Apr 06 12:01:33 crc kubenswrapper[4790]: I0406 12:01:33.592712 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"77bd60c3801a2cc2901f0358fc397c93702d7e422a74151e8a18372622c7f334"} Apr 06 12:01:33 crc kubenswrapper[4790]: I0406 12:01:33.905075 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:01:39 crc kubenswrapper[4790]: I0406 
12:01:39.753821 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:01:39 crc kubenswrapper[4790]: I0406 12:01:39.754444 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:01:39 crc kubenswrapper[4790]: I0406 12:01:39.754505 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 12:01:39 crc kubenswrapper[4790]: I0406 12:01:39.755171 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"757aef10c0682229ea06c5e71d7d5444fe84c2dcd81f14cf08fcb20202141c81"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 12:01:39 crc kubenswrapper[4790]: I0406 12:01:39.755237 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://757aef10c0682229ea06c5e71d7d5444fe84c2dcd81f14cf08fcb20202141c81" gracePeriod=600 Apr 06 12:01:40 crc kubenswrapper[4790]: I0406 12:01:40.637307 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="757aef10c0682229ea06c5e71d7d5444fe84c2dcd81f14cf08fcb20202141c81" exitCode=0 Apr 06 
12:01:40 crc kubenswrapper[4790]: I0406 12:01:40.637408 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"757aef10c0682229ea06c5e71d7d5444fe84c2dcd81f14cf08fcb20202141c81"} Apr 06 12:01:40 crc kubenswrapper[4790]: I0406 12:01:40.637682 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"f074a0dbf2caed5f314b385c985d923b4f08f77e264f1213a2b7855601973ce2"} Apr 06 12:01:42 crc kubenswrapper[4790]: I0406 12:01:42.403652 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:01:42 crc kubenswrapper[4790]: I0406 12:01:42.408100 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:01:43 crc kubenswrapper[4790]: I0406 12:01:43.913645 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.075438 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6995f97569-tdpbl"] Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.076822 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.095085 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6995f97569-tdpbl"] Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.190316 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fea0302-f405-4a4b-93f6-40ed2324f41c-registry-certificates\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.190405 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fea0302-f405-4a4b-93f6-40ed2324f41c-installation-pull-secrets\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.190565 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7w62\" (UniqueName: \"kubernetes.io/projected/0fea0302-f405-4a4b-93f6-40ed2324f41c-kube-api-access-q7w62\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.190709 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fea0302-f405-4a4b-93f6-40ed2324f41c-bound-sa-token\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " 
pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.190744 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fea0302-f405-4a4b-93f6-40ed2324f41c-registry-tls\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.190801 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fea0302-f405-4a4b-93f6-40ed2324f41c-trusted-ca\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.190978 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.191038 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fea0302-f405-4a4b-93f6-40ed2324f41c-ca-trust-extracted\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.229105 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.292797 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fea0302-f405-4a4b-93f6-40ed2324f41c-bound-sa-token\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.292888 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fea0302-f405-4a4b-93f6-40ed2324f41c-registry-tls\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.292925 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fea0302-f405-4a4b-93f6-40ed2324f41c-trusted-ca\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.292984 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fea0302-f405-4a4b-93f6-40ed2324f41c-ca-trust-extracted\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.293032 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fea0302-f405-4a4b-93f6-40ed2324f41c-registry-certificates\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.293070 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fea0302-f405-4a4b-93f6-40ed2324f41c-installation-pull-secrets\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.293098 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7w62\" (UniqueName: \"kubernetes.io/projected/0fea0302-f405-4a4b-93f6-40ed2324f41c-kube-api-access-q7w62\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.293613 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fea0302-f405-4a4b-93f6-40ed2324f41c-ca-trust-extracted\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.294632 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fea0302-f405-4a4b-93f6-40ed2324f41c-trusted-ca\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 
12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.294820 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fea0302-f405-4a4b-93f6-40ed2324f41c-registry-certificates\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.316060 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fea0302-f405-4a4b-93f6-40ed2324f41c-registry-tls\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.317672 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fea0302-f405-4a4b-93f6-40ed2324f41c-installation-pull-secrets\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.318215 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fea0302-f405-4a4b-93f6-40ed2324f41c-bound-sa-token\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.322009 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7w62\" (UniqueName: \"kubernetes.io/projected/0fea0302-f405-4a4b-93f6-40ed2324f41c-kube-api-access-q7w62\") pod \"image-registry-6995f97569-tdpbl\" (UID: \"0fea0302-f405-4a4b-93f6-40ed2324f41c\") " 
pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.398873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:50 crc kubenswrapper[4790]: I0406 12:01:50.803134 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6995f97569-tdpbl"] Apr 06 12:01:51 crc kubenswrapper[4790]: I0406 12:01:51.710118 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6995f97569-tdpbl" event={"ID":"0fea0302-f405-4a4b-93f6-40ed2324f41c","Type":"ContainerStarted","Data":"07b7a41c809a91e2ae053757609dfd879d03e0030e19d2e5587f2f18fd88bedb"} Apr 06 12:01:51 crc kubenswrapper[4790]: I0406 12:01:51.711322 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:01:51 crc kubenswrapper[4790]: I0406 12:01:51.711407 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6995f97569-tdpbl" event={"ID":"0fea0302-f405-4a4b-93f6-40ed2324f41c","Type":"ContainerStarted","Data":"4d25b7dd744bcfdd5608c8f2ab4d9082033185b6ce1dd13a850871dee23bbe44"} Apr 06 12:01:51 crc kubenswrapper[4790]: I0406 12:01:51.731427 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6995f97569-tdpbl" podStartSLOduration=1.731405171 podStartE2EDuration="1.731405171s" podCreationTimestamp="2026-04-06 12:01:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:01:51.730402503 +0000 UTC m=+290.718145369" watchObservedRunningTime="2026-04-06 12:01:51.731405171 +0000 UTC m=+290.719148027" Apr 06 12:01:53 crc kubenswrapper[4790]: I0406 12:01:53.317025 4790 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6pqjp"] Apr 06 12:01:53 crc kubenswrapper[4790]: I0406 12:01:53.318224 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" podUID="2bf6b34e-ceaf-4882-b24f-31bf8deba16f" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970" gracePeriod=30 Apr 06 12:02:00 crc kubenswrapper[4790]: I0406 12:02:00.194275 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591282-x9mjb"] Apr 06 12:02:00 crc kubenswrapper[4790]: I0406 12:02:00.195914 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591282-x9mjb" Apr 06 12:02:00 crc kubenswrapper[4790]: I0406 12:02:00.203687 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:02:00 crc kubenswrapper[4790]: I0406 12:02:00.203735 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:02:00 crc kubenswrapper[4790]: I0406 12:02:00.203891 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:02:00 crc kubenswrapper[4790]: I0406 12:02:00.213395 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591282-x9mjb"] Apr 06 12:02:00 crc kubenswrapper[4790]: I0406 12:02:00.234034 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqsp6\" (UniqueName: \"kubernetes.io/projected/799ced2c-8ccc-4ec8-85c2-c92617b654e8-kube-api-access-kqsp6\") pod \"auto-csr-approver-29591282-x9mjb\" (UID: \"799ced2c-8ccc-4ec8-85c2-c92617b654e8\") " pod="openshift-infra/auto-csr-approver-29591282-x9mjb" Apr 06 12:02:00 crc 
kubenswrapper[4790]: I0406 12:02:00.336013 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqsp6\" (UniqueName: \"kubernetes.io/projected/799ced2c-8ccc-4ec8-85c2-c92617b654e8-kube-api-access-kqsp6\") pod \"auto-csr-approver-29591282-x9mjb\" (UID: \"799ced2c-8ccc-4ec8-85c2-c92617b654e8\") " pod="openshift-infra/auto-csr-approver-29591282-x9mjb" Apr 06 12:02:00 crc kubenswrapper[4790]: I0406 12:02:00.364041 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqsp6\" (UniqueName: \"kubernetes.io/projected/799ced2c-8ccc-4ec8-85c2-c92617b654e8-kube-api-access-kqsp6\") pod \"auto-csr-approver-29591282-x9mjb\" (UID: \"799ced2c-8ccc-4ec8-85c2-c92617b654e8\") " pod="openshift-infra/auto-csr-approver-29591282-x9mjb" Apr 06 12:02:00 crc kubenswrapper[4790]: I0406 12:02:00.512259 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591282-x9mjb" Apr 06 12:02:00 crc kubenswrapper[4790]: E0406 12:02:00.621803 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 12:02:00 crc kubenswrapper[4790]: E0406 12:02:00.625164 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 12:02:00 crc kubenswrapper[4790]: E0406 12:02:00.626320 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" containerID="d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 12:02:00 crc kubenswrapper[4790]: E0406 12:02:00.626357 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" podUID="2bf6b34e-ceaf-4882-b24f-31bf8deba16f" containerName="kube-multus-additional-cni-plugins" Apr 06 12:02:00 crc kubenswrapper[4790]: I0406 12:02:00.719308 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591282-x9mjb"] Apr 06 12:02:00 crc kubenswrapper[4790]: W0406 12:02:00.724075 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod799ced2c_8ccc_4ec8_85c2_c92617b654e8.slice/crio-c0e56d9021869ae8459899df094cace91dc7e1bfb9e92593537c5a19ecdf46a4 WatchSource:0}: Error finding container c0e56d9021869ae8459899df094cace91dc7e1bfb9e92593537c5a19ecdf46a4: Status 404 returned error can't find the container with id c0e56d9021869ae8459899df094cace91dc7e1bfb9e92593537c5a19ecdf46a4 Apr 06 12:02:00 crc kubenswrapper[4790]: I0406 12:02:00.774809 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591282-x9mjb" event={"ID":"799ced2c-8ccc-4ec8-85c2-c92617b654e8","Type":"ContainerStarted","Data":"c0e56d9021869ae8459899df094cace91dc7e1bfb9e92593537c5a19ecdf46a4"} Apr 06 12:02:02 crc kubenswrapper[4790]: I0406 12:02:02.227934 4790 csr.go:261] certificate signing request csr-ss2j6 is approved, waiting to be issued Apr 06 12:02:02 crc kubenswrapper[4790]: I0406 12:02:02.247769 4790 csr.go:257] certificate signing request csr-ss2j6 is issued Apr 06 12:02:02 crc kubenswrapper[4790]: I0406 12:02:02.786142 4790 generic.go:334] "Generic (PLEG): container 
finished" podID="799ced2c-8ccc-4ec8-85c2-c92617b654e8" containerID="b89b77d040bf23625e045615f350e25ce436be016e21d565b40d53681aa9bc46" exitCode=0 Apr 06 12:02:02 crc kubenswrapper[4790]: I0406 12:02:02.786407 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591282-x9mjb" event={"ID":"799ced2c-8ccc-4ec8-85c2-c92617b654e8","Type":"ContainerDied","Data":"b89b77d040bf23625e045615f350e25ce436be016e21d565b40d53681aa9bc46"} Apr 06 12:02:03 crc kubenswrapper[4790]: I0406 12:02:03.248742 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-05 21:32:21.183249177 +0000 UTC Apr 06 12:02:03 crc kubenswrapper[4790]: I0406 12:02:03.248810 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5841h30m17.934445223s for next certificate rotation Apr 06 12:02:04 crc kubenswrapper[4790]: I0406 12:02:04.118820 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591282-x9mjb" Apr 06 12:02:04 crc kubenswrapper[4790]: I0406 12:02:04.195038 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqsp6\" (UniqueName: \"kubernetes.io/projected/799ced2c-8ccc-4ec8-85c2-c92617b654e8-kube-api-access-kqsp6\") pod \"799ced2c-8ccc-4ec8-85c2-c92617b654e8\" (UID: \"799ced2c-8ccc-4ec8-85c2-c92617b654e8\") " Apr 06 12:02:04 crc kubenswrapper[4790]: I0406 12:02:04.202717 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799ced2c-8ccc-4ec8-85c2-c92617b654e8-kube-api-access-kqsp6" (OuterVolumeSpecName: "kube-api-access-kqsp6") pod "799ced2c-8ccc-4ec8-85c2-c92617b654e8" (UID: "799ced2c-8ccc-4ec8-85c2-c92617b654e8"). InnerVolumeSpecName "kube-api-access-kqsp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:02:04 crc kubenswrapper[4790]: I0406 12:02:04.249401 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-11 20:27:46.152606257 +0000 UTC Apr 06 12:02:04 crc kubenswrapper[4790]: I0406 12:02:04.249439 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6728h25m41.90316998s for next certificate rotation Apr 06 12:02:04 crc kubenswrapper[4790]: I0406 12:02:04.300852 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqsp6\" (UniqueName: \"kubernetes.io/projected/799ced2c-8ccc-4ec8-85c2-c92617b654e8-kube-api-access-kqsp6\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:04 crc kubenswrapper[4790]: I0406 12:02:04.802390 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591282-x9mjb" event={"ID":"799ced2c-8ccc-4ec8-85c2-c92617b654e8","Type":"ContainerDied","Data":"c0e56d9021869ae8459899df094cace91dc7e1bfb9e92593537c5a19ecdf46a4"} Apr 06 12:02:04 crc kubenswrapper[4790]: I0406 12:02:04.802431 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0e56d9021869ae8459899df094cace91dc7e1bfb9e92593537c5a19ecdf46a4" Apr 06 12:02:04 crc kubenswrapper[4790]: I0406 12:02:04.802483 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591282-x9mjb" Apr 06 12:02:10 crc kubenswrapper[4790]: I0406 12:02:10.405729 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6995f97569-tdpbl" Apr 06 12:02:10 crc kubenswrapper[4790]: I0406 12:02:10.471713 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-686fc65c-fdzvb"] Apr 06 12:02:10 crc kubenswrapper[4790]: E0406 12:02:10.618190 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 12:02:10 crc kubenswrapper[4790]: E0406 12:02:10.620289 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 12:02:10 crc kubenswrapper[4790]: E0406 12:02:10.621466 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 12:02:10 crc kubenswrapper[4790]: E0406 12:02:10.621501 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" podUID="2bf6b34e-ceaf-4882-b24f-31bf8deba16f" 
containerName="kube-multus-additional-cni-plugins" Apr 06 12:02:20 crc kubenswrapper[4790]: E0406 12:02:20.618852 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 12:02:20 crc kubenswrapper[4790]: E0406 12:02:20.625317 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 12:02:20 crc kubenswrapper[4790]: E0406 12:02:20.627480 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 06 12:02:20 crc kubenswrapper[4790]: E0406 12:02:20.627572 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" podUID="2bf6b34e-ceaf-4882-b24f-31bf8deba16f" containerName="kube-multus-additional-cni-plugins" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.431691 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-6pqjp_2bf6b34e-ceaf-4882-b24f-31bf8deba16f/kube-multus-additional-cni-plugins/0.log" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.432191 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.518597 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-cni-sysctl-allowlist\") pod \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.518674 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-tuning-conf-dir\") pod \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.518768 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46fkp\" (UniqueName: \"kubernetes.io/projected/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-kube-api-access-46fkp\") pod \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.518817 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-ready\") pod \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\" (UID: \"2bf6b34e-ceaf-4882-b24f-31bf8deba16f\") " Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.518977 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "2bf6b34e-ceaf-4882-b24f-31bf8deba16f" (UID: "2bf6b34e-ceaf-4882-b24f-31bf8deba16f"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.519428 4790 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.520673 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "2bf6b34e-ceaf-4882-b24f-31bf8deba16f" (UID: "2bf6b34e-ceaf-4882-b24f-31bf8deba16f"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.521051 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-ready" (OuterVolumeSpecName: "ready") pod "2bf6b34e-ceaf-4882-b24f-31bf8deba16f" (UID: "2bf6b34e-ceaf-4882-b24f-31bf8deba16f"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.531215 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-kube-api-access-46fkp" (OuterVolumeSpecName: "kube-api-access-46fkp") pod "2bf6b34e-ceaf-4882-b24f-31bf8deba16f" (UID: "2bf6b34e-ceaf-4882-b24f-31bf8deba16f"). InnerVolumeSpecName "kube-api-access-46fkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.622193 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46fkp\" (UniqueName: \"kubernetes.io/projected/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-kube-api-access-46fkp\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.622888 4790 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-ready\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.622911 4790 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2bf6b34e-ceaf-4882-b24f-31bf8deba16f-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.938996 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-6pqjp_2bf6b34e-ceaf-4882-b24f-31bf8deba16f/kube-multus-additional-cni-plugins/0.log" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.939654 4790 generic.go:334] "Generic (PLEG): container finished" podID="2bf6b34e-ceaf-4882-b24f-31bf8deba16f" containerID="d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970" exitCode=137 Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.939728 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" event={"ID":"2bf6b34e-ceaf-4882-b24f-31bf8deba16f","Type":"ContainerDied","Data":"d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970"} Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.939812 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" 
event={"ID":"2bf6b34e-ceaf-4882-b24f-31bf8deba16f","Type":"ContainerDied","Data":"feb54089dba27a50ddeccea9ce511c49db3b48f55d8ca5ed987cdb5b55e75bcc"} Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.939889 4790 scope.go:117] "RemoveContainer" containerID="d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.939751 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6pqjp" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.971456 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6pqjp"] Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.983295 4790 scope.go:117] "RemoveContainer" containerID="d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.984538 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6pqjp"] Apr 06 12:02:23 crc kubenswrapper[4790]: E0406 12:02:23.984646 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970\": container with ID starting with d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970 not found: ID does not exist" containerID="d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970" Apr 06 12:02:23 crc kubenswrapper[4790]: I0406 12:02:23.984704 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970"} err="failed to get container status \"d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970\": rpc error: code = NotFound desc = could not find container \"d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970\": container with ID 
starting with d76e186ae2ae79f884d271bbc7f2e7b91ce38dad1af20c1e38b791b6e60bf970 not found: ID does not exist" Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.452056 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-10-crc"] Apr 06 12:02:25 crc kubenswrapper[4790]: E0406 12:02:25.452863 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="799ced2c-8ccc-4ec8-85c2-c92617b654e8" containerName="oc" Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.452894 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="799ced2c-8ccc-4ec8-85c2-c92617b654e8" containerName="oc" Apr 06 12:02:25 crc kubenswrapper[4790]: E0406 12:02:25.452928 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf6b34e-ceaf-4882-b24f-31bf8deba16f" containerName="kube-multus-additional-cni-plugins" Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.452942 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf6b34e-ceaf-4882-b24f-31bf8deba16f" containerName="kube-multus-additional-cni-plugins" Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.453133 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bf6b34e-ceaf-4882-b24f-31bf8deba16f" containerName="kube-multus-additional-cni-plugins" Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.453160 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="799ced2c-8ccc-4ec8-85c2-c92617b654e8" containerName="oc" Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.453942 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-10-crc" Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.456344 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.459249 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.461394 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-10-crc"] Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.656823 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2a4e888-053e-473c-a5d2-5aa4262f30df-kube-api-access\") pod \"revision-pruner-10-crc\" (UID: \"b2a4e888-053e-473c-a5d2-5aa4262f30df\") " pod="openshift-kube-apiserver/revision-pruner-10-crc" Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.657033 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2a4e888-053e-473c-a5d2-5aa4262f30df-kubelet-dir\") pod \"revision-pruner-10-crc\" (UID: \"b2a4e888-053e-473c-a5d2-5aa4262f30df\") " pod="openshift-kube-apiserver/revision-pruner-10-crc" Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.683221 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bf6b34e-ceaf-4882-b24f-31bf8deba16f" path="/var/lib/kubelet/pods/2bf6b34e-ceaf-4882-b24f-31bf8deba16f/volumes" Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.764814 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2a4e888-053e-473c-a5d2-5aa4262f30df-kube-api-access\") pod \"revision-pruner-10-crc\" (UID: 
\"b2a4e888-053e-473c-a5d2-5aa4262f30df\") " pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.765454 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2a4e888-053e-473c-a5d2-5aa4262f30df-kubelet-dir\") pod \"revision-pruner-10-crc\" (UID: \"b2a4e888-053e-473c-a5d2-5aa4262f30df\") " pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.766343 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2a4e888-053e-473c-a5d2-5aa4262f30df-kubelet-dir\") pod \"revision-pruner-10-crc\" (UID: \"b2a4e888-053e-473c-a5d2-5aa4262f30df\") " pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.786866 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2a4e888-053e-473c-a5d2-5aa4262f30df-kube-api-access\") pod \"revision-pruner-10-crc\" (UID: \"b2a4e888-053e-473c-a5d2-5aa4262f30df\") " pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 06 12:02:25 crc kubenswrapper[4790]: I0406 12:02:25.794657 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 06 12:02:26 crc kubenswrapper[4790]: I0406 12:02:26.605792 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-10-crc"]
Apr 06 12:02:26 crc kubenswrapper[4790]: I0406 12:02:26.961112 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-10-crc" event={"ID":"b2a4e888-053e-473c-a5d2-5aa4262f30df","Type":"ContainerStarted","Data":"3d8fbb87c4c38d07df6f46c6f83b8931b7a728be6d0a962a3713b6165f775b54"}
Apr 06 12:02:27 crc kubenswrapper[4790]: I0406 12:02:27.969732 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-10-crc" event={"ID":"b2a4e888-053e-473c-a5d2-5aa4262f30df","Type":"ContainerStarted","Data":"1547e4605394dc3c1b69a75f089cffcadef996e9e643e2ab67ea794f0619e44a"}
Apr 06 12:02:28 crc kubenswrapper[4790]: I0406 12:02:28.980779 4790 generic.go:334] "Generic (PLEG): container finished" podID="b2a4e888-053e-473c-a5d2-5aa4262f30df" containerID="1547e4605394dc3c1b69a75f089cffcadef996e9e643e2ab67ea794f0619e44a" exitCode=0
Apr 06 12:02:28 crc kubenswrapper[4790]: I0406 12:02:28.980869 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-10-crc" event={"ID":"b2a4e888-053e-473c-a5d2-5aa4262f30df","Type":"ContainerDied","Data":"1547e4605394dc3c1b69a75f089cffcadef996e9e643e2ab67ea794f0619e44a"}
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.210719 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.336517 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2a4e888-053e-473c-a5d2-5aa4262f30df-kubelet-dir\") pod \"b2a4e888-053e-473c-a5d2-5aa4262f30df\" (UID: \"b2a4e888-053e-473c-a5d2-5aa4262f30df\") "
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.336607 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2a4e888-053e-473c-a5d2-5aa4262f30df-kube-api-access\") pod \"b2a4e888-053e-473c-a5d2-5aa4262f30df\" (UID: \"b2a4e888-053e-473c-a5d2-5aa4262f30df\") "
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.336764 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2a4e888-053e-473c-a5d2-5aa4262f30df-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b2a4e888-053e-473c-a5d2-5aa4262f30df" (UID: "b2a4e888-053e-473c-a5d2-5aa4262f30df"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.336949 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2a4e888-053e-473c-a5d2-5aa4262f30df-kubelet-dir\") on node \"crc\" DevicePath \"\""
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.343456 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2a4e888-053e-473c-a5d2-5aa4262f30df-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b2a4e888-053e-473c-a5d2-5aa4262f30df" (UID: "b2a4e888-053e-473c-a5d2-5aa4262f30df"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.438562 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2a4e888-053e-473c-a5d2-5aa4262f30df-kube-api-access\") on node \"crc\" DevicePath \"\""
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.548707 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-msclk"]
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.549103 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-msclk" podUID="b9685a39-63cf-47d3-b5fe-9113d55676d4" containerName="registry-server" containerID="cri-o://5718550e527619db215cdf193c8018a764b083c9e4d15ee2811973e31b03b77b" gracePeriod=30
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.566226 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-96trx"]
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.566773 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-96trx" podUID="311eb251-79b6-4e1e-a3a7-456322ca133e" containerName="registry-server" containerID="cri-o://59e6e1b342e3291b99e433a55648dcd953aff92372a6ae30f2fe506ccbf458dd" gracePeriod=30
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.582937 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gzwvz"]
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.583517 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" podUID="618ebfda-2b5c-4918-97d0-56a8b37dda29" containerName="marketplace-operator" containerID="cri-o://e9bf8a41d28e846deb678f6b1268ed8d68101d5b3168c7eb9d99429dc81ab25f" gracePeriod=30
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.593599 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-42rgx"]
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.593919 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-42rgx" podUID="a30678d8-35eb-4863-a856-096864c2a9b1" containerName="registry-server" containerID="cri-o://bc22db92c2c9486259f5aa31f52f267884a89288ee7f94157927a276a6294c46" gracePeriod=30
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.605903 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vm6jd"]
Apr 06 12:02:30 crc kubenswrapper[4790]: E0406 12:02:30.606418 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2a4e888-053e-473c-a5d2-5aa4262f30df" containerName="pruner"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.606448 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2a4e888-053e-473c-a5d2-5aa4262f30df" containerName="pruner"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.606698 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2a4e888-053e-473c-a5d2-5aa4262f30df" containerName="pruner"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.607560 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.619018 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mlwwf"]
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.619375 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mlwwf" podUID="c4be7580-8cec-4726-940d-36fb8575b791" containerName="registry-server" containerID="cri-o://e2f5a41a544a9b4e9d9db199769aa5a6166b001c0e25458aaeb72a312dc09673" gracePeriod=30
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.634674 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vm6jd"]
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.742957 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wbkq\" (UniqueName: \"kubernetes.io/projected/bcbcee2e-3daf-4238-ac27-16f663c8b184-kube-api-access-8wbkq\") pod \"marketplace-operator-79b997595-vm6jd\" (UID: \"bcbcee2e-3daf-4238-ac27-16f663c8b184\") " pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.743029 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bcbcee2e-3daf-4238-ac27-16f663c8b184-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vm6jd\" (UID: \"bcbcee2e-3daf-4238-ac27-16f663c8b184\") " pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.743107 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bcbcee2e-3daf-4238-ac27-16f663c8b184-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vm6jd\" (UID: \"bcbcee2e-3daf-4238-ac27-16f663c8b184\") " pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.844174 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wbkq\" (UniqueName: \"kubernetes.io/projected/bcbcee2e-3daf-4238-ac27-16f663c8b184-kube-api-access-8wbkq\") pod \"marketplace-operator-79b997595-vm6jd\" (UID: \"bcbcee2e-3daf-4238-ac27-16f663c8b184\") " pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.844263 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bcbcee2e-3daf-4238-ac27-16f663c8b184-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vm6jd\" (UID: \"bcbcee2e-3daf-4238-ac27-16f663c8b184\") " pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.844313 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bcbcee2e-3daf-4238-ac27-16f663c8b184-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vm6jd\" (UID: \"bcbcee2e-3daf-4238-ac27-16f663c8b184\") " pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.846199 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bcbcee2e-3daf-4238-ac27-16f663c8b184-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vm6jd\" (UID: \"bcbcee2e-3daf-4238-ac27-16f663c8b184\") " pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.850277 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bcbcee2e-3daf-4238-ac27-16f663c8b184-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vm6jd\" (UID: \"bcbcee2e-3daf-4238-ac27-16f663c8b184\") " pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.865958 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wbkq\" (UniqueName: \"kubernetes.io/projected/bcbcee2e-3daf-4238-ac27-16f663c8b184-kube-api-access-8wbkq\") pod \"marketplace-operator-79b997595-vm6jd\" (UID: \"bcbcee2e-3daf-4238-ac27-16f663c8b184\") " pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.971980 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.987456 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-msclk"
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.995721 4790 generic.go:334] "Generic (PLEG): container finished" podID="a30678d8-35eb-4863-a856-096864c2a9b1" containerID="bc22db92c2c9486259f5aa31f52f267884a89288ee7f94157927a276a6294c46" exitCode=0
Apr 06 12:02:30 crc kubenswrapper[4790]: I0406 12:02:30.995777 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42rgx" event={"ID":"a30678d8-35eb-4863-a856-096864c2a9b1","Type":"ContainerDied","Data":"bc22db92c2c9486259f5aa31f52f267884a89288ee7f94157927a276a6294c46"}
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.000065 4790 generic.go:334] "Generic (PLEG): container finished" podID="618ebfda-2b5c-4918-97d0-56a8b37dda29" containerID="e9bf8a41d28e846deb678f6b1268ed8d68101d5b3168c7eb9d99429dc81ab25f" exitCode=0
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.000107 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" event={"ID":"618ebfda-2b5c-4918-97d0-56a8b37dda29","Type":"ContainerDied","Data":"e9bf8a41d28e846deb678f6b1268ed8d68101d5b3168c7eb9d99429dc81ab25f"}
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.000130 4790 scope.go:117] "RemoveContainer" containerID="9be9af31af26552cc7ce1fb89500ce58525327d0a854d30edfed0d4bab1371ba"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.016077 4790 generic.go:334] "Generic (PLEG): container finished" podID="b9685a39-63cf-47d3-b5fe-9113d55676d4" containerID="5718550e527619db215cdf193c8018a764b083c9e4d15ee2811973e31b03b77b" exitCode=0
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.016130 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-msclk"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.020866 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-msclk" event={"ID":"b9685a39-63cf-47d3-b5fe-9113d55676d4","Type":"ContainerDied","Data":"5718550e527619db215cdf193c8018a764b083c9e4d15ee2811973e31b03b77b"}
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.020963 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-msclk" event={"ID":"b9685a39-63cf-47d3-b5fe-9113d55676d4","Type":"ContainerDied","Data":"fcf639a4b4c5e6664b7789ea1435541473ab7464b764b33cb3764a8c60b6cbec"}
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.033142 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-10-crc" event={"ID":"b2a4e888-053e-473c-a5d2-5aa4262f30df","Type":"ContainerDied","Data":"3d8fbb87c4c38d07df6f46c6f83b8931b7a728be6d0a962a3713b6165f775b54"}
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.033203 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d8fbb87c4c38d07df6f46c6f83b8931b7a728be6d0a962a3713b6165f775b54"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.033462 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-10-crc"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.039607 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-96trx"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.066050 4790 scope.go:117] "RemoveContainer" containerID="5718550e527619db215cdf193c8018a764b083c9e4d15ee2811973e31b03b77b"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.066165 4790 generic.go:334] "Generic (PLEG): container finished" podID="311eb251-79b6-4e1e-a3a7-456322ca133e" containerID="59e6e1b342e3291b99e433a55648dcd953aff92372a6ae30f2fe506ccbf458dd" exitCode=0
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.066481 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96trx" event={"ID":"311eb251-79b6-4e1e-a3a7-456322ca133e","Type":"ContainerDied","Data":"59e6e1b342e3291b99e433a55648dcd953aff92372a6ae30f2fe506ccbf458dd"}
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.068093 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.075474 4790 generic.go:334] "Generic (PLEG): container finished" podID="c4be7580-8cec-4726-940d-36fb8575b791" containerID="e2f5a41a544a9b4e9d9db199769aa5a6166b001c0e25458aaeb72a312dc09673" exitCode=0
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.075570 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlwwf" event={"ID":"c4be7580-8cec-4726-940d-36fb8575b791","Type":"ContainerDied","Data":"e2f5a41a544a9b4e9d9db199769aa5a6166b001c0e25458aaeb72a312dc09673"}
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.088849 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mlwwf"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.106951 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42rgx"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.144798 4790 scope.go:117] "RemoveContainer" containerID="eb978cc6bc38bc43b3049f35c926fd11f6e6535716b0dc816b4bc05027784863"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.156048 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdm8v\" (UniqueName: \"kubernetes.io/projected/b9685a39-63cf-47d3-b5fe-9113d55676d4-kube-api-access-zdm8v\") pod \"b9685a39-63cf-47d3-b5fe-9113d55676d4\" (UID: \"b9685a39-63cf-47d3-b5fe-9113d55676d4\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.156117 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9685a39-63cf-47d3-b5fe-9113d55676d4-catalog-content\") pod \"b9685a39-63cf-47d3-b5fe-9113d55676d4\" (UID: \"b9685a39-63cf-47d3-b5fe-9113d55676d4\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.156222 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/618ebfda-2b5c-4918-97d0-56a8b37dda29-marketplace-trusted-ca\") pod \"618ebfda-2b5c-4918-97d0-56a8b37dda29\" (UID: \"618ebfda-2b5c-4918-97d0-56a8b37dda29\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.156251 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcwbn\" (UniqueName: \"kubernetes.io/projected/311eb251-79b6-4e1e-a3a7-456322ca133e-kube-api-access-pcwbn\") pod \"311eb251-79b6-4e1e-a3a7-456322ca133e\" (UID: \"311eb251-79b6-4e1e-a3a7-456322ca133e\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.156306 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/618ebfda-2b5c-4918-97d0-56a8b37dda29-marketplace-operator-metrics\") pod \"618ebfda-2b5c-4918-97d0-56a8b37dda29\" (UID: \"618ebfda-2b5c-4918-97d0-56a8b37dda29\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.156385 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9685a39-63cf-47d3-b5fe-9113d55676d4-utilities\") pod \"b9685a39-63cf-47d3-b5fe-9113d55676d4\" (UID: \"b9685a39-63cf-47d3-b5fe-9113d55676d4\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.156420 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/311eb251-79b6-4e1e-a3a7-456322ca133e-catalog-content\") pod \"311eb251-79b6-4e1e-a3a7-456322ca133e\" (UID: \"311eb251-79b6-4e1e-a3a7-456322ca133e\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.156492 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75988\" (UniqueName: \"kubernetes.io/projected/618ebfda-2b5c-4918-97d0-56a8b37dda29-kube-api-access-75988\") pod \"618ebfda-2b5c-4918-97d0-56a8b37dda29\" (UID: \"618ebfda-2b5c-4918-97d0-56a8b37dda29\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.156524 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/311eb251-79b6-4e1e-a3a7-456322ca133e-utilities\") pod \"311eb251-79b6-4e1e-a3a7-456322ca133e\" (UID: \"311eb251-79b6-4e1e-a3a7-456322ca133e\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.157273 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/618ebfda-2b5c-4918-97d0-56a8b37dda29-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "618ebfda-2b5c-4918-97d0-56a8b37dda29" (UID: "618ebfda-2b5c-4918-97d0-56a8b37dda29"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.159705 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/311eb251-79b6-4e1e-a3a7-456322ca133e-utilities" (OuterVolumeSpecName: "utilities") pod "311eb251-79b6-4e1e-a3a7-456322ca133e" (UID: "311eb251-79b6-4e1e-a3a7-456322ca133e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.159591 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9685a39-63cf-47d3-b5fe-9113d55676d4-utilities" (OuterVolumeSpecName: "utilities") pod "b9685a39-63cf-47d3-b5fe-9113d55676d4" (UID: "b9685a39-63cf-47d3-b5fe-9113d55676d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.162239 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9685a39-63cf-47d3-b5fe-9113d55676d4-utilities\") on node \"crc\" DevicePath \"\""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.163215 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/311eb251-79b6-4e1e-a3a7-456322ca133e-kube-api-access-pcwbn" (OuterVolumeSpecName: "kube-api-access-pcwbn") pod "311eb251-79b6-4e1e-a3a7-456322ca133e" (UID: "311eb251-79b6-4e1e-a3a7-456322ca133e"). InnerVolumeSpecName "kube-api-access-pcwbn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.163341 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/311eb251-79b6-4e1e-a3a7-456322ca133e-utilities\") on node \"crc\" DevicePath \"\""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.163376 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/618ebfda-2b5c-4918-97d0-56a8b37dda29-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.163539 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9685a39-63cf-47d3-b5fe-9113d55676d4-kube-api-access-zdm8v" (OuterVolumeSpecName: "kube-api-access-zdm8v") pod "b9685a39-63cf-47d3-b5fe-9113d55676d4" (UID: "b9685a39-63cf-47d3-b5fe-9113d55676d4"). InnerVolumeSpecName "kube-api-access-zdm8v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.164472 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/618ebfda-2b5c-4918-97d0-56a8b37dda29-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "618ebfda-2b5c-4918-97d0-56a8b37dda29" (UID: "618ebfda-2b5c-4918-97d0-56a8b37dda29"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.166701 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/618ebfda-2b5c-4918-97d0-56a8b37dda29-kube-api-access-75988" (OuterVolumeSpecName: "kube-api-access-75988") pod "618ebfda-2b5c-4918-97d0-56a8b37dda29" (UID: "618ebfda-2b5c-4918-97d0-56a8b37dda29"). InnerVolumeSpecName "kube-api-access-75988". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.215358 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/311eb251-79b6-4e1e-a3a7-456322ca133e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "311eb251-79b6-4e1e-a3a7-456322ca133e" (UID: "311eb251-79b6-4e1e-a3a7-456322ca133e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.228065 4790 scope.go:117] "RemoveContainer" containerID="c37428c3574426fa6b0491a4889235fd1cac525bf01b619543194d5ac9179d1b"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.228185 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9685a39-63cf-47d3-b5fe-9113d55676d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9685a39-63cf-47d3-b5fe-9113d55676d4" (UID: "b9685a39-63cf-47d3-b5fe-9113d55676d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.249574 4790 scope.go:117] "RemoveContainer" containerID="5718550e527619db215cdf193c8018a764b083c9e4d15ee2811973e31b03b77b"
Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.252433 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5718550e527619db215cdf193c8018a764b083c9e4d15ee2811973e31b03b77b\": container with ID starting with 5718550e527619db215cdf193c8018a764b083c9e4d15ee2811973e31b03b77b not found: ID does not exist" containerID="5718550e527619db215cdf193c8018a764b083c9e4d15ee2811973e31b03b77b"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.252487 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5718550e527619db215cdf193c8018a764b083c9e4d15ee2811973e31b03b77b"} err="failed to get container status \"5718550e527619db215cdf193c8018a764b083c9e4d15ee2811973e31b03b77b\": rpc error: code = NotFound desc = could not find container \"5718550e527619db215cdf193c8018a764b083c9e4d15ee2811973e31b03b77b\": container with ID starting with 5718550e527619db215cdf193c8018a764b083c9e4d15ee2811973e31b03b77b not found: ID does not exist"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.252517 4790 scope.go:117] "RemoveContainer" containerID="eb978cc6bc38bc43b3049f35c926fd11f6e6535716b0dc816b4bc05027784863"
Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.253118 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb978cc6bc38bc43b3049f35c926fd11f6e6535716b0dc816b4bc05027784863\": container with ID starting with eb978cc6bc38bc43b3049f35c926fd11f6e6535716b0dc816b4bc05027784863 not found: ID does not exist" containerID="eb978cc6bc38bc43b3049f35c926fd11f6e6535716b0dc816b4bc05027784863"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.253146 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb978cc6bc38bc43b3049f35c926fd11f6e6535716b0dc816b4bc05027784863"} err="failed to get container status \"eb978cc6bc38bc43b3049f35c926fd11f6e6535716b0dc816b4bc05027784863\": rpc error: code = NotFound desc = could not find container \"eb978cc6bc38bc43b3049f35c926fd11f6e6535716b0dc816b4bc05027784863\": container with ID starting with eb978cc6bc38bc43b3049f35c926fd11f6e6535716b0dc816b4bc05027784863 not found: ID does not exist"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.253161 4790 scope.go:117] "RemoveContainer" containerID="c37428c3574426fa6b0491a4889235fd1cac525bf01b619543194d5ac9179d1b"
Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.253505 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c37428c3574426fa6b0491a4889235fd1cac525bf01b619543194d5ac9179d1b\": container with ID starting with c37428c3574426fa6b0491a4889235fd1cac525bf01b619543194d5ac9179d1b not found: ID does not exist" containerID="c37428c3574426fa6b0491a4889235fd1cac525bf01b619543194d5ac9179d1b"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.253520 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37428c3574426fa6b0491a4889235fd1cac525bf01b619543194d5ac9179d1b"} err="failed to get container status \"c37428c3574426fa6b0491a4889235fd1cac525bf01b619543194d5ac9179d1b\": rpc error: code = NotFound desc = could not find container \"c37428c3574426fa6b0491a4889235fd1cac525bf01b619543194d5ac9179d1b\": container with ID starting with c37428c3574426fa6b0491a4889235fd1cac525bf01b619543194d5ac9179d1b not found: ID does not exist"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.253531 4790 scope.go:117] "RemoveContainer" containerID="59e6e1b342e3291b99e433a55648dcd953aff92372a6ae30f2fe506ccbf458dd"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.264335 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5ndv\" (UniqueName: \"kubernetes.io/projected/a30678d8-35eb-4863-a856-096864c2a9b1-kube-api-access-h5ndv\") pod \"a30678d8-35eb-4863-a856-096864c2a9b1\" (UID: \"a30678d8-35eb-4863-a856-096864c2a9b1\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.264420 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4be7580-8cec-4726-940d-36fb8575b791-catalog-content\") pod \"c4be7580-8cec-4726-940d-36fb8575b791\" (UID: \"c4be7580-8cec-4726-940d-36fb8575b791\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.264457 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a30678d8-35eb-4863-a856-096864c2a9b1-utilities\") pod \"a30678d8-35eb-4863-a856-096864c2a9b1\" (UID: \"a30678d8-35eb-4863-a856-096864c2a9b1\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.264515 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9zzq\" (UniqueName: \"kubernetes.io/projected/c4be7580-8cec-4726-940d-36fb8575b791-kube-api-access-k9zzq\") pod \"c4be7580-8cec-4726-940d-36fb8575b791\" (UID: \"c4be7580-8cec-4726-940d-36fb8575b791\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.264543 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a30678d8-35eb-4863-a856-096864c2a9b1-catalog-content\") pod \"a30678d8-35eb-4863-a856-096864c2a9b1\" (UID: \"a30678d8-35eb-4863-a856-096864c2a9b1\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.264616 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4be7580-8cec-4726-940d-36fb8575b791-utilities\") pod \"c4be7580-8cec-4726-940d-36fb8575b791\" (UID: \"c4be7580-8cec-4726-940d-36fb8575b791\") "
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.264969 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcwbn\" (UniqueName: \"kubernetes.io/projected/311eb251-79b6-4e1e-a3a7-456322ca133e-kube-api-access-pcwbn\") on node \"crc\" DevicePath \"\""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.264993 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/618ebfda-2b5c-4918-97d0-56a8b37dda29-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.265008 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/311eb251-79b6-4e1e-a3a7-456322ca133e-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.265020 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75988\" (UniqueName: \"kubernetes.io/projected/618ebfda-2b5c-4918-97d0-56a8b37dda29-kube-api-access-75988\") on node \"crc\" DevicePath \"\""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.265032 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdm8v\" (UniqueName: \"kubernetes.io/projected/b9685a39-63cf-47d3-b5fe-9113d55676d4-kube-api-access-zdm8v\") on node \"crc\" DevicePath \"\""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.265045 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9685a39-63cf-47d3-b5fe-9113d55676d4-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.266174 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a30678d8-35eb-4863-a856-096864c2a9b1-utilities" (OuterVolumeSpecName: "utilities") pod "a30678d8-35eb-4863-a856-096864c2a9b1" (UID: "a30678d8-35eb-4863-a856-096864c2a9b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.271424 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4be7580-8cec-4726-940d-36fb8575b791-kube-api-access-k9zzq" (OuterVolumeSpecName: "kube-api-access-k9zzq") pod "c4be7580-8cec-4726-940d-36fb8575b791" (UID: "c4be7580-8cec-4726-940d-36fb8575b791"). InnerVolumeSpecName "kube-api-access-k9zzq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.271676 4790 scope.go:117] "RemoveContainer" containerID="1be661ed2762a0b0bd9e6f2bb632f076e93669ffe3857c6a90e00f581a3dbec4"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.272070 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4be7580-8cec-4726-940d-36fb8575b791-utilities" (OuterVolumeSpecName: "utilities") pod "c4be7580-8cec-4726-940d-36fb8575b791" (UID: "c4be7580-8cec-4726-940d-36fb8575b791"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.273900 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a30678d8-35eb-4863-a856-096864c2a9b1-kube-api-access-h5ndv" (OuterVolumeSpecName: "kube-api-access-h5ndv") pod "a30678d8-35eb-4863-a856-096864c2a9b1" (UID: "a30678d8-35eb-4863-a856-096864c2a9b1"). InnerVolumeSpecName "kube-api-access-h5ndv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.299273 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a30678d8-35eb-4863-a856-096864c2a9b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a30678d8-35eb-4863-a856-096864c2a9b1" (UID: "a30678d8-35eb-4863-a856-096864c2a9b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.311708 4790 scope.go:117] "RemoveContainer" containerID="f51dd7c3048ce8808968f63a200bec353b5bcf8f91bb149d8d5301374d26ef8e"
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.349918 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-msclk"]
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.350901 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-msclk"]
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.366662 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5ndv\" (UniqueName: \"kubernetes.io/projected/a30678d8-35eb-4863-a856-096864c2a9b1-kube-api-access-h5ndv\") on node \"crc\" DevicePath \"\""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.367599 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a30678d8-35eb-4863-a856-096864c2a9b1-utilities\") on node \"crc\" DevicePath \"\""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.367771 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9zzq\" (UniqueName: \"kubernetes.io/projected/c4be7580-8cec-4726-940d-36fb8575b791-kube-api-access-k9zzq\") on node \"crc\" DevicePath \"\""
Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.367849 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\"
(UniqueName: \"kubernetes.io/empty-dir/a30678d8-35eb-4863-a856-096864c2a9b1-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.368023 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4be7580-8cec-4726-940d-36fb8575b791-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.422157 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-10-crc"] Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.422631 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4be7580-8cec-4726-940d-36fb8575b791" containerName="extract-utilities" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.422653 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4be7580-8cec-4726-940d-36fb8575b791" containerName="extract-utilities" Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.422673 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9685a39-63cf-47d3-b5fe-9113d55676d4" containerName="registry-server" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.422686 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9685a39-63cf-47d3-b5fe-9113d55676d4" containerName="registry-server" Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.422707 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30678d8-35eb-4863-a856-096864c2a9b1" containerName="extract-content" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.422719 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30678d8-35eb-4863-a856-096864c2a9b1" containerName="extract-content" Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.422733 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9685a39-63cf-47d3-b5fe-9113d55676d4" containerName="extract-content" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 
12:02:31.422744 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9685a39-63cf-47d3-b5fe-9113d55676d4" containerName="extract-content" Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.422762 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311eb251-79b6-4e1e-a3a7-456322ca133e" containerName="registry-server" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.422775 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="311eb251-79b6-4e1e-a3a7-456322ca133e" containerName="registry-server" Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.422789 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311eb251-79b6-4e1e-a3a7-456322ca133e" containerName="extract-utilities" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.422801 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="311eb251-79b6-4e1e-a3a7-456322ca133e" containerName="extract-utilities" Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.422825 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311eb251-79b6-4e1e-a3a7-456322ca133e" containerName="extract-content" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.422864 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="311eb251-79b6-4e1e-a3a7-456322ca133e" containerName="extract-content" Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.422877 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30678d8-35eb-4863-a856-096864c2a9b1" containerName="registry-server" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.422889 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30678d8-35eb-4863-a856-096864c2a9b1" containerName="registry-server" Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.422908 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618ebfda-2b5c-4918-97d0-56a8b37dda29" containerName="marketplace-operator" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 
12:02:31.422920 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="618ebfda-2b5c-4918-97d0-56a8b37dda29" containerName="marketplace-operator" Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.422940 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618ebfda-2b5c-4918-97d0-56a8b37dda29" containerName="marketplace-operator" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.422953 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="618ebfda-2b5c-4918-97d0-56a8b37dda29" containerName="marketplace-operator" Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.422978 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30678d8-35eb-4863-a856-096864c2a9b1" containerName="extract-utilities" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.422991 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30678d8-35eb-4863-a856-096864c2a9b1" containerName="extract-utilities" Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.423011 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9685a39-63cf-47d3-b5fe-9113d55676d4" containerName="extract-utilities" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.423022 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9685a39-63cf-47d3-b5fe-9113d55676d4" containerName="extract-utilities" Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.423040 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4be7580-8cec-4726-940d-36fb8575b791" containerName="extract-content" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.423052 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4be7580-8cec-4726-940d-36fb8575b791" containerName="extract-content" Apr 06 12:02:31 crc kubenswrapper[4790]: E0406 12:02:31.423072 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4be7580-8cec-4726-940d-36fb8575b791" containerName="registry-server" Apr 06 12:02:31 crc 
kubenswrapper[4790]: I0406 12:02:31.423084 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4be7580-8cec-4726-940d-36fb8575b791" containerName="registry-server" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.423258 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="618ebfda-2b5c-4918-97d0-56a8b37dda29" containerName="marketplace-operator" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.423280 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="618ebfda-2b5c-4918-97d0-56a8b37dda29" containerName="marketplace-operator" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.423295 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30678d8-35eb-4863-a856-096864c2a9b1" containerName="registry-server" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.423314 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="311eb251-79b6-4e1e-a3a7-456322ca133e" containerName="registry-server" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.423333 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9685a39-63cf-47d3-b5fe-9113d55676d4" containerName="registry-server" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.423351 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4be7580-8cec-4726-940d-36fb8575b791" containerName="registry-server" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.424008 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-10-crc" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.427734 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.427798 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.444217 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-10-crc"] Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.460483 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4be7580-8cec-4726-940d-36fb8575b791-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4be7580-8cec-4726-940d-36fb8575b791" (UID: "c4be7580-8cec-4726-940d-36fb8575b791"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.469739 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4be7580-8cec-4726-940d-36fb8575b791-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:31 crc kubenswrapper[4790]: W0406 12:02:31.506715 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcbcee2e_3daf_4238_ac27_16f663c8b184.slice/crio-7dae33ba640748c93766182149a26b19fdbc272e4c344858d4410d4cb2c843d8 WatchSource:0}: Error finding container 7dae33ba640748c93766182149a26b19fdbc272e4c344858d4410d4cb2c843d8: Status 404 returned error can't find the container with id 7dae33ba640748c93766182149a26b19fdbc272e4c344858d4410d4cb2c843d8 Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.507208 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-vm6jd"] Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.571644 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3fc789ff-54fe-43f7-96c0-d62e068f6238-kubelet-dir\") pod \"installer-10-crc\" (UID: \"3fc789ff-54fe-43f7-96c0-d62e068f6238\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.571735 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3fc789ff-54fe-43f7-96c0-d62e068f6238-var-lock\") pod \"installer-10-crc\" (UID: \"3fc789ff-54fe-43f7-96c0-d62e068f6238\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.571989 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3fc789ff-54fe-43f7-96c0-d62e068f6238-kube-api-access\") pod \"installer-10-crc\" (UID: \"3fc789ff-54fe-43f7-96c0-d62e068f6238\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.673851 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3fc789ff-54fe-43f7-96c0-d62e068f6238-kube-api-access\") pod \"installer-10-crc\" (UID: \"3fc789ff-54fe-43f7-96c0-d62e068f6238\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.674298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3fc789ff-54fe-43f7-96c0-d62e068f6238-kubelet-dir\") pod \"installer-10-crc\" (UID: \"3fc789ff-54fe-43f7-96c0-d62e068f6238\") " 
pod="openshift-kube-apiserver/installer-10-crc" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.674439 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3fc789ff-54fe-43f7-96c0-d62e068f6238-kubelet-dir\") pod \"installer-10-crc\" (UID: \"3fc789ff-54fe-43f7-96c0-d62e068f6238\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.674452 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3fc789ff-54fe-43f7-96c0-d62e068f6238-var-lock\") pod \"installer-10-crc\" (UID: \"3fc789ff-54fe-43f7-96c0-d62e068f6238\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.674657 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3fc789ff-54fe-43f7-96c0-d62e068f6238-var-lock\") pod \"installer-10-crc\" (UID: \"3fc789ff-54fe-43f7-96c0-d62e068f6238\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.690487 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9685a39-63cf-47d3-b5fe-9113d55676d4" path="/var/lib/kubelet/pods/b9685a39-63cf-47d3-b5fe-9113d55676d4/volumes" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.699264 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3fc789ff-54fe-43f7-96c0-d62e068f6238-kube-api-access\") pod \"installer-10-crc\" (UID: \"3fc789ff-54fe-43f7-96c0-d62e068f6238\") " pod="openshift-kube-apiserver/installer-10-crc" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.748173 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-10-crc" Apr 06 12:02:31 crc kubenswrapper[4790]: I0406 12:02:31.944288 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-10-crc"] Apr 06 12:02:31 crc kubenswrapper[4790]: W0406 12:02:31.955689 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3fc789ff_54fe_43f7_96c0_d62e068f6238.slice/crio-66db5fd0016385f2c57e358cb5611fdcac31e27fe19a2daaa3c7d6134fe9415c WatchSource:0}: Error finding container 66db5fd0016385f2c57e358cb5611fdcac31e27fe19a2daaa3c7d6134fe9415c: Status 404 returned error can't find the container with id 66db5fd0016385f2c57e358cb5611fdcac31e27fe19a2daaa3c7d6134fe9415c Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.084076 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42rgx" Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.084112 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42rgx" event={"ID":"a30678d8-35eb-4863-a856-096864c2a9b1","Type":"ContainerDied","Data":"7570d0ab83fef6b86563b7a0e5579496e4cbea0cdf47886acee1918dd236cb39"} Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.084282 4790 scope.go:117] "RemoveContainer" containerID="bc22db92c2c9486259f5aa31f52f267884a89288ee7f94157927a276a6294c46" Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.088733 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.088730 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gzwvz" event={"ID":"618ebfda-2b5c-4918-97d0-56a8b37dda29","Type":"ContainerDied","Data":"1d4f74726b195718c1b77cd759f41d317a574d1d1c2da5f81921301f2e991ecd"} Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.091014 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd" event={"ID":"bcbcee2e-3daf-4238-ac27-16f663c8b184","Type":"ContainerStarted","Data":"0ba396502364c98dcc6f577aa2a0e590dacd51cda309dcbc34578b78fc8a4e4c"} Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.091077 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd" event={"ID":"bcbcee2e-3daf-4238-ac27-16f663c8b184","Type":"ContainerStarted","Data":"7dae33ba640748c93766182149a26b19fdbc272e4c344858d4410d4cb2c843d8"} Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.091238 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd" Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.094445 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-10-crc" event={"ID":"3fc789ff-54fe-43f7-96c0-d62e068f6238","Type":"ContainerStarted","Data":"66db5fd0016385f2c57e358cb5611fdcac31e27fe19a2daaa3c7d6134fe9415c"} Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.096891 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd" Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.097124 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-96trx" Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.097128 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-96trx" event={"ID":"311eb251-79b6-4e1e-a3a7-456322ca133e","Type":"ContainerDied","Data":"f5a8de717da6620c134ae7d40ab3a6e67db14123ae2d8961ebd3a69fb8465293"} Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.100482 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlwwf" event={"ID":"c4be7580-8cec-4726-940d-36fb8575b791","Type":"ContainerDied","Data":"270b28f6fe312490fbfa019515114a0a786ec882422dbfe1b890cf6727e2aa45"} Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.100525 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mlwwf" Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.110864 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vm6jd" podStartSLOduration=2.110701979 podStartE2EDuration="2.110701979s" podCreationTimestamp="2026-04-06 12:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:02:32.10715447 +0000 UTC m=+331.094897336" watchObservedRunningTime="2026-04-06 12:02:32.110701979 +0000 UTC m=+331.098444845" Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.118399 4790 scope.go:117] "RemoveContainer" containerID="4472c38ed0d47047495ef44e1bf3c47b36ab7a2747ba57dce8ffde6024c38dc1" Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.144625 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-42rgx"] Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.157655 4790 scope.go:117] "RemoveContainer" 
containerID="811a0ac01c70ff2567c76cd756cf3f97f3e733c4d738ebed2962f868b51cd02f" Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.158684 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-42rgx"] Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.178469 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-96trx"] Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.195919 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-96trx"] Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.214336 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mlwwf"] Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.224214 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mlwwf"] Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.232421 4790 scope.go:117] "RemoveContainer" containerID="e9bf8a41d28e846deb678f6b1268ed8d68101d5b3168c7eb9d99429dc81ab25f" Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.232919 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gzwvz"] Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.233161 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gzwvz"] Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.250454 4790 scope.go:117] "RemoveContainer" containerID="e2f5a41a544a9b4e9d9db199769aa5a6166b001c0e25458aaeb72a312dc09673" Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.264956 4790 scope.go:117] "RemoveContainer" containerID="84fe3b3b02c7d7abe3c7ea943338fc94098471231f7267c7d4ebe02486b34d2b" Apr 06 12:02:32 crc kubenswrapper[4790]: I0406 12:02:32.288072 4790 scope.go:117] "RemoveContainer" 
containerID="e24b7222e26715c0c52ebb3b061305cc5878640a474acc1ceae7935cf4c4df60" Apr 06 12:02:33 crc kubenswrapper[4790]: I0406 12:02:33.111935 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-10-crc" event={"ID":"3fc789ff-54fe-43f7-96c0-d62e068f6238","Type":"ContainerStarted","Data":"22e566ade00852ec003a17fa4484c777e65a30e96d4616a53023e4d30267a0c4"} Apr 06 12:02:33 crc kubenswrapper[4790]: I0406 12:02:33.137805 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-10-crc" podStartSLOduration=2.13569259 podStartE2EDuration="2.13569259s" podCreationTimestamp="2026-04-06 12:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:02:33.126622127 +0000 UTC m=+332.114365023" watchObservedRunningTime="2026-04-06 12:02:33.13569259 +0000 UTC m=+332.123435456" Apr 06 12:02:33 crc kubenswrapper[4790]: I0406 12:02:33.684398 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="311eb251-79b6-4e1e-a3a7-456322ca133e" path="/var/lib/kubelet/pods/311eb251-79b6-4e1e-a3a7-456322ca133e/volumes" Apr 06 12:02:33 crc kubenswrapper[4790]: I0406 12:02:33.685320 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="618ebfda-2b5c-4918-97d0-56a8b37dda29" path="/var/lib/kubelet/pods/618ebfda-2b5c-4918-97d0-56a8b37dda29/volumes" Apr 06 12:02:33 crc kubenswrapper[4790]: I0406 12:02:33.685751 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a30678d8-35eb-4863-a856-096864c2a9b1" path="/var/lib/kubelet/pods/a30678d8-35eb-4863-a856-096864c2a9b1/volumes" Apr 06 12:02:33 crc kubenswrapper[4790]: I0406 12:02:33.686735 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4be7580-8cec-4726-940d-36fb8575b791" path="/var/lib/kubelet/pods/c4be7580-8cec-4726-940d-36fb8575b791/volumes" Apr 06 12:02:35 crc 
kubenswrapper[4790]: I0406 12:02:35.525696 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-686fc65c-fdzvb" podUID="8f4ae293-64ab-4efd-a511-16c6e935a2fc" containerName="registry" containerID="cri-o://e44516814485d4a712309e05ae4d511828c150b6842baa3557cf3d93c9009306" gracePeriod=30 Apr 06 12:02:35 crc kubenswrapper[4790]: I0406 12:02:35.906594 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-686fc65c-fdzvb" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.042615 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.042679 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f4ae293-64ab-4efd-a511-16c6e935a2fc-installation-pull-secrets\") pod \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.042730 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xzfb\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-kube-api-access-6xzfb\") pod \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.042748 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-registry-tls\") pod \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\" (UID: 
\"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.042818 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f4ae293-64ab-4efd-a511-16c6e935a2fc-registry-certificates\") pod \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.042860 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f4ae293-64ab-4efd-a511-16c6e935a2fc-ca-trust-extracted\") pod \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.042898 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-bound-sa-token\") pod \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.042929 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f4ae293-64ab-4efd-a511-16c6e935a2fc-trusted-ca\") pod \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\" (UID: \"8f4ae293-64ab-4efd-a511-16c6e935a2fc\") " Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.043650 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f4ae293-64ab-4efd-a511-16c6e935a2fc-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f4ae293-64ab-4efd-a511-16c6e935a2fc" (UID: "8f4ae293-64ab-4efd-a511-16c6e935a2fc"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.043797 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f4ae293-64ab-4efd-a511-16c6e935a2fc-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f4ae293-64ab-4efd-a511-16c6e935a2fc" (UID: "8f4ae293-64ab-4efd-a511-16c6e935a2fc"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.047529 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-kube-api-access-6xzfb" (OuterVolumeSpecName: "kube-api-access-6xzfb") pod "8f4ae293-64ab-4efd-a511-16c6e935a2fc" (UID: "8f4ae293-64ab-4efd-a511-16c6e935a2fc"). InnerVolumeSpecName "kube-api-access-6xzfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.047709 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f4ae293-64ab-4efd-a511-16c6e935a2fc-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f4ae293-64ab-4efd-a511-16c6e935a2fc" (UID: "8f4ae293-64ab-4efd-a511-16c6e935a2fc"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.048939 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f4ae293-64ab-4efd-a511-16c6e935a2fc" (UID: "8f4ae293-64ab-4efd-a511-16c6e935a2fc"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.050170 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f4ae293-64ab-4efd-a511-16c6e935a2fc" (UID: "8f4ae293-64ab-4efd-a511-16c6e935a2fc"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.055905 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "8f4ae293-64ab-4efd-a511-16c6e935a2fc" (UID: "8f4ae293-64ab-4efd-a511-16c6e935a2fc"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.060379 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f4ae293-64ab-4efd-a511-16c6e935a2fc-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f4ae293-64ab-4efd-a511-16c6e935a2fc" (UID: "8f4ae293-64ab-4efd-a511-16c6e935a2fc"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.134609 4790 generic.go:334] "Generic (PLEG): container finished" podID="8f4ae293-64ab-4efd-a511-16c6e935a2fc" containerID="e44516814485d4a712309e05ae4d511828c150b6842baa3557cf3d93c9009306" exitCode=0 Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.134656 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-686fc65c-fdzvb" event={"ID":"8f4ae293-64ab-4efd-a511-16c6e935a2fc","Type":"ContainerDied","Data":"e44516814485d4a712309e05ae4d511828c150b6842baa3557cf3d93c9009306"} Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.134719 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-686fc65c-fdzvb" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.134779 4790 scope.go:117] "RemoveContainer" containerID="e44516814485d4a712309e05ae4d511828c150b6842baa3557cf3d93c9009306" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.134822 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-686fc65c-fdzvb" event={"ID":"8f4ae293-64ab-4efd-a511-16c6e935a2fc","Type":"ContainerDied","Data":"c847d7e17ecff88945e357162cfdbf46664ecf3a0b4229952c5e99a29aaab611"} Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.144494 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xzfb\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-kube-api-access-6xzfb\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.144520 4790 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-registry-tls\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.144532 4790 reconciler_common.go:293] 
"Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f4ae293-64ab-4efd-a511-16c6e935a2fc-registry-certificates\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.144543 4790 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f4ae293-64ab-4efd-a511-16c6e935a2fc-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.144552 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f4ae293-64ab-4efd-a511-16c6e935a2fc-bound-sa-token\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.144561 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f4ae293-64ab-4efd-a511-16c6e935a2fc-trusted-ca\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.144569 4790 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f4ae293-64ab-4efd-a511-16c6e935a2fc-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.155136 4790 scope.go:117] "RemoveContainer" containerID="e44516814485d4a712309e05ae4d511828c150b6842baa3557cf3d93c9009306" Apr 06 12:02:36 crc kubenswrapper[4790]: E0406 12:02:36.155690 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e44516814485d4a712309e05ae4d511828c150b6842baa3557cf3d93c9009306\": container with ID starting with e44516814485d4a712309e05ae4d511828c150b6842baa3557cf3d93c9009306 not found: ID does not exist" containerID="e44516814485d4a712309e05ae4d511828c150b6842baa3557cf3d93c9009306" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.155726 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e44516814485d4a712309e05ae4d511828c150b6842baa3557cf3d93c9009306"} err="failed to get container status \"e44516814485d4a712309e05ae4d511828c150b6842baa3557cf3d93c9009306\": rpc error: code = NotFound desc = could not find container \"e44516814485d4a712309e05ae4d511828c150b6842baa3557cf3d93c9009306\": container with ID starting with e44516814485d4a712309e05ae4d511828c150b6842baa3557cf3d93c9009306 not found: ID does not exist" Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.168711 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-686fc65c-fdzvb"] Apr 06 12:02:36 crc kubenswrapper[4790]: I0406 12:02:36.173270 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-686fc65c-fdzvb"] Apr 06 12:02:37 crc kubenswrapper[4790]: I0406 12:02:37.681598 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f4ae293-64ab-4efd-a511-16c6e935a2fc" path="/var/lib/kubelet/pods/8f4ae293-64ab-4efd-a511-16c6e935a2fc/volumes" Apr 06 12:02:43 crc kubenswrapper[4790]: I0406 12:02:43.806789 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z2nx9"] Apr 06 12:02:43 crc kubenswrapper[4790]: E0406 12:02:43.807256 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4ae293-64ab-4efd-a511-16c6e935a2fc" containerName="registry" Apr 06 12:02:43 crc kubenswrapper[4790]: I0406 12:02:43.807268 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4ae293-64ab-4efd-a511-16c6e935a2fc" containerName="registry" Apr 06 12:02:43 crc kubenswrapper[4790]: I0406 12:02:43.807365 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4ae293-64ab-4efd-a511-16c6e935a2fc" containerName="registry" Apr 06 12:02:43 crc kubenswrapper[4790]: I0406 12:02:43.808086 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:02:43 crc kubenswrapper[4790]: I0406 12:02:43.809897 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Apr 06 12:02:43 crc kubenswrapper[4790]: I0406 12:02:43.822767 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z2nx9"] Apr 06 12:02:43 crc kubenswrapper[4790]: I0406 12:02:43.956018 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0024beb8-cee9-427c-8267-657119a613c5-catalog-content\") pod \"redhat-operators-z2nx9\" (UID: \"0024beb8-cee9-427c-8267-657119a613c5\") " pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:02:43 crc kubenswrapper[4790]: I0406 12:02:43.956055 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb6c9\" (UniqueName: \"kubernetes.io/projected/0024beb8-cee9-427c-8267-657119a613c5-kube-api-access-vb6c9\") pod \"redhat-operators-z2nx9\" (UID: \"0024beb8-cee9-427c-8267-657119a613c5\") " pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:02:43 crc kubenswrapper[4790]: I0406 12:02:43.956101 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0024beb8-cee9-427c-8267-657119a613c5-utilities\") pod \"redhat-operators-z2nx9\" (UID: \"0024beb8-cee9-427c-8267-657119a613c5\") " pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.008146 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-trpr6"] Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.009468 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.011548 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.019984 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-trpr6"] Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.057026 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1524c101-74ea-4a4a-b54f-c2f9201725e1-catalog-content\") pod \"redhat-marketplace-trpr6\" (UID: \"1524c101-74ea-4a4a-b54f-c2f9201725e1\") " pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.057080 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0024beb8-cee9-427c-8267-657119a613c5-utilities\") pod \"redhat-operators-z2nx9\" (UID: \"0024beb8-cee9-427c-8267-657119a613c5\") " pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.057123 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1524c101-74ea-4a4a-b54f-c2f9201725e1-utilities\") pod \"redhat-marketplace-trpr6\" (UID: \"1524c101-74ea-4a4a-b54f-c2f9201725e1\") " pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.057164 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8v44\" (UniqueName: \"kubernetes.io/projected/1524c101-74ea-4a4a-b54f-c2f9201725e1-kube-api-access-s8v44\") pod \"redhat-marketplace-trpr6\" (UID: 
\"1524c101-74ea-4a4a-b54f-c2f9201725e1\") " pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.057253 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0024beb8-cee9-427c-8267-657119a613c5-catalog-content\") pod \"redhat-operators-z2nx9\" (UID: \"0024beb8-cee9-427c-8267-657119a613c5\") " pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.057270 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb6c9\" (UniqueName: \"kubernetes.io/projected/0024beb8-cee9-427c-8267-657119a613c5-kube-api-access-vb6c9\") pod \"redhat-operators-z2nx9\" (UID: \"0024beb8-cee9-427c-8267-657119a613c5\") " pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.057618 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0024beb8-cee9-427c-8267-657119a613c5-utilities\") pod \"redhat-operators-z2nx9\" (UID: \"0024beb8-cee9-427c-8267-657119a613c5\") " pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.057670 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0024beb8-cee9-427c-8267-657119a613c5-catalog-content\") pod \"redhat-operators-z2nx9\" (UID: \"0024beb8-cee9-427c-8267-657119a613c5\") " pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.075193 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb6c9\" (UniqueName: \"kubernetes.io/projected/0024beb8-cee9-427c-8267-657119a613c5-kube-api-access-vb6c9\") pod \"redhat-operators-z2nx9\" (UID: \"0024beb8-cee9-427c-8267-657119a613c5\") " 
pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.132384 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.158354 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1524c101-74ea-4a4a-b54f-c2f9201725e1-catalog-content\") pod \"redhat-marketplace-trpr6\" (UID: \"1524c101-74ea-4a4a-b54f-c2f9201725e1\") " pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.158439 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1524c101-74ea-4a4a-b54f-c2f9201725e1-utilities\") pod \"redhat-marketplace-trpr6\" (UID: \"1524c101-74ea-4a4a-b54f-c2f9201725e1\") " pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.158497 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8v44\" (UniqueName: \"kubernetes.io/projected/1524c101-74ea-4a4a-b54f-c2f9201725e1-kube-api-access-s8v44\") pod \"redhat-marketplace-trpr6\" (UID: \"1524c101-74ea-4a4a-b54f-c2f9201725e1\") " pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.158929 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1524c101-74ea-4a4a-b54f-c2f9201725e1-catalog-content\") pod \"redhat-marketplace-trpr6\" (UID: \"1524c101-74ea-4a4a-b54f-c2f9201725e1\") " pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.159190 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1524c101-74ea-4a4a-b54f-c2f9201725e1-utilities\") pod \"redhat-marketplace-trpr6\" (UID: \"1524c101-74ea-4a4a-b54f-c2f9201725e1\") " pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.183926 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8v44\" (UniqueName: \"kubernetes.io/projected/1524c101-74ea-4a4a-b54f-c2f9201725e1-kube-api-access-s8v44\") pod \"redhat-marketplace-trpr6\" (UID: \"1524c101-74ea-4a4a-b54f-c2f9201725e1\") " pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.328085 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.605923 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z2nx9"] Apr 06 12:02:44 crc kubenswrapper[4790]: I0406 12:02:44.709020 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-trpr6"] Apr 06 12:02:44 crc kubenswrapper[4790]: W0406 12:02:44.717256 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1524c101_74ea_4a4a_b54f_c2f9201725e1.slice/crio-2c0eb41aaeb75374610fa8808a7d51e3114fbf4f78a50a39a264d9d1ddf062aa WatchSource:0}: Error finding container 2c0eb41aaeb75374610fa8808a7d51e3114fbf4f78a50a39a264d9d1ddf062aa: Status 404 returned error can't find the container with id 2c0eb41aaeb75374610fa8808a7d51e3114fbf4f78a50a39a264d9d1ddf062aa Apr 06 12:02:45 crc kubenswrapper[4790]: I0406 12:02:45.202178 4790 generic.go:334] "Generic (PLEG): container finished" podID="0024beb8-cee9-427c-8267-657119a613c5" containerID="e73f30ad8d2a7cce6405e9278e2bb437fbca7f21f10649080a947a3b49692154" exitCode=0 Apr 06 12:02:45 crc kubenswrapper[4790]: I0406 
12:02:45.202356 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2nx9" event={"ID":"0024beb8-cee9-427c-8267-657119a613c5","Type":"ContainerDied","Data":"e73f30ad8d2a7cce6405e9278e2bb437fbca7f21f10649080a947a3b49692154"} Apr 06 12:02:45 crc kubenswrapper[4790]: I0406 12:02:45.202409 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2nx9" event={"ID":"0024beb8-cee9-427c-8267-657119a613c5","Type":"ContainerStarted","Data":"c67b2300d034c4485c6bcd6ce8e8ab2c7c2e6794c8293719be3a1fded197b065"} Apr 06 12:02:45 crc kubenswrapper[4790]: I0406 12:02:45.208332 4790 generic.go:334] "Generic (PLEG): container finished" podID="1524c101-74ea-4a4a-b54f-c2f9201725e1" containerID="ffe5d236587e5cdd0ff213264cfd4727f827e4fab464399cb65135893ab951dc" exitCode=0 Apr 06 12:02:45 crc kubenswrapper[4790]: I0406 12:02:45.208397 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trpr6" event={"ID":"1524c101-74ea-4a4a-b54f-c2f9201725e1","Type":"ContainerDied","Data":"ffe5d236587e5cdd0ff213264cfd4727f827e4fab464399cb65135893ab951dc"} Apr 06 12:02:45 crc kubenswrapper[4790]: I0406 12:02:45.208469 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trpr6" event={"ID":"1524c101-74ea-4a4a-b54f-c2f9201725e1","Type":"ContainerStarted","Data":"2c0eb41aaeb75374610fa8808a7d51e3114fbf4f78a50a39a264d9d1ddf062aa"} Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.223211 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hxf27"] Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.224857 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.228709 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.229002 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2nx9" event={"ID":"0024beb8-cee9-427c-8267-657119a613c5","Type":"ContainerStarted","Data":"dcc215aa36dadeada2b7a0b29a00216409f52adf27555d37eff86a41a8db0fe7"} Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.236439 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hxf27"] Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.284802 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8-catalog-content\") pod \"community-operators-hxf27\" (UID: \"7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8\") " pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.284915 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8-utilities\") pod \"community-operators-hxf27\" (UID: \"7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8\") " pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.285001 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-www9b\" (UniqueName: \"kubernetes.io/projected/7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8-kube-api-access-www9b\") pod \"community-operators-hxf27\" (UID: \"7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8\") " 
pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.385617 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8-catalog-content\") pod \"community-operators-hxf27\" (UID: \"7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8\") " pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.385688 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8-utilities\") pod \"community-operators-hxf27\" (UID: \"7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8\") " pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.385752 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-www9b\" (UniqueName: \"kubernetes.io/projected/7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8-kube-api-access-www9b\") pod \"community-operators-hxf27\" (UID: \"7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8\") " pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.386230 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8-catalog-content\") pod \"community-operators-hxf27\" (UID: \"7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8\") " pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.386250 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8-utilities\") pod \"community-operators-hxf27\" (UID: \"7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8\") " 
pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.404459 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kkc7k"] Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.405566 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.408236 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.416895 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-www9b\" (UniqueName: \"kubernetes.io/projected/7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8-kube-api-access-www9b\") pod \"community-operators-hxf27\" (UID: \"7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8\") " pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.418691 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kkc7k"] Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.550305 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.590764 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31734947-0b62-4b08-a3f7-1547b401f159-catalog-content\") pod \"certified-operators-kkc7k\" (UID: \"31734947-0b62-4b08-a3f7-1547b401f159\") " pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.590871 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p7qc\" (UniqueName: \"kubernetes.io/projected/31734947-0b62-4b08-a3f7-1547b401f159-kube-api-access-8p7qc\") pod \"certified-operators-kkc7k\" (UID: \"31734947-0b62-4b08-a3f7-1547b401f159\") " pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.590922 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31734947-0b62-4b08-a3f7-1547b401f159-utilities\") pod \"certified-operators-kkc7k\" (UID: \"31734947-0b62-4b08-a3f7-1547b401f159\") " pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.691950 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p7qc\" (UniqueName: \"kubernetes.io/projected/31734947-0b62-4b08-a3f7-1547b401f159-kube-api-access-8p7qc\") pod \"certified-operators-kkc7k\" (UID: \"31734947-0b62-4b08-a3f7-1547b401f159\") " pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.692181 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31734947-0b62-4b08-a3f7-1547b401f159-utilities\") pod 
\"certified-operators-kkc7k\" (UID: \"31734947-0b62-4b08-a3f7-1547b401f159\") " pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.692302 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31734947-0b62-4b08-a3f7-1547b401f159-catalog-content\") pod \"certified-operators-kkc7k\" (UID: \"31734947-0b62-4b08-a3f7-1547b401f159\") " pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.692806 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31734947-0b62-4b08-a3f7-1547b401f159-catalog-content\") pod \"certified-operators-kkc7k\" (UID: \"31734947-0b62-4b08-a3f7-1547b401f159\") " pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.693276 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31734947-0b62-4b08-a3f7-1547b401f159-utilities\") pod \"certified-operators-kkc7k\" (UID: \"31734947-0b62-4b08-a3f7-1547b401f159\") " pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.711468 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p7qc\" (UniqueName: \"kubernetes.io/projected/31734947-0b62-4b08-a3f7-1547b401f159-kube-api-access-8p7qc\") pod \"certified-operators-kkc7k\" (UID: \"31734947-0b62-4b08-a3f7-1547b401f159\") " pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.743893 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:02:46 crc kubenswrapper[4790]: I0406 12:02:46.971656 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hxf27"] Apr 06 12:02:46 crc kubenswrapper[4790]: W0406 12:02:46.975871 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c08a9bd_28bf_46c6_8e2c_a9e65a1e65b8.slice/crio-55e28f9a35eb59fd1706ae0bf61d5f2266fa4a8b08cfb6980846212e9ae7c12e WatchSource:0}: Error finding container 55e28f9a35eb59fd1706ae0bf61d5f2266fa4a8b08cfb6980846212e9ae7c12e: Status 404 returned error can't find the container with id 55e28f9a35eb59fd1706ae0bf61d5f2266fa4a8b08cfb6980846212e9ae7c12e Apr 06 12:02:47 crc kubenswrapper[4790]: I0406 12:02:47.137751 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kkc7k"] Apr 06 12:02:47 crc kubenswrapper[4790]: W0406 12:02:47.172882 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31734947_0b62_4b08_a3f7_1547b401f159.slice/crio-4681084091f1ee14f72c3eae1bca85d604e347fb21b24f56430d8872f3c92959 WatchSource:0}: Error finding container 4681084091f1ee14f72c3eae1bca85d604e347fb21b24f56430d8872f3c92959: Status 404 returned error can't find the container with id 4681084091f1ee14f72c3eae1bca85d604e347fb21b24f56430d8872f3c92959 Apr 06 12:02:47 crc kubenswrapper[4790]: I0406 12:02:47.238933 4790 generic.go:334] "Generic (PLEG): container finished" podID="7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8" containerID="dca574b0c8e5966f88635ef9dd6cc42b066d61f03bed941b69414faa0124c586" exitCode=0 Apr 06 12:02:47 crc kubenswrapper[4790]: I0406 12:02:47.239237 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxf27" 
event={"ID":"7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8","Type":"ContainerDied","Data":"dca574b0c8e5966f88635ef9dd6cc42b066d61f03bed941b69414faa0124c586"} Apr 06 12:02:47 crc kubenswrapper[4790]: I0406 12:02:47.239294 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxf27" event={"ID":"7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8","Type":"ContainerStarted","Data":"55e28f9a35eb59fd1706ae0bf61d5f2266fa4a8b08cfb6980846212e9ae7c12e"} Apr 06 12:02:47 crc kubenswrapper[4790]: I0406 12:02:47.242972 4790 generic.go:334] "Generic (PLEG): container finished" podID="1524c101-74ea-4a4a-b54f-c2f9201725e1" containerID="7c098e5b066d1016df3b65d0f9ce7fb42f1e67ab842ddbbafd434a95f9604b6c" exitCode=0 Apr 06 12:02:47 crc kubenswrapper[4790]: I0406 12:02:47.243029 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trpr6" event={"ID":"1524c101-74ea-4a4a-b54f-c2f9201725e1","Type":"ContainerDied","Data":"7c098e5b066d1016df3b65d0f9ce7fb42f1e67ab842ddbbafd434a95f9604b6c"} Apr 06 12:02:47 crc kubenswrapper[4790]: I0406 12:02:47.247013 4790 generic.go:334] "Generic (PLEG): container finished" podID="0024beb8-cee9-427c-8267-657119a613c5" containerID="dcc215aa36dadeada2b7a0b29a00216409f52adf27555d37eff86a41a8db0fe7" exitCode=0 Apr 06 12:02:47 crc kubenswrapper[4790]: I0406 12:02:47.247063 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2nx9" event={"ID":"0024beb8-cee9-427c-8267-657119a613c5","Type":"ContainerDied","Data":"dcc215aa36dadeada2b7a0b29a00216409f52adf27555d37eff86a41a8db0fe7"} Apr 06 12:02:47 crc kubenswrapper[4790]: I0406 12:02:47.250707 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kkc7k" event={"ID":"31734947-0b62-4b08-a3f7-1547b401f159","Type":"ContainerStarted","Data":"4681084091f1ee14f72c3eae1bca85d604e347fb21b24f56430d8872f3c92959"} Apr 06 12:02:48 crc kubenswrapper[4790]: I0406 
12:02:48.268186 4790 generic.go:334] "Generic (PLEG): container finished" podID="31734947-0b62-4b08-a3f7-1547b401f159" containerID="4ddb858e1ae3dd9cf34de559a726eee7641db6b18e81e58340d17dfc5162ba20" exitCode=0 Apr 06 12:02:48 crc kubenswrapper[4790]: I0406 12:02:48.268290 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kkc7k" event={"ID":"31734947-0b62-4b08-a3f7-1547b401f159","Type":"ContainerDied","Data":"4ddb858e1ae3dd9cf34de559a726eee7641db6b18e81e58340d17dfc5162ba20"} Apr 06 12:02:48 crc kubenswrapper[4790]: I0406 12:02:48.270324 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxf27" event={"ID":"7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8","Type":"ContainerStarted","Data":"897c75d065061b154058aed3f8c573bd39fee386ee172a8da35eb060317d6b71"} Apr 06 12:02:48 crc kubenswrapper[4790]: I0406 12:02:48.278475 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trpr6" event={"ID":"1524c101-74ea-4a4a-b54f-c2f9201725e1","Type":"ContainerStarted","Data":"b0bff75854e481b18f1cbdea75d436d5a78bf68e595b06d1f53438bc8323cc48"} Apr 06 12:02:48 crc kubenswrapper[4790]: I0406 12:02:48.281171 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2nx9" event={"ID":"0024beb8-cee9-427c-8267-657119a613c5","Type":"ContainerStarted","Data":"78baed89e52764c64e10790c6ffacbddb2e1d68b42b3f418152adc69246e2416"} Apr 06 12:02:48 crc kubenswrapper[4790]: I0406 12:02:48.322214 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z2nx9" podStartSLOduration=2.914212503 podStartE2EDuration="5.322193819s" podCreationTimestamp="2026-04-06 12:02:43 +0000 UTC" firstStartedPulling="2026-04-06 12:02:45.206644753 +0000 UTC m=+344.194387659" lastFinishedPulling="2026-04-06 12:02:47.614626109 +0000 UTC m=+346.602368975" observedRunningTime="2026-04-06 
12:02:48.321256833 +0000 UTC m=+347.308999729" watchObservedRunningTime="2026-04-06 12:02:48.322193819 +0000 UTC m=+347.309936685" Apr 06 12:02:48 crc kubenswrapper[4790]: I0406 12:02:48.336561 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-trpr6" podStartSLOduration=2.911449506 podStartE2EDuration="5.336543129s" podCreationTimestamp="2026-04-06 12:02:43 +0000 UTC" firstStartedPulling="2026-04-06 12:02:45.211086097 +0000 UTC m=+344.198828973" lastFinishedPulling="2026-04-06 12:02:47.63617973 +0000 UTC m=+346.623922596" observedRunningTime="2026-04-06 12:02:48.33584413 +0000 UTC m=+347.323586996" watchObservedRunningTime="2026-04-06 12:02:48.336543129 +0000 UTC m=+347.324285995" Apr 06 12:02:49 crc kubenswrapper[4790]: I0406 12:02:49.288782 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kkc7k" event={"ID":"31734947-0b62-4b08-a3f7-1547b401f159","Type":"ContainerStarted","Data":"371ef5e23fc5c9afc663ad350335f2679c3483dd5892a85d09213c76a5c8f8e9"} Apr 06 12:02:49 crc kubenswrapper[4790]: I0406 12:02:49.290394 4790 generic.go:334] "Generic (PLEG): container finished" podID="7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8" containerID="897c75d065061b154058aed3f8c573bd39fee386ee172a8da35eb060317d6b71" exitCode=0 Apr 06 12:02:49 crc kubenswrapper[4790]: I0406 12:02:49.290468 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxf27" event={"ID":"7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8","Type":"ContainerDied","Data":"897c75d065061b154058aed3f8c573bd39fee386ee172a8da35eb060317d6b71"} Apr 06 12:02:50 crc kubenswrapper[4790]: I0406 12:02:50.303200 4790 generic.go:334] "Generic (PLEG): container finished" podID="31734947-0b62-4b08-a3f7-1547b401f159" containerID="371ef5e23fc5c9afc663ad350335f2679c3483dd5892a85d09213c76a5c8f8e9" exitCode=0 Apr 06 12:02:50 crc kubenswrapper[4790]: I0406 12:02:50.303273 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kkc7k" event={"ID":"31734947-0b62-4b08-a3f7-1547b401f159","Type":"ContainerDied","Data":"371ef5e23fc5c9afc663ad350335f2679c3483dd5892a85d09213c76a5c8f8e9"} Apr 06 12:02:50 crc kubenswrapper[4790]: I0406 12:02:50.308250 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxf27" event={"ID":"7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8","Type":"ContainerStarted","Data":"313004821ca4d17eb0af5e5bc204d800668bcc7de2494b7741903c2f4b8bde6b"} Apr 06 12:02:50 crc kubenswrapper[4790]: I0406 12:02:50.355324 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hxf27" podStartSLOduration=1.928529312 podStartE2EDuration="4.355301081s" podCreationTimestamp="2026-04-06 12:02:46 +0000 UTC" firstStartedPulling="2026-04-06 12:02:47.240429444 +0000 UTC m=+346.228172300" lastFinishedPulling="2026-04-06 12:02:49.667201193 +0000 UTC m=+348.654944069" observedRunningTime="2026-04-06 12:02:50.352112041 +0000 UTC m=+349.339854947" watchObservedRunningTime="2026-04-06 12:02:50.355301081 +0000 UTC m=+349.343043967" Apr 06 12:02:51 crc kubenswrapper[4790]: I0406 12:02:51.316304 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kkc7k" event={"ID":"31734947-0b62-4b08-a3f7-1547b401f159","Type":"ContainerStarted","Data":"e244136656bd2dabba41b73c546b64bb633e73e8f8e11aef315252ce83f58641"} Apr 06 12:02:51 crc kubenswrapper[4790]: I0406 12:02:51.344272 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kkc7k" podStartSLOduration=2.698338088 podStartE2EDuration="5.344251166s" podCreationTimestamp="2026-04-06 12:02:46 +0000 UTC" firstStartedPulling="2026-04-06 12:02:48.269767335 +0000 UTC m=+347.257510221" lastFinishedPulling="2026-04-06 12:02:50.915680443 +0000 UTC m=+349.903423299" 
observedRunningTime="2026-04-06 12:02:51.34117463 +0000 UTC m=+350.328917516" watchObservedRunningTime="2026-04-06 12:02:51.344251166 +0000 UTC m=+350.331994032" Apr 06 12:02:54 crc kubenswrapper[4790]: I0406 12:02:54.132941 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:02:54 crc kubenswrapper[4790]: I0406 12:02:54.133224 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:02:54 crc kubenswrapper[4790]: I0406 12:02:54.328960 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:54 crc kubenswrapper[4790]: I0406 12:02:54.329017 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:54 crc kubenswrapper[4790]: I0406 12:02:54.375022 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:54 crc kubenswrapper[4790]: I0406 12:02:54.423783 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-trpr6" Apr 06 12:02:55 crc kubenswrapper[4790]: I0406 12:02:55.170375 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z2nx9" podUID="0024beb8-cee9-427c-8267-657119a613c5" containerName="registry-server" probeResult="failure" output=< Apr 06 12:02:55 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 12:02:55 crc kubenswrapper[4790]: > Apr 06 12:02:56 crc kubenswrapper[4790]: I0406 12:02:56.551166 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:56 crc kubenswrapper[4790]: I0406 12:02:56.551246 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:56 crc kubenswrapper[4790]: I0406 12:02:56.626758 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:56 crc kubenswrapper[4790]: I0406 12:02:56.744757 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:02:56 crc kubenswrapper[4790]: I0406 12:02:56.745184 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:02:56 crc kubenswrapper[4790]: I0406 12:02:56.805553 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:02:57 crc kubenswrapper[4790]: I0406 12:02:57.407576 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hxf27" Apr 06 12:02:57 crc kubenswrapper[4790]: I0406 12:02:57.420793 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kkc7k" Apr 06 12:03:04 crc kubenswrapper[4790]: I0406 12:03:04.173744 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:03:04 crc kubenswrapper[4790]: I0406 12:03:04.212347 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z2nx9" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.863852 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.866206 4790 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 06 12:03:09 crc 
kubenswrapper[4790]: I0406 12:03:09.866324 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 06 12:03:09 crc kubenswrapper[4790]: E0406 12:03:09.866595 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.866638 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" Apr 06 12:03:09 crc kubenswrapper[4790]: E0406 12:03:09.866663 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-regeneration-controller" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.866638 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" containerID="cri-o://9bf808231b351719960201cff0cc9cc6be5210fbaa4aca8d35fe2df13030489f" gracePeriod=15 Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.866693 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-syncer" containerID="cri-o://b80445a340d55190eaf37d3778fdc65546c20c27f5cc0473b8ea3e62e3620475" gracePeriod=15 Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.866706 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://ab62fdd9ef63898d74c00ff98bb7a1521cf23554a5760fb3668a8c668dd8e63e" gracePeriod=15 Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.866666 4790 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" containerID="cri-o://b07ed20e755c825a1870a04e581a3001b9c08cd391be14e022afd67adb68886b" gracePeriod=15 Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.866647 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://72097d15cb6b5e1ea2fad57c97da8a7e6d323e18e91d4fd7510a1079fa0547bf" gracePeriod=15 Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.866686 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-regeneration-controller" Apr 06 12:03:09 crc kubenswrapper[4790]: E0406 12:03:09.866923 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="setup" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.866944 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="setup" Apr 06 12:03:09 crc kubenswrapper[4790]: E0406 12:03:09.866995 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.867006 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.866943 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:09 crc kubenswrapper[4790]: E0406 12:03:09.867088 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-insecure-readyz" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.867130 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-insecure-readyz" Apr 06 12:03:09 crc kubenswrapper[4790]: E0406 12:03:09.867153 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-syncer" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.867162 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-syncer" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.867430 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.867460 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-syncer" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.867470 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-cert-regeneration-controller" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.867484 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-insecure-readyz" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.867495 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" Apr 06 12:03:09 
crc kubenswrapper[4790]: I0406 12:03:09.874225 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" Apr 06 12:03:09 crc kubenswrapper[4790]: E0406 12:03:09.906362 4790 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.924961 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.925015 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.925061 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.925087 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.925113 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.925137 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.925169 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:09 crc kubenswrapper[4790]: I0406 12:03:09.925193 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.025816 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.026203 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.025921 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.026239 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.026263 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.026286 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.026298 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.026306 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.026330 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.026336 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.026358 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.026361 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.026409 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.026490 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.026508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.026513 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-audit-dir\") pod \"kube-apiserver-crc\" 
(UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.207670 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:10 crc kubenswrapper[4790]: W0406 12:03:10.229370 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe484bf35d3aabad50f6e4a86d258a31.slice/crio-1c33288ae8cd7e5b19bbfe28797227f4df28bb53a6bafd893bd26f6688c38a51 WatchSource:0}: Error finding container 1c33288ae8cd7e5b19bbfe28797227f4df28bb53a6bafd893bd26f6688c38a51: Status 404 returned error can't find the container with id 1c33288ae8cd7e5b19bbfe28797227f4df28bb53a6bafd893bd26f6688c38a51 Apr 06 12:03:10 crc kubenswrapper[4790]: E0406 12:03:10.231619 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18a3c2ea6d998ed9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:be484bf35d3aabad50f6e4a86d258a31,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 12:03:10.231023321 +0000 UTC m=+369.218766187,LastTimestamp:2026-04-06 12:03:10.231023321 +0000 UTC m=+369.218766187,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.445605 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-cert-syncer/0.log" Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.446339 4790 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="b07ed20e755c825a1870a04e581a3001b9c08cd391be14e022afd67adb68886b" exitCode=0 Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.446371 4790 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="72097d15cb6b5e1ea2fad57c97da8a7e6d323e18e91d4fd7510a1079fa0547bf" exitCode=0 Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.446385 4790 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ab62fdd9ef63898d74c00ff98bb7a1521cf23554a5760fb3668a8c668dd8e63e" exitCode=0 Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.446396 4790 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="b80445a340d55190eaf37d3778fdc65546c20c27f5cc0473b8ea3e62e3620475" exitCode=2 Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.449285 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"be484bf35d3aabad50f6e4a86d258a31","Type":"ContainerStarted","Data":"1c33288ae8cd7e5b19bbfe28797227f4df28bb53a6bafd893bd26f6688c38a51"} Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.451136 4790 generic.go:334] "Generic (PLEG): container finished" podID="3fc789ff-54fe-43f7-96c0-d62e068f6238" containerID="22e566ade00852ec003a17fa4484c777e65a30e96d4616a53023e4d30267a0c4" exitCode=0 Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.451177 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-10-crc" event={"ID":"3fc789ff-54fe-43f7-96c0-d62e068f6238","Type":"ContainerDied","Data":"22e566ade00852ec003a17fa4484c777e65a30e96d4616a53023e4d30267a0c4"} Apr 06 12:03:10 crc kubenswrapper[4790]: I0406 12:03:10.451890 4790 status_manager.go:851] "Failed to get status for pod" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:11 crc kubenswrapper[4790]: I0406 12:03:11.457412 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"be484bf35d3aabad50f6e4a86d258a31","Type":"ContainerStarted","Data":"df4a4893c3ce7357d7310d3c7b0e0f68d4c0fc404c7086d097e7c175080f94a3"} Apr 06 12:03:11 crc kubenswrapper[4790]: I0406 12:03:11.458191 4790 status_manager.go:851] "Failed to get status for pod" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:11 crc kubenswrapper[4790]: E0406 12:03:11.458198 4790 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:11 crc kubenswrapper[4790]: I0406 12:03:11.678684 4790 status_manager.go:851] "Failed to get status for pod" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" pod="openshift-kube-apiserver/installer-10-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:11 crc kubenswrapper[4790]: I0406 12:03:11.689390 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-10-crc" Apr 06 12:03:11 crc kubenswrapper[4790]: I0406 12:03:11.689945 4790 status_manager.go:851] "Failed to get status for pod" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:11 crc kubenswrapper[4790]: I0406 12:03:11.850576 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3fc789ff-54fe-43f7-96c0-d62e068f6238-kubelet-dir\") pod \"3fc789ff-54fe-43f7-96c0-d62e068f6238\" (UID: \"3fc789ff-54fe-43f7-96c0-d62e068f6238\") " Apr 06 12:03:11 crc kubenswrapper[4790]: I0406 12:03:11.850640 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3fc789ff-54fe-43f7-96c0-d62e068f6238-kube-api-access\") pod \"3fc789ff-54fe-43f7-96c0-d62e068f6238\" (UID: \"3fc789ff-54fe-43f7-96c0-d62e068f6238\") " Apr 06 12:03:11 crc kubenswrapper[4790]: I0406 12:03:11.850678 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3fc789ff-54fe-43f7-96c0-d62e068f6238-var-lock\") pod \"3fc789ff-54fe-43f7-96c0-d62e068f6238\" (UID: \"3fc789ff-54fe-43f7-96c0-d62e068f6238\") " Apr 06 12:03:11 crc kubenswrapper[4790]: I0406 12:03:11.850735 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/3fc789ff-54fe-43f7-96c0-d62e068f6238-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3fc789ff-54fe-43f7-96c0-d62e068f6238" (UID: "3fc789ff-54fe-43f7-96c0-d62e068f6238"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:03:11 crc kubenswrapper[4790]: I0406 12:03:11.850807 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fc789ff-54fe-43f7-96c0-d62e068f6238-var-lock" (OuterVolumeSpecName: "var-lock") pod "3fc789ff-54fe-43f7-96c0-d62e068f6238" (UID: "3fc789ff-54fe-43f7-96c0-d62e068f6238"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:03:11 crc kubenswrapper[4790]: I0406 12:03:11.851075 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3fc789ff-54fe-43f7-96c0-d62e068f6238-var-lock\") on node \"crc\" DevicePath \"\"" Apr 06 12:03:11 crc kubenswrapper[4790]: I0406 12:03:11.851095 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3fc789ff-54fe-43f7-96c0-d62e068f6238-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:03:11 crc kubenswrapper[4790]: I0406 12:03:11.856397 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc789ff-54fe-43f7-96c0-d62e068f6238-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3fc789ff-54fe-43f7-96c0-d62e068f6238" (UID: "3fc789ff-54fe-43f7-96c0-d62e068f6238"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:03:11 crc kubenswrapper[4790]: I0406 12:03:11.954490 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3fc789ff-54fe-43f7-96c0-d62e068f6238-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.187211 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-cert-syncer/0.log" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.188403 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.189093 4790 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.189507 4790 status_manager.go:851] "Failed to get status for pod" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.358708 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"71bb4a3aecc4ba5b26c4b7318770ce13\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.358801 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"71bb4a3aecc4ba5b26c4b7318770ce13\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.358861 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"71bb4a3aecc4ba5b26c4b7318770ce13\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.359126 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "71bb4a3aecc4ba5b26c4b7318770ce13" (UID: "71bb4a3aecc4ba5b26c4b7318770ce13"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.359168 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "71bb4a3aecc4ba5b26c4b7318770ce13" (UID: "71bb4a3aecc4ba5b26c4b7318770ce13"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.359188 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "71bb4a3aecc4ba5b26c4b7318770ce13" (UID: "71bb4a3aecc4ba5b26c4b7318770ce13"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.460078 4790 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.460115 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.460125 4790 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.466176 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-10-crc" event={"ID":"3fc789ff-54fe-43f7-96c0-d62e068f6238","Type":"ContainerDied","Data":"66db5fd0016385f2c57e358cb5611fdcac31e27fe19a2daaa3c7d6134fe9415c"} Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.466219 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66db5fd0016385f2c57e358cb5611fdcac31e27fe19a2daaa3c7d6134fe9415c" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.466673 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-10-crc" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.469177 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-cert-syncer/0.log" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.469965 4790 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="9bf808231b351719960201cff0cc9cc6be5210fbaa4aca8d35fe2df13030489f" exitCode=0 Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.470082 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.470229 4790 scope.go:117] "RemoveContainer" containerID="b07ed20e755c825a1870a04e581a3001b9c08cd391be14e022afd67adb68886b" Apr 06 12:03:12 crc kubenswrapper[4790]: E0406 12:03:12.470859 4790 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.486606 4790 status_manager.go:851] "Failed to get status for pod" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.486924 4790 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.146:6443: connect: connection refused" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.492150 4790 scope.go:117] "RemoveContainer" containerID="72097d15cb6b5e1ea2fad57c97da8a7e6d323e18e91d4fd7510a1079fa0547bf" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.508684 4790 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.508945 4790 status_manager.go:851] "Failed to get status for pod" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.525934 4790 scope.go:117] "RemoveContainer" containerID="ab62fdd9ef63898d74c00ff98bb7a1521cf23554a5760fb3668a8c668dd8e63e" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.544549 4790 scope.go:117] "RemoveContainer" containerID="b80445a340d55190eaf37d3778fdc65546c20c27f5cc0473b8ea3e62e3620475" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.570311 4790 scope.go:117] "RemoveContainer" containerID="9bf808231b351719960201cff0cc9cc6be5210fbaa4aca8d35fe2df13030489f" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.586909 4790 scope.go:117] "RemoveContainer" containerID="4bb5bf5a1d012cf69128c241f0b7ac4c53a84c899fbe67eef2e14f9ee70cc949" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.625995 4790 scope.go:117] "RemoveContainer" containerID="b07ed20e755c825a1870a04e581a3001b9c08cd391be14e022afd67adb68886b" Apr 06 12:03:12 crc kubenswrapper[4790]: E0406 12:03:12.627226 4790 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b07ed20e755c825a1870a04e581a3001b9c08cd391be14e022afd67adb68886b\": container with ID starting with b07ed20e755c825a1870a04e581a3001b9c08cd391be14e022afd67adb68886b not found: ID does not exist" containerID="b07ed20e755c825a1870a04e581a3001b9c08cd391be14e022afd67adb68886b" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.627265 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b07ed20e755c825a1870a04e581a3001b9c08cd391be14e022afd67adb68886b"} err="failed to get container status \"b07ed20e755c825a1870a04e581a3001b9c08cd391be14e022afd67adb68886b\": rpc error: code = NotFound desc = could not find container \"b07ed20e755c825a1870a04e581a3001b9c08cd391be14e022afd67adb68886b\": container with ID starting with b07ed20e755c825a1870a04e581a3001b9c08cd391be14e022afd67adb68886b not found: ID does not exist" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.627291 4790 scope.go:117] "RemoveContainer" containerID="72097d15cb6b5e1ea2fad57c97da8a7e6d323e18e91d4fd7510a1079fa0547bf" Apr 06 12:03:12 crc kubenswrapper[4790]: E0406 12:03:12.628075 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72097d15cb6b5e1ea2fad57c97da8a7e6d323e18e91d4fd7510a1079fa0547bf\": container with ID starting with 72097d15cb6b5e1ea2fad57c97da8a7e6d323e18e91d4fd7510a1079fa0547bf not found: ID does not exist" containerID="72097d15cb6b5e1ea2fad57c97da8a7e6d323e18e91d4fd7510a1079fa0547bf" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.628094 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72097d15cb6b5e1ea2fad57c97da8a7e6d323e18e91d4fd7510a1079fa0547bf"} err="failed to get container status \"72097d15cb6b5e1ea2fad57c97da8a7e6d323e18e91d4fd7510a1079fa0547bf\": rpc error: code = NotFound desc = could not find container 
\"72097d15cb6b5e1ea2fad57c97da8a7e6d323e18e91d4fd7510a1079fa0547bf\": container with ID starting with 72097d15cb6b5e1ea2fad57c97da8a7e6d323e18e91d4fd7510a1079fa0547bf not found: ID does not exist" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.628107 4790 scope.go:117] "RemoveContainer" containerID="ab62fdd9ef63898d74c00ff98bb7a1521cf23554a5760fb3668a8c668dd8e63e" Apr 06 12:03:12 crc kubenswrapper[4790]: E0406 12:03:12.628469 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab62fdd9ef63898d74c00ff98bb7a1521cf23554a5760fb3668a8c668dd8e63e\": container with ID starting with ab62fdd9ef63898d74c00ff98bb7a1521cf23554a5760fb3668a8c668dd8e63e not found: ID does not exist" containerID="ab62fdd9ef63898d74c00ff98bb7a1521cf23554a5760fb3668a8c668dd8e63e" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.628488 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab62fdd9ef63898d74c00ff98bb7a1521cf23554a5760fb3668a8c668dd8e63e"} err="failed to get container status \"ab62fdd9ef63898d74c00ff98bb7a1521cf23554a5760fb3668a8c668dd8e63e\": rpc error: code = NotFound desc = could not find container \"ab62fdd9ef63898d74c00ff98bb7a1521cf23554a5760fb3668a8c668dd8e63e\": container with ID starting with ab62fdd9ef63898d74c00ff98bb7a1521cf23554a5760fb3668a8c668dd8e63e not found: ID does not exist" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.628501 4790 scope.go:117] "RemoveContainer" containerID="b80445a340d55190eaf37d3778fdc65546c20c27f5cc0473b8ea3e62e3620475" Apr 06 12:03:12 crc kubenswrapper[4790]: E0406 12:03:12.628791 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80445a340d55190eaf37d3778fdc65546c20c27f5cc0473b8ea3e62e3620475\": container with ID starting with b80445a340d55190eaf37d3778fdc65546c20c27f5cc0473b8ea3e62e3620475 not found: ID does not exist" 
containerID="b80445a340d55190eaf37d3778fdc65546c20c27f5cc0473b8ea3e62e3620475" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.628819 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80445a340d55190eaf37d3778fdc65546c20c27f5cc0473b8ea3e62e3620475"} err="failed to get container status \"b80445a340d55190eaf37d3778fdc65546c20c27f5cc0473b8ea3e62e3620475\": rpc error: code = NotFound desc = could not find container \"b80445a340d55190eaf37d3778fdc65546c20c27f5cc0473b8ea3e62e3620475\": container with ID starting with b80445a340d55190eaf37d3778fdc65546c20c27f5cc0473b8ea3e62e3620475 not found: ID does not exist" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.628851 4790 scope.go:117] "RemoveContainer" containerID="9bf808231b351719960201cff0cc9cc6be5210fbaa4aca8d35fe2df13030489f" Apr 06 12:03:12 crc kubenswrapper[4790]: E0406 12:03:12.629192 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bf808231b351719960201cff0cc9cc6be5210fbaa4aca8d35fe2df13030489f\": container with ID starting with 9bf808231b351719960201cff0cc9cc6be5210fbaa4aca8d35fe2df13030489f not found: ID does not exist" containerID="9bf808231b351719960201cff0cc9cc6be5210fbaa4aca8d35fe2df13030489f" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.629218 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bf808231b351719960201cff0cc9cc6be5210fbaa4aca8d35fe2df13030489f"} err="failed to get container status \"9bf808231b351719960201cff0cc9cc6be5210fbaa4aca8d35fe2df13030489f\": rpc error: code = NotFound desc = could not find container \"9bf808231b351719960201cff0cc9cc6be5210fbaa4aca8d35fe2df13030489f\": container with ID starting with 9bf808231b351719960201cff0cc9cc6be5210fbaa4aca8d35fe2df13030489f not found: ID does not exist" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.629233 4790 scope.go:117] 
"RemoveContainer" containerID="4bb5bf5a1d012cf69128c241f0b7ac4c53a84c899fbe67eef2e14f9ee70cc949" Apr 06 12:03:12 crc kubenswrapper[4790]: E0406 12:03:12.629616 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bb5bf5a1d012cf69128c241f0b7ac4c53a84c899fbe67eef2e14f9ee70cc949\": container with ID starting with 4bb5bf5a1d012cf69128c241f0b7ac4c53a84c899fbe67eef2e14f9ee70cc949 not found: ID does not exist" containerID="4bb5bf5a1d012cf69128c241f0b7ac4c53a84c899fbe67eef2e14f9ee70cc949" Apr 06 12:03:12 crc kubenswrapper[4790]: I0406 12:03:12.629647 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bb5bf5a1d012cf69128c241f0b7ac4c53a84c899fbe67eef2e14f9ee70cc949"} err="failed to get container status \"4bb5bf5a1d012cf69128c241f0b7ac4c53a84c899fbe67eef2e14f9ee70cc949\": rpc error: code = NotFound desc = could not find container \"4bb5bf5a1d012cf69128c241f0b7ac4c53a84c899fbe67eef2e14f9ee70cc949\": container with ID starting with 4bb5bf5a1d012cf69128c241f0b7ac4c53a84c899fbe67eef2e14f9ee70cc949 not found: ID does not exist" Apr 06 12:03:13 crc kubenswrapper[4790]: I0406 12:03:13.683252 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" path="/var/lib/kubelet/pods/71bb4a3aecc4ba5b26c4b7318770ce13/volumes" Apr 06 12:03:13 crc kubenswrapper[4790]: E0406 12:03:13.937355 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18a3c2ea6d998ed9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:be484bf35d3aabad50f6e4a86d258a31,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 12:03:10.231023321 +0000 UTC m=+369.218766187,LastTimestamp:2026-04-06 12:03:10.231023321 +0000 UTC m=+369.218766187,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 12:03:18 crc kubenswrapper[4790]: E0406 12:03:18.298009 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:18 crc kubenswrapper[4790]: E0406 12:03:18.299355 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:18 crc kubenswrapper[4790]: E0406 12:03:18.299912 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:18 crc kubenswrapper[4790]: E0406 12:03:18.300640 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:18 crc kubenswrapper[4790]: E0406 
12:03:18.301195 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:18 crc kubenswrapper[4790]: I0406 12:03:18.301252 4790 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Apr 06 12:03:18 crc kubenswrapper[4790]: E0406 12:03:18.301735 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="200ms" Apr 06 12:03:18 crc kubenswrapper[4790]: E0406 12:03:18.503256 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="400ms" Apr 06 12:03:18 crc kubenswrapper[4790]: E0406 12:03:18.904187 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="800ms" Apr 06 12:03:19 crc kubenswrapper[4790]: E0406 12:03:19.705033 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="1.6s" Apr 06 12:03:21 crc kubenswrapper[4790]: E0406 12:03:21.306523 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="3.2s" Apr 06 12:03:21 crc kubenswrapper[4790]: I0406 12:03:21.678888 4790 status_manager.go:851] "Failed to get status for pod" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:22 crc kubenswrapper[4790]: I0406 12:03:22.840449 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:35330->192.168.126.11:10257: read: connection reset by peer" start-of-body= Apr 06 12:03:22 crc kubenswrapper[4790]: I0406 12:03:22.840520 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:35330->192.168.126.11:10257: read: connection reset by peer" Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.548287 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/2.log" Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.549323 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.551207 4790 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.551257 4790 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="77bd60c3801a2cc2901f0358fc397c93702d7e422a74151e8a18372622c7f334" exitCode=1 Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.551289 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"77bd60c3801a2cc2901f0358fc397c93702d7e422a74151e8a18372622c7f334"} Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.551333 4790 scope.go:117] "RemoveContainer" containerID="bd12f079c26d6f40b634895dba92c03b64c88b317101f7db1ade5f2ba9e5db5c" Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.552437 4790 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.552579 4790 scope.go:117] "RemoveContainer" containerID="77bd60c3801a2cc2901f0358fc397c93702d7e422a74151e8a18372622c7f334" Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.552910 4790 status_manager.go:851] "Failed to get status for pod" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:23 crc kubenswrapper[4790]: E0406 12:03:23.553330 4790 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-crc_openshift-kube-controller-manager(f614b9022728cf315e60c057852e563e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.675146 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.676509 4790 status_manager.go:851] "Failed to get status for pod" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.677242 4790 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.695370 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b06037d2-730b-4594-9409-1b823fb6034f" Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.695418 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b06037d2-730b-4594-9409-1b823fb6034f" Apr 06 12:03:23 crc kubenswrapper[4790]: E0406 12:03:23.695995 4790 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.696760 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:23 crc kubenswrapper[4790]: W0406 12:03:23.729461 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e6039c7a12c5a0c0ef5917dc7ee5582.slice/crio-5f5093e662a885e08010e2c9066e50a594453997b271e2e9fab962fe4989d7fa WatchSource:0}: Error finding container 5f5093e662a885e08010e2c9066e50a594453997b271e2e9fab962fe4989d7fa: Status 404 returned error can't find the container with id 5f5093e662a885e08010e2c9066e50a594453997b271e2e9fab962fe4989d7fa Apr 06 12:03:23 crc kubenswrapper[4790]: I0406 12:03:23.905387 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:03:23 crc kubenswrapper[4790]: E0406 12:03:23.939162 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18a3c2ea6d998ed9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:be484bf35d3aabad50f6e4a86d258a31,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already 
present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 12:03:10.231023321 +0000 UTC m=+369.218766187,LastTimestamp:2026-04-06 12:03:10.231023321 +0000 UTC m=+369.218766187,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 12:03:24 crc kubenswrapper[4790]: E0406 12:03:24.507687 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="6.4s" Apr 06 12:03:24 crc kubenswrapper[4790]: I0406 12:03:24.559381 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/2.log" Apr 06 12:03:24 crc kubenswrapper[4790]: I0406 12:03:24.561670 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Apr 06 12:03:24 crc kubenswrapper[4790]: I0406 12:03:24.562325 4790 scope.go:117] "RemoveContainer" containerID="77bd60c3801a2cc2901f0358fc397c93702d7e422a74151e8a18372622c7f334" Apr 06 12:03:24 crc kubenswrapper[4790]: I0406 12:03:24.562647 4790 status_manager.go:851] "Failed to get status for pod" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:24 crc kubenswrapper[4790]: E0406 12:03:24.562671 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-controller-manager pod=kube-controller-manager-crc_openshift-kube-controller-manager(f614b9022728cf315e60c057852e563e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" Apr 06 12:03:24 crc kubenswrapper[4790]: I0406 12:03:24.563028 4790 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:24 crc kubenswrapper[4790]: I0406 12:03:24.564057 4790 generic.go:334] "Generic (PLEG): container finished" podID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerID="8354ab4eb427f5f09234d4a718180d4f3284c8916b048b16c810cd384e2ffab0" exitCode=0 Apr 06 12:03:24 crc kubenswrapper[4790]: I0406 12:03:24.564086 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"4e6039c7a12c5a0c0ef5917dc7ee5582","Type":"ContainerDied","Data":"8354ab4eb427f5f09234d4a718180d4f3284c8916b048b16c810cd384e2ffab0"} Apr 06 12:03:24 crc kubenswrapper[4790]: I0406 12:03:24.564109 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"4e6039c7a12c5a0c0ef5917dc7ee5582","Type":"ContainerStarted","Data":"5f5093e662a885e08010e2c9066e50a594453997b271e2e9fab962fe4989d7fa"} Apr 06 12:03:24 crc kubenswrapper[4790]: I0406 12:03:24.564301 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b06037d2-730b-4594-9409-1b823fb6034f" Apr 06 12:03:24 crc kubenswrapper[4790]: I0406 12:03:24.564317 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b06037d2-730b-4594-9409-1b823fb6034f" Apr 06 12:03:24 crc 
kubenswrapper[4790]: E0406 12:03:24.564596 4790 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:24 crc kubenswrapper[4790]: I0406 12:03:24.564755 4790 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:24 crc kubenswrapper[4790]: I0406 12:03:24.565158 4790 status_manager.go:851] "Failed to get status for pod" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" pod="openshift-kube-apiserver/installer-10-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-10-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:03:25 crc kubenswrapper[4790]: I0406 12:03:25.573858 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"4e6039c7a12c5a0c0ef5917dc7ee5582","Type":"ContainerStarted","Data":"4cec3d13e8b3ebc7becdbcb4b79e0bddba7c26246573098d12e4c96cdbb4e695"} Apr 06 12:03:25 crc kubenswrapper[4790]: I0406 12:03:25.574136 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"4e6039c7a12c5a0c0ef5917dc7ee5582","Type":"ContainerStarted","Data":"2033e0a7c43adde363594c5168fd759013579ed56bcca861cdec532a9b7acf46"} Apr 06 12:03:25 crc kubenswrapper[4790]: I0406 12:03:25.574150 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"4e6039c7a12c5a0c0ef5917dc7ee5582","Type":"ContainerStarted","Data":"47cb04e520560af2626a886e1e08a8850cb01428a1e90ecbfa97476fc01f5a07"} Apr 06 12:03:25 crc kubenswrapper[4790]: I0406 12:03:25.574160 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"4e6039c7a12c5a0c0ef5917dc7ee5582","Type":"ContainerStarted","Data":"1386b4a7a4223963a98ff32658dbda33704c080c77a93c86c60859a6e0bf200d"} Apr 06 12:03:26 crc kubenswrapper[4790]: I0406 12:03:26.583945 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"4e6039c7a12c5a0c0ef5917dc7ee5582","Type":"ContainerStarted","Data":"dd0273b3420682908a04589817853572ca76b405c0bffc2bffef6eb7906e6527"} Apr 06 12:03:26 crc kubenswrapper[4790]: I0406 12:03:26.584753 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b06037d2-730b-4594-9409-1b823fb6034f" Apr 06 12:03:26 crc kubenswrapper[4790]: I0406 12:03:26.584773 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b06037d2-730b-4594-9409-1b823fb6034f" Apr 06 12:03:26 crc kubenswrapper[4790]: I0406 12:03:26.585098 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:28 crc kubenswrapper[4790]: I0406 12:03:28.697070 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:28 crc kubenswrapper[4790]: I0406 12:03:28.698101 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:28 crc kubenswrapper[4790]: I0406 12:03:28.707171 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:31 crc kubenswrapper[4790]: I0406 12:03:31.593021 
4790 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:31 crc kubenswrapper[4790]: I0406 12:03:31.623016 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="4e6039c7a12c5a0c0ef5917dc7ee5582" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:03:32 crc kubenswrapper[4790]: I0406 12:03:32.403172 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:03:32 crc kubenswrapper[4790]: I0406 12:03:32.403751 4790 scope.go:117] "RemoveContainer" containerID="77bd60c3801a2cc2901f0358fc397c93702d7e422a74151e8a18372622c7f334" Apr 06 12:03:32 crc kubenswrapper[4790]: E0406 12:03:32.404031 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-crc_openshift-kube-controller-manager(f614b9022728cf315e60c057852e563e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" Apr 06 12:03:32 crc kubenswrapper[4790]: I0406 12:03:32.621115 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b06037d2-730b-4594-9409-1b823fb6034f" Apr 06 12:03:32 crc kubenswrapper[4790]: I0406 12:03:32.621927 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b06037d2-730b-4594-9409-1b823fb6034f" Apr 06 12:03:32 crc kubenswrapper[4790]: I0406 12:03:32.832232 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:03:32 crc kubenswrapper[4790]: I0406 12:03:32.832764 4790 scope.go:117] 
"RemoveContainer" containerID="77bd60c3801a2cc2901f0358fc397c93702d7e422a74151e8a18372622c7f334" Apr 06 12:03:32 crc kubenswrapper[4790]: E0406 12:03:32.833225 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-crc_openshift-kube-controller-manager(f614b9022728cf315e60c057852e563e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" Apr 06 12:03:33 crc kubenswrapper[4790]: I0406 12:03:33.703782 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:33 crc kubenswrapper[4790]: I0406 12:03:33.704317 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b06037d2-730b-4594-9409-1b823fb6034f" Apr 06 12:03:33 crc kubenswrapper[4790]: I0406 12:03:33.704342 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b06037d2-730b-4594-9409-1b823fb6034f" Apr 06 12:03:39 crc kubenswrapper[4790]: I0406 12:03:39.753513 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:03:39 crc kubenswrapper[4790]: I0406 12:03:39.753967 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:03:40 crc kubenswrapper[4790]: I0406 12:03:40.403492 
4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Apr 06 12:03:40 crc kubenswrapper[4790]: I0406 12:03:40.498654 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Apr 06 12:03:41 crc kubenswrapper[4790]: I0406 12:03:41.019878 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Apr 06 12:03:41 crc kubenswrapper[4790]: I0406 12:03:41.262136 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Apr 06 12:03:41 crc kubenswrapper[4790]: I0406 12:03:41.496293 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Apr 06 12:03:41 crc kubenswrapper[4790]: I0406 12:03:41.504759 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Apr 06 12:03:41 crc kubenswrapper[4790]: I0406 12:03:41.706316 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="4e6039c7a12c5a0c0ef5917dc7ee5582" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:03:42 crc kubenswrapper[4790]: I0406 12:03:42.142638 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Apr 06 12:03:42 crc kubenswrapper[4790]: I0406 12:03:42.353859 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Apr 06 12:03:42 crc kubenswrapper[4790]: I0406 12:03:42.518770 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Apr 06 12:03:42 crc kubenswrapper[4790]: I0406 12:03:42.521465 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"trusted-ca-bundle" Apr 06 12:03:42 crc kubenswrapper[4790]: I0406 12:03:42.601060 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Apr 06 12:03:42 crc kubenswrapper[4790]: I0406 12:03:42.816998 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Apr 06 12:03:42 crc kubenswrapper[4790]: I0406 12:03:42.818873 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Apr 06 12:03:42 crc kubenswrapper[4790]: I0406 12:03:42.831675 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Apr 06 12:03:42 crc kubenswrapper[4790]: I0406 12:03:42.988721 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Apr 06 12:03:42 crc kubenswrapper[4790]: I0406 12:03:42.998819 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Apr 06 12:03:43 crc kubenswrapper[4790]: I0406 12:03:43.088520 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Apr 06 12:03:43 crc kubenswrapper[4790]: I0406 12:03:43.147238 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Apr 06 12:03:43 crc kubenswrapper[4790]: I0406 12:03:43.197316 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Apr 06 12:03:43 crc kubenswrapper[4790]: I0406 12:03:43.323356 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Apr 06 12:03:43 crc kubenswrapper[4790]: I0406 
12:03:43.656112 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Apr 06 12:03:43 crc kubenswrapper[4790]: I0406 12:03:43.754874 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Apr 06 12:03:43 crc kubenswrapper[4790]: I0406 12:03:43.776934 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Apr 06 12:03:43 crc kubenswrapper[4790]: I0406 12:03:43.810366 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Apr 06 12:03:43 crc kubenswrapper[4790]: I0406 12:03:43.816958 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Apr 06 12:03:43 crc kubenswrapper[4790]: I0406 12:03:43.977595 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Apr 06 12:03:44 crc kubenswrapper[4790]: I0406 12:03:44.025404 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Apr 06 12:03:44 crc kubenswrapper[4790]: I0406 12:03:44.148591 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Apr 06 12:03:44 crc kubenswrapper[4790]: I0406 12:03:44.229392 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Apr 06 12:03:44 crc kubenswrapper[4790]: I0406 12:03:44.269877 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Apr 06 12:03:44 crc kubenswrapper[4790]: I0406 12:03:44.567857 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Apr 06 12:03:44 crc kubenswrapper[4790]: I0406 12:03:44.657362 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Apr 06 12:03:44 crc kubenswrapper[4790]: I0406 12:03:44.709849 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Apr 06 12:03:44 crc kubenswrapper[4790]: I0406 12:03:44.758450 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Apr 06 12:03:44 crc kubenswrapper[4790]: I0406 12:03:44.767777 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Apr 06 12:03:44 crc kubenswrapper[4790]: I0406 12:03:44.907619 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Apr 06 12:03:44 crc kubenswrapper[4790]: I0406 12:03:44.933371 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Apr 06 12:03:44 crc kubenswrapper[4790]: I0406 12:03:44.991757 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.128273 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.330919 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.336921 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Apr 06 
12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.349254 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.351175 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.462447 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.499963 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.531880 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.556038 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.596349 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.648294 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.676008 4790 scope.go:117] "RemoveContainer" containerID="77bd60c3801a2cc2901f0358fc397c93702d7e422a74151e8a18372622c7f334" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.763900 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.793714 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.810101 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.942884 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.975992 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Apr 06 12:03:45 crc kubenswrapper[4790]: I0406 12:03:45.979142 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.044134 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.062410 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.083482 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.099603 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.133509 4790 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.361793 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.402118 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.419210 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.448666 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.450799 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.463618 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.477361 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.577912 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.624710 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.638101 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.732241 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/2.log" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.734954 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.735019 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7b680c55efccb036ee3a6bcfae81939fe913ce7c9bc8fbb565f257d0d8b401da"} Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.755303 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.779366 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.837959 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.838209 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.838545 4790 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.844324 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.844386 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.849630 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.865643 4790 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=15.865619516 podStartE2EDuration="15.865619516s" podCreationTimestamp="2026-04-06 12:03:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:03:46.864919156 +0000 UTC m=+405.852662042" watchObservedRunningTime="2026-04-06 12:03:46.865619516 +0000 UTC m=+405.853362382" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.872310 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.899636 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.899976 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.944897 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.951125 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Apr 06 12:03:46 crc kubenswrapper[4790]: I0406 12:03:46.957545 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.011703 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.108942 4790 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Apr 06 12:03:47 crc 
kubenswrapper[4790]: I0406 12:03:47.270365 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.334552 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.369990 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.463299 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.509454 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.650784 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.658368 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.673058 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.724269 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.750139 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.817494 4790 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.850185 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.939311 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.946692 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.968202 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.979921 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.985741 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Apr 06 12:03:47 crc kubenswrapper[4790]: I0406 12:03:47.993659 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Apr 06 12:03:48 crc kubenswrapper[4790]: I0406 12:03:48.065557 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Apr 06 12:03:48 crc kubenswrapper[4790]: I0406 12:03:48.070080 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Apr 06 12:03:48 crc kubenswrapper[4790]: I0406 12:03:48.216498 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Apr 06 12:03:48 crc kubenswrapper[4790]: I0406 12:03:48.260986 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Apr 06 12:03:48 crc kubenswrapper[4790]: I0406 12:03:48.359150 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Apr 06 12:03:48 crc kubenswrapper[4790]: I0406 12:03:48.599194 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Apr 06 12:03:48 crc kubenswrapper[4790]: I0406 12:03:48.647772 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Apr 06 12:03:48 crc kubenswrapper[4790]: I0406 12:03:48.648446 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Apr 06 12:03:48 crc kubenswrapper[4790]: I0406 12:03:48.678024 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Apr 06 12:03:48 crc kubenswrapper[4790]: I0406 12:03:48.731310 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Apr 06 12:03:48 crc kubenswrapper[4790]: I0406 12:03:48.758030 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Apr 06 12:03:48 crc kubenswrapper[4790]: I0406 12:03:48.834556 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Apr 06 12:03:48 crc kubenswrapper[4790]: I0406 12:03:48.874499 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Apr 06 12:03:48 crc kubenswrapper[4790]: I0406 12:03:48.896948 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.039572 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.071417 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.111880 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.176510 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.186318 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.277383 4790 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.367525 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.428417 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.428428 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.431578 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.437207 4790 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.506094 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.572680 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.611392 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.622784 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.701873 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.731395 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.784661 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.789317 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.807969 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 12:03:49.922888 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Apr 06 12:03:49 crc kubenswrapper[4790]: I0406 
12:03:49.999433 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.089136 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.211030 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.216788 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.244517 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.273371 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.321169 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.339355 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.401766 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.401888 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.471702 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.495058 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.543495 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.586908 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.666063 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.707597 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.712644 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.779219 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.869418 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.871088 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.946348 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Apr 06 12:03:50 crc 
kubenswrapper[4790]: I0406 12:03:50.958551 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Apr 06 12:03:50 crc kubenswrapper[4790]: I0406 12:03:50.978279 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.037975 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.058252 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.108601 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.283266 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.385303 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.401797 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.451581 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.578988 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.585740 4790 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.592299 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.604867 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.624822 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.628399 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.701590 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.747799 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.800710 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.803027 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.819057 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.870166 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-service-ca.crt" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.911079 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.974585 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Apr 06 12:03:51 crc kubenswrapper[4790]: I0406 12:03:51.990754 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.006230 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.053003 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.072730 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.153955 4790 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.201452 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.254571 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.300284 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.403224 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.409162 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.420642 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.507446 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.551013 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.633272 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.647340 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.683670 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.705358 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.774732 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.840442 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-rbac-proxy" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.840982 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Apr 06 12:03:52 crc kubenswrapper[4790]: I0406 12:03:52.962081 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.176818 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.178968 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.206536 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.252476 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.256439 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.269367 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.282465 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.289464 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.411388 4790 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.445491 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.474488 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.532297 4790 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.534619 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.608308 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.678291 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.746022 4790 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.746245 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="be484bf35d3aabad50f6e4a86d258a31" containerName="startup-monitor" containerID="cri-o://df4a4893c3ce7357d7310d3c7b0e0f68d4c0fc404c7086d097e7c175080f94a3" gracePeriod=5 Apr 06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.787442 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Apr 
06 12:03:53 crc kubenswrapper[4790]: I0406 12:03:53.798021 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Apr 06 12:03:54 crc kubenswrapper[4790]: I0406 12:03:54.014866 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Apr 06 12:03:54 crc kubenswrapper[4790]: I0406 12:03:54.018126 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Apr 06 12:03:54 crc kubenswrapper[4790]: I0406 12:03:54.111522 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Apr 06 12:03:54 crc kubenswrapper[4790]: I0406 12:03:54.426892 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Apr 06 12:03:54 crc kubenswrapper[4790]: I0406 12:03:54.460012 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Apr 06 12:03:54 crc kubenswrapper[4790]: I0406 12:03:54.488604 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Apr 06 12:03:54 crc kubenswrapper[4790]: I0406 12:03:54.490662 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Apr 06 12:03:54 crc kubenswrapper[4790]: I0406 12:03:54.712916 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Apr 06 12:03:54 crc kubenswrapper[4790]: I0406 12:03:54.723610 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Apr 06 12:03:54 crc kubenswrapper[4790]: I0406 12:03:54.797721 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-console"/"networking-console-plugin-cert" Apr 06 12:03:55 crc kubenswrapper[4790]: I0406 12:03:55.007032 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Apr 06 12:03:55 crc kubenswrapper[4790]: I0406 12:03:55.201244 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Apr 06 12:03:55 crc kubenswrapper[4790]: I0406 12:03:55.241962 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Apr 06 12:03:55 crc kubenswrapper[4790]: I0406 12:03:55.250962 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Apr 06 12:03:55 crc kubenswrapper[4790]: I0406 12:03:55.264472 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Apr 06 12:03:55 crc kubenswrapper[4790]: I0406 12:03:55.274498 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Apr 06 12:03:55 crc kubenswrapper[4790]: I0406 12:03:55.398712 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Apr 06 12:03:55 crc kubenswrapper[4790]: I0406 12:03:55.615654 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Apr 06 12:03:55 crc kubenswrapper[4790]: I0406 12:03:55.637084 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Apr 06 12:03:55 crc kubenswrapper[4790]: I0406 12:03:55.653412 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Apr 06 12:03:55 crc kubenswrapper[4790]: I0406 12:03:55.842998 4790 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-etcd-operator"/"etcd-operator-config" Apr 06 12:03:55 crc kubenswrapper[4790]: I0406 12:03:55.929989 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Apr 06 12:03:55 crc kubenswrapper[4790]: I0406 12:03:55.976591 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Apr 06 12:03:56 crc kubenswrapper[4790]: I0406 12:03:56.142451 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Apr 06 12:03:56 crc kubenswrapper[4790]: I0406 12:03:56.335761 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Apr 06 12:03:56 crc kubenswrapper[4790]: I0406 12:03:56.339452 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Apr 06 12:03:56 crc kubenswrapper[4790]: I0406 12:03:56.480911 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Apr 06 12:03:56 crc kubenswrapper[4790]: I0406 12:03:56.509220 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Apr 06 12:03:56 crc kubenswrapper[4790]: I0406 12:03:56.893181 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Apr 06 12:03:57 crc kubenswrapper[4790]: I0406 12:03:57.139949 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Apr 06 12:03:57 crc kubenswrapper[4790]: I0406 12:03:57.274111 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Apr 06 12:03:57 crc kubenswrapper[4790]: I0406 12:03:57.353421 
4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Apr 06 12:03:57 crc kubenswrapper[4790]: I0406 12:03:57.611040 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Apr 06 12:03:57 crc kubenswrapper[4790]: I0406 12:03:57.853858 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Apr 06 12:03:57 crc kubenswrapper[4790]: I0406 12:03:57.889979 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Apr 06 12:03:58 crc kubenswrapper[4790]: I0406 12:03:58.470111 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Apr 06 12:03:58 crc kubenswrapper[4790]: I0406 12:03:58.669998 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Apr 06 12:03:58 crc kubenswrapper[4790]: I0406 12:03:58.816873 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_be484bf35d3aabad50f6e4a86d258a31/startup-monitor/0.log" Apr 06 12:03:58 crc kubenswrapper[4790]: I0406 12:03:58.816932 4790 generic.go:334] "Generic (PLEG): container finished" podID="be484bf35d3aabad50f6e4a86d258a31" containerID="df4a4893c3ce7357d7310d3c7b0e0f68d4c0fc404c7086d097e7c175080f94a3" exitCode=137 Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.345895 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_be484bf35d3aabad50f6e4a86d258a31/startup-monitor/0.log" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.346036 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.432555 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-log\") pod \"be484bf35d3aabad50f6e4a86d258a31\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.432631 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-lock\") pod \"be484bf35d3aabad50f6e4a86d258a31\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.432671 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-resource-dir\") pod \"be484bf35d3aabad50f6e4a86d258a31\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.432710 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-manifests\") pod \"be484bf35d3aabad50f6e4a86d258a31\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.432763 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-log" (OuterVolumeSpecName: "var-log") pod "be484bf35d3aabad50f6e4a86d258a31" (UID: "be484bf35d3aabad50f6e4a86d258a31"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.432801 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-lock" (OuterVolumeSpecName: "var-lock") pod "be484bf35d3aabad50f6e4a86d258a31" (UID: "be484bf35d3aabad50f6e4a86d258a31"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.432815 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "be484bf35d3aabad50f6e4a86d258a31" (UID: "be484bf35d3aabad50f6e4a86d258a31"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.432859 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-manifests" (OuterVolumeSpecName: "manifests") pod "be484bf35d3aabad50f6e4a86d258a31" (UID: "be484bf35d3aabad50f6e4a86d258a31"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.432879 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-pod-resource-dir\") pod \"be484bf35d3aabad50f6e4a86d258a31\" (UID: \"be484bf35d3aabad50f6e4a86d258a31\") " Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.434248 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-lock\") on node \"crc\" DevicePath \"\"" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.434302 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.434330 4790 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-manifests\") on node \"crc\" DevicePath \"\"" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.434353 4790 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-var-log\") on node \"crc\" DevicePath \"\"" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.445363 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "be484bf35d3aabad50f6e4a86d258a31" (UID: "be484bf35d3aabad50f6e4a86d258a31"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.536223 4790 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/be484bf35d3aabad50f6e4a86d258a31-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.683268 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be484bf35d3aabad50f6e4a86d258a31" path="/var/lib/kubelet/pods/be484bf35d3aabad50f6e4a86d258a31/volumes" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.833793 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_be484bf35d3aabad50f6e4a86d258a31/startup-monitor/0.log" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.834443 4790 scope.go:117] "RemoveContainer" containerID="df4a4893c3ce7357d7310d3c7b0e0f68d4c0fc404c7086d097e7c175080f94a3" Apr 06 12:03:59 crc kubenswrapper[4790]: I0406 12:03:59.834500 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.088320 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591284-5wlfg"] Apr 06 12:04:02 crc kubenswrapper[4790]: E0406 12:04:02.088562 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be484bf35d3aabad50f6e4a86d258a31" containerName="startup-monitor" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.088574 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="be484bf35d3aabad50f6e4a86d258a31" containerName="startup-monitor" Apr 06 12:04:02 crc kubenswrapper[4790]: E0406 12:04:02.088600 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" containerName="installer" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.088607 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" containerName="installer" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.088709 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc789ff-54fe-43f7-96c0-d62e068f6238" containerName="installer" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.088722 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="be484bf35d3aabad50f6e4a86d258a31" containerName="startup-monitor" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.089162 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591284-5wlfg" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.091056 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.091254 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.091890 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.101222 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591284-5wlfg"] Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.104189 4790 scope.go:117] "RemoveContainer" containerID="43286ddd309fd46a3d6c104d273a7737af676a9fc12d58c5b2de11f9bad526be" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.118810 4790 scope.go:117] "RemoveContainer" containerID="5ff3cd33d151386ab7f09a28ee0f37001f983d35e8389644ab342b12e21638f8" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.135290 4790 scope.go:117] "RemoveContainer" containerID="30056923772f422514c90950e54b7bcd14f25a556777a669012dbbb7dbcb5b6d" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.151568 4790 scope.go:117] "RemoveContainer" containerID="512ecc4b2068d96f99fe484656f014fedae4852c537e0b0aa6ee670247e3877f" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.177956 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lt7x\" (UniqueName: \"kubernetes.io/projected/5500e65f-18d1-4c5d-bf4e-9e3d55d85aca-kube-api-access-8lt7x\") pod \"auto-csr-approver-29591284-5wlfg\" (UID: \"5500e65f-18d1-4c5d-bf4e-9e3d55d85aca\") " pod="openshift-infra/auto-csr-approver-29591284-5wlfg" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.279101 
4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lt7x\" (UniqueName: \"kubernetes.io/projected/5500e65f-18d1-4c5d-bf4e-9e3d55d85aca-kube-api-access-8lt7x\") pod \"auto-csr-approver-29591284-5wlfg\" (UID: \"5500e65f-18d1-4c5d-bf4e-9e3d55d85aca\") " pod="openshift-infra/auto-csr-approver-29591284-5wlfg" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.296566 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lt7x\" (UniqueName: \"kubernetes.io/projected/5500e65f-18d1-4c5d-bf4e-9e3d55d85aca-kube-api-access-8lt7x\") pod \"auto-csr-approver-29591284-5wlfg\" (UID: \"5500e65f-18d1-4c5d-bf4e-9e3d55d85aca\") " pod="openshift-infra/auto-csr-approver-29591284-5wlfg" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.405269 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591284-5wlfg" Apr 06 12:04:02 crc kubenswrapper[4790]: I0406 12:04:02.847976 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591284-5wlfg"] Apr 06 12:04:03 crc kubenswrapper[4790]: I0406 12:04:03.871701 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591284-5wlfg" event={"ID":"5500e65f-18d1-4c5d-bf4e-9e3d55d85aca","Type":"ContainerStarted","Data":"9daf0e8233353f01e3875ef3e11ad65ea42433249a5b9a1310af8118dd993772"} Apr 06 12:04:03 crc kubenswrapper[4790]: I0406 12:04:03.910100 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:04:04 crc kubenswrapper[4790]: I0406 12:04:04.880616 4790 generic.go:334] "Generic (PLEG): container finished" podID="5500e65f-18d1-4c5d-bf4e-9e3d55d85aca" containerID="b83253e28152a8b6b68894e72c8011aa97f31b5954d89eddf33dffebef74d1c9" exitCode=0 Apr 06 12:04:04 crc kubenswrapper[4790]: I0406 12:04:04.880707 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591284-5wlfg" event={"ID":"5500e65f-18d1-4c5d-bf4e-9e3d55d85aca","Type":"ContainerDied","Data":"b83253e28152a8b6b68894e72c8011aa97f31b5954d89eddf33dffebef74d1c9"} Apr 06 12:04:06 crc kubenswrapper[4790]: I0406 12:04:06.127600 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591284-5wlfg" Apr 06 12:04:06 crc kubenswrapper[4790]: I0406 12:04:06.234108 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lt7x\" (UniqueName: \"kubernetes.io/projected/5500e65f-18d1-4c5d-bf4e-9e3d55d85aca-kube-api-access-8lt7x\") pod \"5500e65f-18d1-4c5d-bf4e-9e3d55d85aca\" (UID: \"5500e65f-18d1-4c5d-bf4e-9e3d55d85aca\") " Apr 06 12:04:06 crc kubenswrapper[4790]: I0406 12:04:06.239576 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5500e65f-18d1-4c5d-bf4e-9e3d55d85aca-kube-api-access-8lt7x" (OuterVolumeSpecName: "kube-api-access-8lt7x") pod "5500e65f-18d1-4c5d-bf4e-9e3d55d85aca" (UID: "5500e65f-18d1-4c5d-bf4e-9e3d55d85aca"). InnerVolumeSpecName "kube-api-access-8lt7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:04:06 crc kubenswrapper[4790]: I0406 12:04:06.335690 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lt7x\" (UniqueName: \"kubernetes.io/projected/5500e65f-18d1-4c5d-bf4e-9e3d55d85aca-kube-api-access-8lt7x\") on node \"crc\" DevicePath \"\"" Apr 06 12:04:06 crc kubenswrapper[4790]: I0406 12:04:06.904582 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591284-5wlfg" event={"ID":"5500e65f-18d1-4c5d-bf4e-9e3d55d85aca","Type":"ContainerDied","Data":"9daf0e8233353f01e3875ef3e11ad65ea42433249a5b9a1310af8118dd993772"} Apr 06 12:04:06 crc kubenswrapper[4790]: I0406 12:04:06.905240 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9daf0e8233353f01e3875ef3e11ad65ea42433249a5b9a1310af8118dd993772" Apr 06 12:04:06 crc kubenswrapper[4790]: I0406 12:04:06.904755 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591284-5wlfg" Apr 06 12:04:09 crc kubenswrapper[4790]: I0406 12:04:09.753858 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:04:09 crc kubenswrapper[4790]: I0406 12:04:09.754562 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:04:39 crc kubenswrapper[4790]: I0406 12:04:39.753784 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:04:39 crc kubenswrapper[4790]: I0406 12:04:39.754415 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:04:39 crc kubenswrapper[4790]: I0406 12:04:39.754474 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 12:04:39 crc kubenswrapper[4790]: I0406 12:04:39.755173 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f074a0dbf2caed5f314b385c985d923b4f08f77e264f1213a2b7855601973ce2"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 12:04:39 crc kubenswrapper[4790]: I0406 12:04:39.755239 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://f074a0dbf2caed5f314b385c985d923b4f08f77e264f1213a2b7855601973ce2" gracePeriod=600 Apr 06 12:04:40 crc kubenswrapper[4790]: I0406 12:04:40.109708 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="f074a0dbf2caed5f314b385c985d923b4f08f77e264f1213a2b7855601973ce2" exitCode=0 Apr 06 12:04:40 crc kubenswrapper[4790]: I0406 12:04:40.109810 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"f074a0dbf2caed5f314b385c985d923b4f08f77e264f1213a2b7855601973ce2"} Apr 06 12:04:40 crc kubenswrapper[4790]: I0406 12:04:40.110055 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"4b2a44c53b178b9bcd5b5bef115e30badb21f6cf04ef10c42dd5c6e3679e16df"} Apr 06 12:04:40 crc kubenswrapper[4790]: I0406 12:04:40.110073 4790 scope.go:117] "RemoveContainer" containerID="757aef10c0682229ea06c5e71d7d5444fe84c2dcd81f14cf08fcb20202141c81" Apr 06 12:06:00 crc kubenswrapper[4790]: I0406 12:06:00.148885 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591286-nlv9t"] Apr 06 12:06:00 crc kubenswrapper[4790]: E0406 12:06:00.151715 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5500e65f-18d1-4c5d-bf4e-9e3d55d85aca" containerName="oc" Apr 06 12:06:00 crc kubenswrapper[4790]: I0406 12:06:00.151950 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5500e65f-18d1-4c5d-bf4e-9e3d55d85aca" containerName="oc" Apr 06 12:06:00 crc kubenswrapper[4790]: I0406 12:06:00.152287 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5500e65f-18d1-4c5d-bf4e-9e3d55d85aca" containerName="oc" Apr 06 12:06:00 crc kubenswrapper[4790]: I0406 12:06:00.153189 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591286-nlv9t" Apr 06 12:06:00 crc kubenswrapper[4790]: I0406 12:06:00.156041 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:06:00 crc kubenswrapper[4790]: I0406 12:06:00.156709 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:06:00 crc kubenswrapper[4790]: I0406 12:06:00.157674 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:06:00 crc kubenswrapper[4790]: I0406 12:06:00.158233 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591286-nlv9t"] Apr 06 12:06:00 crc kubenswrapper[4790]: I0406 12:06:00.334223 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvbzz\" (UniqueName: \"kubernetes.io/projected/124e10ce-efae-4551-adc3-aadefc84b7a1-kube-api-access-vvbzz\") pod \"auto-csr-approver-29591286-nlv9t\" (UID: \"124e10ce-efae-4551-adc3-aadefc84b7a1\") " pod="openshift-infra/auto-csr-approver-29591286-nlv9t" Apr 06 12:06:00 crc kubenswrapper[4790]: I0406 12:06:00.436295 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvbzz\" (UniqueName: \"kubernetes.io/projected/124e10ce-efae-4551-adc3-aadefc84b7a1-kube-api-access-vvbzz\") pod \"auto-csr-approver-29591286-nlv9t\" (UID: \"124e10ce-efae-4551-adc3-aadefc84b7a1\") " pod="openshift-infra/auto-csr-approver-29591286-nlv9t" Apr 06 12:06:00 crc kubenswrapper[4790]: I0406 12:06:00.484707 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvbzz\" (UniqueName: \"kubernetes.io/projected/124e10ce-efae-4551-adc3-aadefc84b7a1-kube-api-access-vvbzz\") pod \"auto-csr-approver-29591286-nlv9t\" (UID: \"124e10ce-efae-4551-adc3-aadefc84b7a1\") " 
pod="openshift-infra/auto-csr-approver-29591286-nlv9t" Apr 06 12:06:00 crc kubenswrapper[4790]: I0406 12:06:00.782321 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591286-nlv9t" Apr 06 12:06:01 crc kubenswrapper[4790]: I0406 12:06:01.226460 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591286-nlv9t"] Apr 06 12:06:01 crc kubenswrapper[4790]: W0406 12:06:01.235936 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod124e10ce_efae_4551_adc3_aadefc84b7a1.slice/crio-c937cff54aa6d481e886d0a8e884eb6b50ad2ad43a312014b61968cdff61cdd6 WatchSource:0}: Error finding container c937cff54aa6d481e886d0a8e884eb6b50ad2ad43a312014b61968cdff61cdd6: Status 404 returned error can't find the container with id c937cff54aa6d481e886d0a8e884eb6b50ad2ad43a312014b61968cdff61cdd6 Apr 06 12:06:01 crc kubenswrapper[4790]: I0406 12:06:01.239687 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 12:06:01 crc kubenswrapper[4790]: I0406 12:06:01.665747 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591286-nlv9t" event={"ID":"124e10ce-efae-4551-adc3-aadefc84b7a1","Type":"ContainerStarted","Data":"c937cff54aa6d481e886d0a8e884eb6b50ad2ad43a312014b61968cdff61cdd6"} Apr 06 12:06:02 crc kubenswrapper[4790]: I0406 12:06:02.672576 4790 generic.go:334] "Generic (PLEG): container finished" podID="124e10ce-efae-4551-adc3-aadefc84b7a1" containerID="d8e505868de294deb67b02738f8db679087fa521711cb29c9fc253f51753de24" exitCode=0 Apr 06 12:06:02 crc kubenswrapper[4790]: I0406 12:06:02.672649 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591286-nlv9t" 
event={"ID":"124e10ce-efae-4551-adc3-aadefc84b7a1","Type":"ContainerDied","Data":"d8e505868de294deb67b02738f8db679087fa521711cb29c9fc253f51753de24"} Apr 06 12:06:03 crc kubenswrapper[4790]: I0406 12:06:03.971089 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591286-nlv9t" Apr 06 12:06:04 crc kubenswrapper[4790]: I0406 12:06:04.083945 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvbzz\" (UniqueName: \"kubernetes.io/projected/124e10ce-efae-4551-adc3-aadefc84b7a1-kube-api-access-vvbzz\") pod \"124e10ce-efae-4551-adc3-aadefc84b7a1\" (UID: \"124e10ce-efae-4551-adc3-aadefc84b7a1\") " Apr 06 12:06:04 crc kubenswrapper[4790]: I0406 12:06:04.090467 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/124e10ce-efae-4551-adc3-aadefc84b7a1-kube-api-access-vvbzz" (OuterVolumeSpecName: "kube-api-access-vvbzz") pod "124e10ce-efae-4551-adc3-aadefc84b7a1" (UID: "124e10ce-efae-4551-adc3-aadefc84b7a1"). InnerVolumeSpecName "kube-api-access-vvbzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:06:04 crc kubenswrapper[4790]: I0406 12:06:04.186419 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvbzz\" (UniqueName: \"kubernetes.io/projected/124e10ce-efae-4551-adc3-aadefc84b7a1-kube-api-access-vvbzz\") on node \"crc\" DevicePath \"\"" Apr 06 12:06:04 crc kubenswrapper[4790]: I0406 12:06:04.691586 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591286-nlv9t" event={"ID":"124e10ce-efae-4551-adc3-aadefc84b7a1","Type":"ContainerDied","Data":"c937cff54aa6d481e886d0a8e884eb6b50ad2ad43a312014b61968cdff61cdd6"} Apr 06 12:06:04 crc kubenswrapper[4790]: I0406 12:06:04.691652 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c937cff54aa6d481e886d0a8e884eb6b50ad2ad43a312014b61968cdff61cdd6" Apr 06 12:06:04 crc kubenswrapper[4790]: I0406 12:06:04.691669 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591286-nlv9t" Apr 06 12:06:05 crc kubenswrapper[4790]: I0406 12:06:05.038765 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591280-h6r65"] Apr 06 12:06:05 crc kubenswrapper[4790]: I0406 12:06:05.042439 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591280-h6r65"] Apr 06 12:06:05 crc kubenswrapper[4790]: I0406 12:06:05.682899 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8abd742f-e504-47d0-ab97-5befd3609dd7" path="/var/lib/kubelet/pods/8abd742f-e504-47d0-ab97-5befd3609dd7/volumes" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.022957 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-11-retry-1-crc"] Apr 06 12:06:13 crc kubenswrapper[4790]: E0406 12:06:13.024038 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="124e10ce-efae-4551-adc3-aadefc84b7a1" containerName="oc" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.024055 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="124e10ce-efae-4551-adc3-aadefc84b7a1" containerName="oc" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.024195 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="124e10ce-efae-4551-adc3-aadefc84b7a1" containerName="oc" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.024694 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.029301 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.029574 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.047283 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-11-retry-1-crc"] Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.097403 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/957e5f26-72c1-4cfb-b417-d8584bab4631-kubelet-dir\") pod \"installer-11-retry-1-crc\" (UID: \"957e5f26-72c1-4cfb-b417-d8584bab4631\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.097459 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/957e5f26-72c1-4cfb-b417-d8584bab4631-kube-api-access\") pod \"installer-11-retry-1-crc\" (UID: \"957e5f26-72c1-4cfb-b417-d8584bab4631\") " 
pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.097554 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/957e5f26-72c1-4cfb-b417-d8584bab4631-var-lock\") pod \"installer-11-retry-1-crc\" (UID: \"957e5f26-72c1-4cfb-b417-d8584bab4631\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.198372 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/957e5f26-72c1-4cfb-b417-d8584bab4631-var-lock\") pod \"installer-11-retry-1-crc\" (UID: \"957e5f26-72c1-4cfb-b417-d8584bab4631\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.198442 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/957e5f26-72c1-4cfb-b417-d8584bab4631-var-lock\") pod \"installer-11-retry-1-crc\" (UID: \"957e5f26-72c1-4cfb-b417-d8584bab4631\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.198481 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/957e5f26-72c1-4cfb-b417-d8584bab4631-kubelet-dir\") pod \"installer-11-retry-1-crc\" (UID: \"957e5f26-72c1-4cfb-b417-d8584bab4631\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.198513 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/957e5f26-72c1-4cfb-b417-d8584bab4631-kube-api-access\") pod \"installer-11-retry-1-crc\" (UID: \"957e5f26-72c1-4cfb-b417-d8584bab4631\") " 
pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.198571 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/957e5f26-72c1-4cfb-b417-d8584bab4631-kubelet-dir\") pod \"installer-11-retry-1-crc\" (UID: \"957e5f26-72c1-4cfb-b417-d8584bab4631\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.219605 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/957e5f26-72c1-4cfb-b417-d8584bab4631-kube-api-access\") pod \"installer-11-retry-1-crc\" (UID: \"957e5f26-72c1-4cfb-b417-d8584bab4631\") " pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.352620 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 06 12:06:13 crc kubenswrapper[4790]: I0406 12:06:13.798705 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-11-retry-1-crc"] Apr 06 12:06:14 crc kubenswrapper[4790]: I0406 12:06:14.769135 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" event={"ID":"957e5f26-72c1-4cfb-b417-d8584bab4631","Type":"ContainerStarted","Data":"39d0c0e35e8cf5fbb52890085d3342c4f3ba0394cdaa033e51cfb3951d55425c"} Apr 06 12:06:14 crc kubenswrapper[4790]: I0406 12:06:14.769494 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" event={"ID":"957e5f26-72c1-4cfb-b417-d8584bab4631","Type":"ContainerStarted","Data":"8245752d8035e8fc235355bedce7881e8393426dd2ac0ee2392e8e19e3a02095"} Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.774086 4790 kubelet.go:2431] 
"SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.775304 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://666fed052055c6c6372d36ac938b4e27b97ec0b944a3eea7e7956fcd6d9f058a" gracePeriod=30 Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.775351 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://7b680c55efccb036ee3a6bcfae81939fe913ce7c9bc8fbb565f257d0d8b401da" gracePeriod=30 Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.775437 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://2582aeab7b9255af8891d3e76d41163036f00c578ec1d9a1223b1732d0941a3d" gracePeriod=30 Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.775392 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://86d1b7c8d0d0d435b0c335d9e6690875a3f71d8c562db2d731ac0de6d1a26758" gracePeriod=30 Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776035 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 06 12:06:46 crc kubenswrapper[4790]: E0406 12:06:46.776262 4790 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-recovery-controller" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776273 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-recovery-controller" Apr 06 12:06:46 crc kubenswrapper[4790]: E0406 12:06:46.776280 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-cert-syncer" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776287 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-cert-syncer" Apr 06 12:06:46 crc kubenswrapper[4790]: E0406 12:06:46.776297 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776303 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 06 12:06:46 crc kubenswrapper[4790]: E0406 12:06:46.776312 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776317 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 06 12:06:46 crc kubenswrapper[4790]: E0406 12:06:46.776325 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776348 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 06 12:06:46 crc 
kubenswrapper[4790]: E0406 12:06:46.776357 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776363 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 06 12:06:46 crc kubenswrapper[4790]: E0406 12:06:46.776371 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776376 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776469 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-recovery-controller" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776480 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776489 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776498 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776503 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager-cert-syncer" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776511 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 06 12:06:46 crc kubenswrapper[4790]: E0406 12:06:46.776596 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776603 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776686 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.776697 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.838572 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"235e9295064844132a05dc40ef3a886a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.838673 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"235e9295064844132a05dc40ef3a886a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.940610 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-resource-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"235e9295064844132a05dc40ef3a886a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.940785 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"235e9295064844132a05dc40ef3a886a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.941125 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"235e9295064844132a05dc40ef3a886a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.941160 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"235e9295064844132a05dc40ef3a886a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.969161 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/2.log" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.970472 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager-cert-syncer/0.log" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.971037 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.971141 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:06:46 crc kubenswrapper[4790]: I0406 12:06:46.975143 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-crc" oldPodUID="f614b9022728cf315e60c057852e563e" podUID="235e9295064844132a05dc40ef3a886a" Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.002151 4790 generic.go:334] "Generic (PLEG): container finished" podID="957e5f26-72c1-4cfb-b417-d8584bab4631" containerID="39d0c0e35e8cf5fbb52890085d3342c4f3ba0394cdaa033e51cfb3951d55425c" exitCode=0 Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.002221 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" event={"ID":"957e5f26-72c1-4cfb-b417-d8584bab4631","Type":"ContainerDied","Data":"39d0c0e35e8cf5fbb52890085d3342c4f3ba0394cdaa033e51cfb3951d55425c"} Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.004992 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/2.log" Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.006660 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager-cert-syncer/0.log" Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.007484 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.007551 4790 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="7b680c55efccb036ee3a6bcfae81939fe913ce7c9bc8fbb565f257d0d8b401da" exitCode=0 Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.007576 4790 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="86d1b7c8d0d0d435b0c335d9e6690875a3f71d8c562db2d731ac0de6d1a26758" exitCode=0 Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.007590 4790 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="2582aeab7b9255af8891d3e76d41163036f00c578ec1d9a1223b1732d0941a3d" exitCode=0 Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.007602 4790 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="666fed052055c6c6372d36ac938b4e27b97ec0b944a3eea7e7956fcd6d9f058a" exitCode=2 Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.007617 4790 scope.go:117] "RemoveContainer" containerID="77bd60c3801a2cc2901f0358fc397c93702d7e422a74151e8a18372622c7f334" Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.007634 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b9dc175454fdf58a70c9257565c92f5730b5a0b3055bcfed3c5bf121d69d8ae" Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.007774 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.030417 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-crc" oldPodUID="f614b9022728cf315e60c057852e563e" podUID="235e9295064844132a05dc40ef3a886a" Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.041614 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"f614b9022728cf315e60c057852e563e\" (UID: \"f614b9022728cf315e60c057852e563e\") " Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.041813 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f614b9022728cf315e60c057852e563e" (UID: "f614b9022728cf315e60c057852e563e"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.042013 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"f614b9022728cf315e60c057852e563e\" (UID: \"f614b9022728cf315e60c057852e563e\") " Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.042070 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f614b9022728cf315e60c057852e563e" (UID: "f614b9022728cf315e60c057852e563e"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.042256 4790 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.042318 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.061768 4790 scope.go:117] "RemoveContainer" containerID="5427bbbe8cc4fbd4c4b664767b3fbd65b58eb0b53e8f092a815090b9e38f31f2" Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.356139 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-crc" oldPodUID="f614b9022728cf315e60c057852e563e" podUID="235e9295064844132a05dc40ef3a886a" Apr 06 12:06:47 crc kubenswrapper[4790]: I0406 12:06:47.685736 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f614b9022728cf315e60c057852e563e" path="/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/volumes" Apr 06 12:06:48 crc kubenswrapper[4790]: I0406 12:06:48.021034 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager-cert-syncer/0.log" Apr 06 12:06:48 crc kubenswrapper[4790]: I0406 12:06:48.275109 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 06 12:06:48 crc kubenswrapper[4790]: I0406 12:06:48.363545 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/957e5f26-72c1-4cfb-b417-d8584bab4631-var-lock\") pod \"957e5f26-72c1-4cfb-b417-d8584bab4631\" (UID: \"957e5f26-72c1-4cfb-b417-d8584bab4631\") " Apr 06 12:06:48 crc kubenswrapper[4790]: I0406 12:06:48.363597 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/957e5f26-72c1-4cfb-b417-d8584bab4631-kube-api-access\") pod \"957e5f26-72c1-4cfb-b417-d8584bab4631\" (UID: \"957e5f26-72c1-4cfb-b417-d8584bab4631\") " Apr 06 12:06:48 crc kubenswrapper[4790]: I0406 12:06:48.363628 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/957e5f26-72c1-4cfb-b417-d8584bab4631-kubelet-dir\") pod \"957e5f26-72c1-4cfb-b417-d8584bab4631\" (UID: \"957e5f26-72c1-4cfb-b417-d8584bab4631\") " Apr 06 12:06:48 crc kubenswrapper[4790]: I0406 12:06:48.363738 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/957e5f26-72c1-4cfb-b417-d8584bab4631-var-lock" (OuterVolumeSpecName: "var-lock") pod "957e5f26-72c1-4cfb-b417-d8584bab4631" (UID: "957e5f26-72c1-4cfb-b417-d8584bab4631"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:06:48 crc kubenswrapper[4790]: I0406 12:06:48.363770 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/957e5f26-72c1-4cfb-b417-d8584bab4631-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "957e5f26-72c1-4cfb-b417-d8584bab4631" (UID: "957e5f26-72c1-4cfb-b417-d8584bab4631"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:06:48 crc kubenswrapper[4790]: I0406 12:06:48.363937 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/957e5f26-72c1-4cfb-b417-d8584bab4631-var-lock\") on node \"crc\" DevicePath \"\"" Apr 06 12:06:48 crc kubenswrapper[4790]: I0406 12:06:48.363953 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/957e5f26-72c1-4cfb-b417-d8584bab4631-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:06:48 crc kubenswrapper[4790]: I0406 12:06:48.370811 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/957e5f26-72c1-4cfb-b417-d8584bab4631-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "957e5f26-72c1-4cfb-b417-d8584bab4631" (UID: "957e5f26-72c1-4cfb-b417-d8584bab4631"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:06:48 crc kubenswrapper[4790]: I0406 12:06:48.464472 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/957e5f26-72c1-4cfb-b417-d8584bab4631-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 12:06:49 crc kubenswrapper[4790]: I0406 12:06:49.031434 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" event={"ID":"957e5f26-72c1-4cfb-b417-d8584bab4631","Type":"ContainerDied","Data":"8245752d8035e8fc235355bedce7881e8393426dd2ac0ee2392e8e19e3a02095"} Apr 06 12:06:49 crc kubenswrapper[4790]: I0406 12:06:49.031515 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8245752d8035e8fc235355bedce7881e8393426dd2ac0ee2392e8e19e3a02095" Apr 06 12:06:49 crc kubenswrapper[4790]: I0406 12:06:49.031625 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-11-retry-1-crc" Apr 06 12:07:00 crc kubenswrapper[4790]: I0406 12:07:00.674552 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:07:00 crc kubenswrapper[4790]: I0406 12:07:00.692224 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="e57a9f92-ef28-4d4a-a0b8-8b8f332b02c3" Apr 06 12:07:00 crc kubenswrapper[4790]: I0406 12:07:00.692262 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="e57a9f92-ef28-4d4a-a0b8-8b8f332b02c3" Apr 06 12:07:00 crc kubenswrapper[4790]: I0406 12:07:00.704361 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 06 12:07:00 crc kubenswrapper[4790]: I0406 12:07:00.705093 4790 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:07:00 crc kubenswrapper[4790]: I0406 12:07:00.713306 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 06 12:07:00 crc kubenswrapper[4790]: I0406 12:07:00.720513 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:07:00 crc kubenswrapper[4790]: I0406 12:07:00.727109 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 06 12:07:01 crc kubenswrapper[4790]: I0406 12:07:01.107695 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"235e9295064844132a05dc40ef3a886a","Type":"ContainerStarted","Data":"fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14"} Apr 06 12:07:01 crc kubenswrapper[4790]: I0406 12:07:01.108124 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"235e9295064844132a05dc40ef3a886a","Type":"ContainerStarted","Data":"afd8c906b708812d8c5306ac8f0ea6f5752b9da77bd622047f1de65badee8bd9"} Apr 06 12:07:02 crc kubenswrapper[4790]: I0406 12:07:02.117597 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"235e9295064844132a05dc40ef3a886a","Type":"ContainerStarted","Data":"2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5"} Apr 06 12:07:02 crc kubenswrapper[4790]: I0406 12:07:02.117662 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"235e9295064844132a05dc40ef3a886a","Type":"ContainerStarted","Data":"99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247"} Apr 06 12:07:02 crc kubenswrapper[4790]: I0406 12:07:02.117687 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"235e9295064844132a05dc40ef3a886a","Type":"ContainerStarted","Data":"37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c"} Apr 06 12:07:02 crc kubenswrapper[4790]: I0406 
12:07:02.138059 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=2.138030932 podStartE2EDuration="2.138030932s" podCreationTimestamp="2026-04-06 12:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:07:02.136471608 +0000 UTC m=+601.124214504" watchObservedRunningTime="2026-04-06 12:07:02.138030932 +0000 UTC m=+601.125773838" Apr 06 12:07:02 crc kubenswrapper[4790]: I0406 12:07:02.229303 4790 scope.go:117] "RemoveContainer" containerID="86d1b7c8d0d0d435b0c335d9e6690875a3f71d8c562db2d731ac0de6d1a26758" Apr 06 12:07:02 crc kubenswrapper[4790]: I0406 12:07:02.243966 4790 scope.go:117] "RemoveContainer" containerID="666fed052055c6c6372d36ac938b4e27b97ec0b944a3eea7e7956fcd6d9f058a" Apr 06 12:07:02 crc kubenswrapper[4790]: I0406 12:07:02.265203 4790 scope.go:117] "RemoveContainer" containerID="243ef026b58efc0cd530c2acc633b25e73f359da87ba4d52c43e99750f1767e1" Apr 06 12:07:02 crc kubenswrapper[4790]: I0406 12:07:02.282713 4790 scope.go:117] "RemoveContainer" containerID="2582aeab7b9255af8891d3e76d41163036f00c578ec1d9a1223b1732d0941a3d" Apr 06 12:07:09 crc kubenswrapper[4790]: I0406 12:07:09.753386 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:07:09 crc kubenswrapper[4790]: I0406 12:07:09.754061 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Apr 06 12:07:10 crc kubenswrapper[4790]: I0406 12:07:10.721727 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:07:10 crc kubenswrapper[4790]: I0406 12:07:10.722132 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:07:10 crc kubenswrapper[4790]: I0406 12:07:10.722154 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:07:10 crc kubenswrapper[4790]: I0406 12:07:10.722177 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:07:10 crc kubenswrapper[4790]: I0406 12:07:10.726063 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:07:10 crc kubenswrapper[4790]: I0406 12:07:10.727996 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:07:11 crc kubenswrapper[4790]: I0406 12:07:11.189233 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:07:11 crc kubenswrapper[4790]: I0406 12:07:11.190137 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.599064 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-8-crc"] Apr 06 12:07:30 crc kubenswrapper[4790]: E0406 12:07:30.599892 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="957e5f26-72c1-4cfb-b417-d8584bab4631" 
containerName="installer" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.599911 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="957e5f26-72c1-4cfb-b417-d8584bab4631" containerName="installer" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.600053 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="957e5f26-72c1-4cfb-b417-d8584bab4631" containerName="installer" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.600710 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-8-crc" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.603621 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.603896 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-8-crc"] Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.604076 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-5vhrm" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.743251 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fc4654d-bb99-4a95-83c7-0769baa96461-kube-api-access\") pod \"installer-8-crc\" (UID: \"6fc4654d-bb99-4a95-83c7-0769baa96461\") " pod="openshift-kube-scheduler/installer-8-crc" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.743344 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fc4654d-bb99-4a95-83c7-0769baa96461-kubelet-dir\") pod \"installer-8-crc\" (UID: \"6fc4654d-bb99-4a95-83c7-0769baa96461\") " pod="openshift-kube-scheduler/installer-8-crc" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.743417 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6fc4654d-bb99-4a95-83c7-0769baa96461-var-lock\") pod \"installer-8-crc\" (UID: \"6fc4654d-bb99-4a95-83c7-0769baa96461\") " pod="openshift-kube-scheduler/installer-8-crc" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.844574 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fc4654d-bb99-4a95-83c7-0769baa96461-kube-api-access\") pod \"installer-8-crc\" (UID: \"6fc4654d-bb99-4a95-83c7-0769baa96461\") " pod="openshift-kube-scheduler/installer-8-crc" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.844650 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fc4654d-bb99-4a95-83c7-0769baa96461-kubelet-dir\") pod \"installer-8-crc\" (UID: \"6fc4654d-bb99-4a95-83c7-0769baa96461\") " pod="openshift-kube-scheduler/installer-8-crc" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.844697 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6fc4654d-bb99-4a95-83c7-0769baa96461-var-lock\") pod \"installer-8-crc\" (UID: \"6fc4654d-bb99-4a95-83c7-0769baa96461\") " pod="openshift-kube-scheduler/installer-8-crc" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.844785 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6fc4654d-bb99-4a95-83c7-0769baa96461-var-lock\") pod \"installer-8-crc\" (UID: \"6fc4654d-bb99-4a95-83c7-0769baa96461\") " pod="openshift-kube-scheduler/installer-8-crc" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.844904 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/6fc4654d-bb99-4a95-83c7-0769baa96461-kubelet-dir\") pod \"installer-8-crc\" (UID: \"6fc4654d-bb99-4a95-83c7-0769baa96461\") " pod="openshift-kube-scheduler/installer-8-crc" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.865659 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fc4654d-bb99-4a95-83c7-0769baa96461-kube-api-access\") pod \"installer-8-crc\" (UID: \"6fc4654d-bb99-4a95-83c7-0769baa96461\") " pod="openshift-kube-scheduler/installer-8-crc" Apr 06 12:07:30 crc kubenswrapper[4790]: I0406 12:07:30.928309 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-8-crc" Apr 06 12:07:31 crc kubenswrapper[4790]: I0406 12:07:31.173659 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-8-crc"] Apr 06 12:07:31 crc kubenswrapper[4790]: I0406 12:07:31.336084 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-8-crc" event={"ID":"6fc4654d-bb99-4a95-83c7-0769baa96461","Type":"ContainerStarted","Data":"48577e7977621b0e3ecbf622e90819d0e6eaf9fb3761e138f40f28f37e9da451"} Apr 06 12:07:32 crc kubenswrapper[4790]: I0406 12:07:32.343934 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-8-crc" event={"ID":"6fc4654d-bb99-4a95-83c7-0769baa96461","Type":"ContainerStarted","Data":"1feff926f638a5c01012b16744470c633e45676a15883b966c139f19d2df7df8"} Apr 06 12:07:32 crc kubenswrapper[4790]: I0406 12:07:32.359310 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-8-crc" podStartSLOduration=2.35929115 podStartE2EDuration="2.35929115s" podCreationTimestamp="2026-04-06 12:07:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 
12:07:32.357186161 +0000 UTC m=+631.344929027" watchObservedRunningTime="2026-04-06 12:07:32.35929115 +0000 UTC m=+631.347034016" Apr 06 12:07:35 crc kubenswrapper[4790]: I0406 12:07:35.769852 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-12-crc"] Apr 06 12:07:35 crc kubenswrapper[4790]: I0406 12:07:35.771385 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-12-crc" Apr 06 12:07:35 crc kubenswrapper[4790]: I0406 12:07:35.774324 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Apr 06 12:07:35 crc kubenswrapper[4790]: I0406 12:07:35.774323 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Apr 06 12:07:35 crc kubenswrapper[4790]: I0406 12:07:35.779767 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-12-crc"] Apr 06 12:07:35 crc kubenswrapper[4790]: I0406 12:07:35.924796 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01d5af0d-6f26-4308-90dc-38ec146c0c5b-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"01d5af0d-6f26-4308-90dc-38ec146c0c5b\") " pod="openshift-kube-controller-manager/revision-pruner-12-crc" Apr 06 12:07:35 crc kubenswrapper[4790]: I0406 12:07:35.924917 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01d5af0d-6f26-4308-90dc-38ec146c0c5b-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"01d5af0d-6f26-4308-90dc-38ec146c0c5b\") " pod="openshift-kube-controller-manager/revision-pruner-12-crc" Apr 06 12:07:36 crc kubenswrapper[4790]: I0406 12:07:36.026005 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01d5af0d-6f26-4308-90dc-38ec146c0c5b-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"01d5af0d-6f26-4308-90dc-38ec146c0c5b\") " pod="openshift-kube-controller-manager/revision-pruner-12-crc" Apr 06 12:07:36 crc kubenswrapper[4790]: I0406 12:07:36.026161 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01d5af0d-6f26-4308-90dc-38ec146c0c5b-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"01d5af0d-6f26-4308-90dc-38ec146c0c5b\") " pod="openshift-kube-controller-manager/revision-pruner-12-crc" Apr 06 12:07:36 crc kubenswrapper[4790]: I0406 12:07:36.026233 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01d5af0d-6f26-4308-90dc-38ec146c0c5b-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"01d5af0d-6f26-4308-90dc-38ec146c0c5b\") " pod="openshift-kube-controller-manager/revision-pruner-12-crc" Apr 06 12:07:36 crc kubenswrapper[4790]: I0406 12:07:36.050764 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01d5af0d-6f26-4308-90dc-38ec146c0c5b-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"01d5af0d-6f26-4308-90dc-38ec146c0c5b\") " pod="openshift-kube-controller-manager/revision-pruner-12-crc" Apr 06 12:07:36 crc kubenswrapper[4790]: I0406 12:07:36.138169 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-12-crc" Apr 06 12:07:36 crc kubenswrapper[4790]: I0406 12:07:36.409779 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-12-crc"] Apr 06 12:07:37 crc kubenswrapper[4790]: I0406 12:07:37.381816 4790 generic.go:334] "Generic (PLEG): container finished" podID="01d5af0d-6f26-4308-90dc-38ec146c0c5b" containerID="eab62af4095a12ba4715a9d890e7e8f8a05c7117a30b67594f9ad89f4c169a53" exitCode=0 Apr 06 12:07:37 crc kubenswrapper[4790]: I0406 12:07:37.381965 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-12-crc" event={"ID":"01d5af0d-6f26-4308-90dc-38ec146c0c5b","Type":"ContainerDied","Data":"eab62af4095a12ba4715a9d890e7e8f8a05c7117a30b67594f9ad89f4c169a53"} Apr 06 12:07:37 crc kubenswrapper[4790]: I0406 12:07:37.382319 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-12-crc" event={"ID":"01d5af0d-6f26-4308-90dc-38ec146c0c5b","Type":"ContainerStarted","Data":"4a6722d855c5b1cf4e8d6ef1d37afc95f53730d25a77574ae6154360141be6c2"} Apr 06 12:07:37 crc kubenswrapper[4790]: I0406 12:07:37.958843 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-12-crc"] Apr 06 12:07:37 crc kubenswrapper[4790]: I0406 12:07:37.959594 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-12-crc"
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:37.991740 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-12-crc"]
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.050540 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b10f590-77e2-4257-8c4a-0ef40347ddc9-kubelet-dir\") pod \"installer-12-crc\" (UID: \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.050767 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7b10f590-77e2-4257-8c4a-0ef40347ddc9-var-lock\") pod \"installer-12-crc\" (UID: \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.050968 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b10f590-77e2-4257-8c4a-0ef40347ddc9-kube-api-access\") pod \"installer-12-crc\" (UID: \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.152634 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7b10f590-77e2-4257-8c4a-0ef40347ddc9-var-lock\") pod \"installer-12-crc\" (UID: \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.152695 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b10f590-77e2-4257-8c4a-0ef40347ddc9-kube-api-access\") pod \"installer-12-crc\" (UID: \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.152727 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b10f590-77e2-4257-8c4a-0ef40347ddc9-kubelet-dir\") pod \"installer-12-crc\" (UID: \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.152749 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7b10f590-77e2-4257-8c4a-0ef40347ddc9-var-lock\") pod \"installer-12-crc\" (UID: \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.152799 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b10f590-77e2-4257-8c4a-0ef40347ddc9-kubelet-dir\") pod \"installer-12-crc\" (UID: \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.170334 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b10f590-77e2-4257-8c4a-0ef40347ddc9-kube-api-access\") pod \"installer-12-crc\" (UID: \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\") " pod="openshift-kube-controller-manager/installer-12-crc"
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.339381 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-12-crc"
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.617767 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-12-crc"]
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.652900 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-12-crc"
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.761434 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01d5af0d-6f26-4308-90dc-38ec146c0c5b-kube-api-access\") pod \"01d5af0d-6f26-4308-90dc-38ec146c0c5b\" (UID: \"01d5af0d-6f26-4308-90dc-38ec146c0c5b\") "
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.761991 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01d5af0d-6f26-4308-90dc-38ec146c0c5b-kubelet-dir\") pod \"01d5af0d-6f26-4308-90dc-38ec146c0c5b\" (UID: \"01d5af0d-6f26-4308-90dc-38ec146c0c5b\") "
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.762075 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01d5af0d-6f26-4308-90dc-38ec146c0c5b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "01d5af0d-6f26-4308-90dc-38ec146c0c5b" (UID: "01d5af0d-6f26-4308-90dc-38ec146c0c5b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.763256 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01d5af0d-6f26-4308-90dc-38ec146c0c5b-kubelet-dir\") on node \"crc\" DevicePath \"\""
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.767881 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d5af0d-6f26-4308-90dc-38ec146c0c5b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "01d5af0d-6f26-4308-90dc-38ec146c0c5b" (UID: "01d5af0d-6f26-4308-90dc-38ec146c0c5b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:07:38 crc kubenswrapper[4790]: I0406 12:07:38.864662 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01d5af0d-6f26-4308-90dc-38ec146c0c5b-kube-api-access\") on node \"crc\" DevicePath \"\""
Apr 06 12:07:39 crc kubenswrapper[4790]: I0406 12:07:39.402395 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-12-crc" event={"ID":"7b10f590-77e2-4257-8c4a-0ef40347ddc9","Type":"ContainerStarted","Data":"d2e04a9e44862dff785782bb2d076adab360972dd651708174bf793e9e1f5bc6"}
Apr 06 12:07:39 crc kubenswrapper[4790]: I0406 12:07:39.402484 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-12-crc" event={"ID":"7b10f590-77e2-4257-8c4a-0ef40347ddc9","Type":"ContainerStarted","Data":"2e0fdba1759fcddf10bf102b4fcf7eb82d797a81c0daa7fb31c970c924b15673"}
Apr 06 12:07:39 crc kubenswrapper[4790]: I0406 12:07:39.405106 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-12-crc" event={"ID":"01d5af0d-6f26-4308-90dc-38ec146c0c5b","Type":"ContainerDied","Data":"4a6722d855c5b1cf4e8d6ef1d37afc95f53730d25a77574ae6154360141be6c2"}
Apr 06 12:07:39 crc kubenswrapper[4790]: I0406 12:07:39.405168 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a6722d855c5b1cf4e8d6ef1d37afc95f53730d25a77574ae6154360141be6c2"
Apr 06 12:07:39 crc kubenswrapper[4790]: I0406 12:07:39.405236 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-12-crc"
Apr 06 12:07:39 crc kubenswrapper[4790]: I0406 12:07:39.431649 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-12-crc" podStartSLOduration=2.431631082 podStartE2EDuration="2.431631082s" podCreationTimestamp="2026-04-06 12:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:07:39.428438232 +0000 UTC m=+638.416181138" watchObservedRunningTime="2026-04-06 12:07:39.431631082 +0000 UTC m=+638.419373948"
Apr 06 12:07:39 crc kubenswrapper[4790]: I0406 12:07:39.754283 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 06 12:07:39 crc kubenswrapper[4790]: I0406 12:07:39.754709 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.618434 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-tm4jb"]
Apr 06 12:07:42 crc kubenswrapper[4790]: E0406 12:07:42.619095 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d5af0d-6f26-4308-90dc-38ec146c0c5b" containerName="pruner"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.619132 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d5af0d-6f26-4308-90dc-38ec146c0c5b" containerName="pruner"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.619284 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d5af0d-6f26-4308-90dc-38ec146c0c5b" containerName="pruner"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.619765 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-tm4jb"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.621824 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-gjczx"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.622185 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.622185 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.625159 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n5t6m"]
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.626683 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n5t6m"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.630371 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-z7hj5"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.631749 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-jmqjq"]
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.632611 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-jmqjq"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.634279 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-tqjjd"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.638430 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-jmqjq"]
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.658021 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n5t6m"]
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.666594 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-tm4jb"]
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.711223 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9lmz\" (UniqueName: \"kubernetes.io/projected/0aeb284c-cf28-4dc2-aa1c-d43f80e4fba7-kube-api-access-v9lmz\") pod \"cert-manager-webhook-687f57d79b-tm4jb\" (UID: \"0aeb284c-cf28-4dc2-aa1c-d43f80e4fba7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-tm4jb"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.711286 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd74n\" (UniqueName: \"kubernetes.io/projected/ec02624d-3d5a-423d-818a-1422646a42a9-kube-api-access-cd74n\") pod \"cert-manager-cainjector-cf98fcc89-n5t6m\" (UID: \"ec02624d-3d5a-423d-818a-1422646a42a9\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n5t6m"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.711351 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shnht\" (UniqueName: \"kubernetes.io/projected/7b2dea07-951f-4a31-ae96-5465449fbae8-kube-api-access-shnht\") pod \"cert-manager-858654f9db-jmqjq\" (UID: \"7b2dea07-951f-4a31-ae96-5465449fbae8\") " pod="cert-manager/cert-manager-858654f9db-jmqjq"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.812821 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shnht\" (UniqueName: \"kubernetes.io/projected/7b2dea07-951f-4a31-ae96-5465449fbae8-kube-api-access-shnht\") pod \"cert-manager-858654f9db-jmqjq\" (UID: \"7b2dea07-951f-4a31-ae96-5465449fbae8\") " pod="cert-manager/cert-manager-858654f9db-jmqjq"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.813120 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9lmz\" (UniqueName: \"kubernetes.io/projected/0aeb284c-cf28-4dc2-aa1c-d43f80e4fba7-kube-api-access-v9lmz\") pod \"cert-manager-webhook-687f57d79b-tm4jb\" (UID: \"0aeb284c-cf28-4dc2-aa1c-d43f80e4fba7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-tm4jb"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.813285 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd74n\" (UniqueName: \"kubernetes.io/projected/ec02624d-3d5a-423d-818a-1422646a42a9-kube-api-access-cd74n\") pod \"cert-manager-cainjector-cf98fcc89-n5t6m\" (UID: \"ec02624d-3d5a-423d-818a-1422646a42a9\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n5t6m"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.838599 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd74n\" (UniqueName: \"kubernetes.io/projected/ec02624d-3d5a-423d-818a-1422646a42a9-kube-api-access-cd74n\") pod \"cert-manager-cainjector-cf98fcc89-n5t6m\" (UID: \"ec02624d-3d5a-423d-818a-1422646a42a9\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n5t6m"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.841302 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9lmz\" (UniqueName: \"kubernetes.io/projected/0aeb284c-cf28-4dc2-aa1c-d43f80e4fba7-kube-api-access-v9lmz\") pod \"cert-manager-webhook-687f57d79b-tm4jb\" (UID: \"0aeb284c-cf28-4dc2-aa1c-d43f80e4fba7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-tm4jb"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.842622 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shnht\" (UniqueName: \"kubernetes.io/projected/7b2dea07-951f-4a31-ae96-5465449fbae8-kube-api-access-shnht\") pod \"cert-manager-858654f9db-jmqjq\" (UID: \"7b2dea07-951f-4a31-ae96-5465449fbae8\") " pod="cert-manager/cert-manager-858654f9db-jmqjq"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.940103 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-tm4jb"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.951216 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n5t6m"
Apr 06 12:07:42 crc kubenswrapper[4790]: I0406 12:07:42.958668 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-jmqjq"
Apr 06 12:07:43 crc kubenswrapper[4790]: I0406 12:07:43.162660 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-tm4jb"]
Apr 06 12:07:43 crc kubenswrapper[4790]: W0406 12:07:43.173689 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aeb284c_cf28_4dc2_aa1c_d43f80e4fba7.slice/crio-fdef56393ff31930e5e79549c20228c35748c015d208915af8fef3f93e1b99c7 WatchSource:0}: Error finding container fdef56393ff31930e5e79549c20228c35748c015d208915af8fef3f93e1b99c7: Status 404 returned error can't find the container with id fdef56393ff31930e5e79549c20228c35748c015d208915af8fef3f93e1b99c7
Apr 06 12:07:43 crc kubenswrapper[4790]: I0406 12:07:43.224166 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n5t6m"]
Apr 06 12:07:43 crc kubenswrapper[4790]: I0406 12:07:43.422664 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-jmqjq"]
Apr 06 12:07:43 crc kubenswrapper[4790]: I0406 12:07:43.436768 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n5t6m" event={"ID":"ec02624d-3d5a-423d-818a-1422646a42a9","Type":"ContainerStarted","Data":"e3ebad227e42eec78c8fbc871944bb69677a34a5e6c600a171c0c3a9792e8916"}
Apr 06 12:07:43 crc kubenswrapper[4790]: I0406 12:07:43.437936 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-tm4jb" event={"ID":"0aeb284c-cf28-4dc2-aa1c-d43f80e4fba7","Type":"ContainerStarted","Data":"fdef56393ff31930e5e79549c20228c35748c015d208915af8fef3f93e1b99c7"}
Apr 06 12:07:43 crc kubenswrapper[4790]: I0406 12:07:43.438753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-jmqjq" event={"ID":"7b2dea07-951f-4a31-ae96-5465449fbae8","Type":"ContainerStarted","Data":"a7309134f298ab433648b75f1bae5607bb5a2640a5e00b68328b5148c6fa3d5e"}
Apr 06 12:07:45 crc kubenswrapper[4790]: I0406 12:07:45.021336 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"]
Apr 06 12:07:45 crc kubenswrapper[4790]: I0406 12:07:45.022493 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc"
Apr 06 12:07:45 crc kubenswrapper[4790]: I0406 12:07:45.026244 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Apr 06 12:07:45 crc kubenswrapper[4790]: I0406 12:07:45.026530 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Apr 06 12:07:45 crc kubenswrapper[4790]: I0406 12:07:45.029170 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"]
Apr 06 12:07:45 crc kubenswrapper[4790]: I0406 12:07:45.144908 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ab45f9-e862-4258-83ad-9bbdb50c0531-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"70ab45f9-e862-4258-83ad-9bbdb50c0531\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Apr 06 12:07:45 crc kubenswrapper[4790]: I0406 12:07:45.144954 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70ab45f9-e862-4258-83ad-9bbdb50c0531-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"70ab45f9-e862-4258-83ad-9bbdb50c0531\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Apr 06 12:07:45 crc kubenswrapper[4790]: I0406 12:07:45.246641 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ab45f9-e862-4258-83ad-9bbdb50c0531-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"70ab45f9-e862-4258-83ad-9bbdb50c0531\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Apr 06 12:07:45 crc kubenswrapper[4790]: I0406 12:07:45.246700 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70ab45f9-e862-4258-83ad-9bbdb50c0531-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"70ab45f9-e862-4258-83ad-9bbdb50c0531\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Apr 06 12:07:45 crc kubenswrapper[4790]: I0406 12:07:45.246777 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ab45f9-e862-4258-83ad-9bbdb50c0531-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"70ab45f9-e862-4258-83ad-9bbdb50c0531\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Apr 06 12:07:45 crc kubenswrapper[4790]: I0406 12:07:45.283118 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70ab45f9-e862-4258-83ad-9bbdb50c0531-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"70ab45f9-e862-4258-83ad-9bbdb50c0531\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Apr 06 12:07:45 crc kubenswrapper[4790]: I0406 12:07:45.349600 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc"
Apr 06 12:07:46 crc kubenswrapper[4790]: I0406 12:07:46.386262 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"]
Apr 06 12:07:46 crc kubenswrapper[4790]: W0406 12:07:46.820393 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod70ab45f9_e862_4258_83ad_9bbdb50c0531.slice/crio-7ef5298d80eb42e8456c4cd163ed2868d4cd52d9e6a1c230cdeb0624bc71c212 WatchSource:0}: Error finding container 7ef5298d80eb42e8456c4cd163ed2868d4cd52d9e6a1c230cdeb0624bc71c212: Status 404 returned error can't find the container with id 7ef5298d80eb42e8456c4cd163ed2868d4cd52d9e6a1c230cdeb0624bc71c212
Apr 06 12:07:47 crc kubenswrapper[4790]: I0406 12:07:47.466719 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-jmqjq" event={"ID":"7b2dea07-951f-4a31-ae96-5465449fbae8","Type":"ContainerStarted","Data":"7a8df8ef8fdb7a60b04cfb580ca2143c0e08077cf541f0a0a0ff82dea8cbe9cf"}
Apr 06 12:07:47 crc kubenswrapper[4790]: I0406 12:07:47.469708 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"70ab45f9-e862-4258-83ad-9bbdb50c0531","Type":"ContainerStarted","Data":"688ae84df562aa26de9c5bf675fd6da50567e5da988ed35a90ad465874995610"}
Apr 06 12:07:47 crc kubenswrapper[4790]: I0406 12:07:47.469749 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"70ab45f9-e862-4258-83ad-9bbdb50c0531","Type":"ContainerStarted","Data":"7ef5298d80eb42e8456c4cd163ed2868d4cd52d9e6a1c230cdeb0624bc71c212"}
Apr 06 12:07:47 crc kubenswrapper[4790]: I0406 12:07:47.472294 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n5t6m" event={"ID":"ec02624d-3d5a-423d-818a-1422646a42a9","Type":"ContainerStarted","Data":"cfc2179178df0d70055695864cb18ddabe5a00a0afe65809e20d2bafb1cecd61"}
Apr 06 12:07:47 crc kubenswrapper[4790]: I0406 12:07:47.473574 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-tm4jb" event={"ID":"0aeb284c-cf28-4dc2-aa1c-d43f80e4fba7","Type":"ContainerStarted","Data":"2be7ff2be921654410b59af591bf6db1c39712457eb272f74f02f0904055303c"}
Apr 06 12:07:47 crc kubenswrapper[4790]: I0406 12:07:47.473810 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-tm4jb"
Apr 06 12:07:47 crc kubenswrapper[4790]: I0406 12:07:47.489254 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-jmqjq" podStartSLOduration=1.906570462 podStartE2EDuration="5.48920709s" podCreationTimestamp="2026-04-06 12:07:42 +0000 UTC" firstStartedPulling="2026-04-06 12:07:43.428474846 +0000 UTC m=+642.416217712" lastFinishedPulling="2026-04-06 12:07:47.011111474 +0000 UTC m=+645.998854340" observedRunningTime="2026-04-06 12:07:47.483740887 +0000 UTC m=+646.471483763" watchObservedRunningTime="2026-04-06 12:07:47.48920709 +0000 UTC m=+646.476949956"
Apr 06 12:07:47 crc kubenswrapper[4790]: I0406 12:07:47.497903 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n5t6m" podStartSLOduration=1.792693581 podStartE2EDuration="5.497880913s" podCreationTimestamp="2026-04-06 12:07:42 +0000 UTC" firstStartedPulling="2026-04-06 12:07:43.228239825 +0000 UTC m=+642.215982701" lastFinishedPulling="2026-04-06 12:07:46.933427167 +0000 UTC m=+645.921170033" observedRunningTime="2026-04-06 12:07:47.495412024 +0000 UTC m=+646.483154910" watchObservedRunningTime="2026-04-06 12:07:47.497880913 +0000 UTC m=+646.485623789"
Apr 06 12:07:47 crc kubenswrapper[4790]: I0406 12:07:47.516124 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-11-crc" podStartSLOduration=3.516100454 podStartE2EDuration="3.516100454s" podCreationTimestamp="2026-04-06 12:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:07:47.514064267 +0000 UTC m=+646.501807123" watchObservedRunningTime="2026-04-06 12:07:47.516100454 +0000 UTC m=+646.503843320"
Apr 06 12:07:47 crc kubenswrapper[4790]: I0406 12:07:47.534496 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-tm4jb" podStartSLOduration=1.777027682 podStartE2EDuration="5.534468918s" podCreationTimestamp="2026-04-06 12:07:42 +0000 UTC" firstStartedPulling="2026-04-06 12:07:43.176731202 +0000 UTC m=+642.164474058" lastFinishedPulling="2026-04-06 12:07:46.934172438 +0000 UTC m=+645.921915294" observedRunningTime="2026-04-06 12:07:47.531744382 +0000 UTC m=+646.519487248" watchObservedRunningTime="2026-04-06 12:07:47.534468918 +0000 UTC m=+646.522211794"
Apr 06 12:07:48 crc kubenswrapper[4790]: I0406 12:07:48.483027 4790 generic.go:334] "Generic (PLEG): container finished" podID="70ab45f9-e862-4258-83ad-9bbdb50c0531" containerID="688ae84df562aa26de9c5bf675fd6da50567e5da988ed35a90ad465874995610" exitCode=0
Apr 06 12:07:48 crc kubenswrapper[4790]: I0406 12:07:48.483090 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"70ab45f9-e862-4258-83ad-9bbdb50c0531","Type":"ContainerDied","Data":"688ae84df562aa26de9c5bf675fd6da50567e5da988ed35a90ad465874995610"}
Apr 06 12:07:49 crc kubenswrapper[4790]: I0406 12:07:49.699853 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc"
Apr 06 12:07:49 crc kubenswrapper[4790]: I0406 12:07:49.808062 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ab45f9-e862-4258-83ad-9bbdb50c0531-kubelet-dir\") pod \"70ab45f9-e862-4258-83ad-9bbdb50c0531\" (UID: \"70ab45f9-e862-4258-83ad-9bbdb50c0531\") "
Apr 06 12:07:49 crc kubenswrapper[4790]: I0406 12:07:49.808139 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70ab45f9-e862-4258-83ad-9bbdb50c0531-kube-api-access\") pod \"70ab45f9-e862-4258-83ad-9bbdb50c0531\" (UID: \"70ab45f9-e862-4258-83ad-9bbdb50c0531\") "
Apr 06 12:07:49 crc kubenswrapper[4790]: I0406 12:07:49.808200 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70ab45f9-e862-4258-83ad-9bbdb50c0531-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "70ab45f9-e862-4258-83ad-9bbdb50c0531" (UID: "70ab45f9-e862-4258-83ad-9bbdb50c0531"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Apr 06 12:07:49 crc kubenswrapper[4790]: I0406 12:07:49.808499 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ab45f9-e862-4258-83ad-9bbdb50c0531-kubelet-dir\") on node \"crc\" DevicePath \"\""
Apr 06 12:07:49 crc kubenswrapper[4790]: I0406 12:07:49.817037 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ab45f9-e862-4258-83ad-9bbdb50c0531-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "70ab45f9-e862-4258-83ad-9bbdb50c0531" (UID: "70ab45f9-e862-4258-83ad-9bbdb50c0531"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:07:49 crc kubenswrapper[4790]: I0406 12:07:49.909271 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70ab45f9-e862-4258-83ad-9bbdb50c0531-kube-api-access\") on node \"crc\" DevicePath \"\""
Apr 06 12:07:50 crc kubenswrapper[4790]: I0406 12:07:50.501659 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"70ab45f9-e862-4258-83ad-9bbdb50c0531","Type":"ContainerDied","Data":"7ef5298d80eb42e8456c4cd163ed2868d4cd52d9e6a1c230cdeb0624bc71c212"}
Apr 06 12:07:50 crc kubenswrapper[4790]: I0406 12:07:50.501725 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc"
Apr 06 12:07:50 crc kubenswrapper[4790]: I0406 12:07:50.501729 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ef5298d80eb42e8456c4cd163ed2868d4cd52d9e6a1c230cdeb0624bc71c212"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.211505 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-11-crc"]
Apr 06 12:07:52 crc kubenswrapper[4790]: E0406 12:07:52.211754 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ab45f9-e862-4258-83ad-9bbdb50c0531" containerName="pruner"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.211767 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ab45f9-e862-4258-83ad-9bbdb50c0531" containerName="pruner"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.211911 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ab45f9-e862-4258-83ad-9bbdb50c0531" containerName="pruner"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.212263 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-11-crc"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.215312 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.215492 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.220939 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-11-crc"]
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.341694 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae18ba0c-922e-43b1-8a32-85e498925b20-kubelet-dir\") pod \"installer-11-crc\" (UID: \"ae18ba0c-922e-43b1-8a32-85e498925b20\") " pod="openshift-kube-apiserver/installer-11-crc"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.341768 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae18ba0c-922e-43b1-8a32-85e498925b20-var-lock\") pod \"installer-11-crc\" (UID: \"ae18ba0c-922e-43b1-8a32-85e498925b20\") " pod="openshift-kube-apiserver/installer-11-crc"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.341937 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae18ba0c-922e-43b1-8a32-85e498925b20-kube-api-access\") pod \"installer-11-crc\" (UID: \"ae18ba0c-922e-43b1-8a32-85e498925b20\") " pod="openshift-kube-apiserver/installer-11-crc"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.443336 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae18ba0c-922e-43b1-8a32-85e498925b20-kubelet-dir\") pod \"installer-11-crc\" (UID: \"ae18ba0c-922e-43b1-8a32-85e498925b20\") " pod="openshift-kube-apiserver/installer-11-crc"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.443437 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae18ba0c-922e-43b1-8a32-85e498925b20-var-lock\") pod \"installer-11-crc\" (UID: \"ae18ba0c-922e-43b1-8a32-85e498925b20\") " pod="openshift-kube-apiserver/installer-11-crc"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.443495 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae18ba0c-922e-43b1-8a32-85e498925b20-kube-api-access\") pod \"installer-11-crc\" (UID: \"ae18ba0c-922e-43b1-8a32-85e498925b20\") " pod="openshift-kube-apiserver/installer-11-crc"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.443540 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae18ba0c-922e-43b1-8a32-85e498925b20-kubelet-dir\") pod \"installer-11-crc\" (UID: \"ae18ba0c-922e-43b1-8a32-85e498925b20\") " pod="openshift-kube-apiserver/installer-11-crc"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.443657 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae18ba0c-922e-43b1-8a32-85e498925b20-var-lock\") pod \"installer-11-crc\" (UID: \"ae18ba0c-922e-43b1-8a32-85e498925b20\") " pod="openshift-kube-apiserver/installer-11-crc"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.472811 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae18ba0c-922e-43b1-8a32-85e498925b20-kube-api-access\") pod \"installer-11-crc\" (UID: \"ae18ba0c-922e-43b1-8a32-85e498925b20\") " pod="openshift-kube-apiserver/installer-11-crc"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.530645 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-11-crc"
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.667526 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5c5h9"]
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.668314 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="ovn-controller" containerID="cri-o://6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93" gracePeriod=30
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.668779 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="sbdb" containerID="cri-o://086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a" gracePeriod=30
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.673559 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="nbdb" containerID="cri-o://1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d" gracePeriod=30
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.673667 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="kube-rbac-proxy-node" containerID="cri-o://2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3" gracePeriod=30
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.673856 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="northd" containerID="cri-o://a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455" gracePeriod=30
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.673920 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319" gracePeriod=30
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.673974 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="ovn-acl-logging" containerID="cri-o://cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7" gracePeriod=30
Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.713444 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="ovnkube-controller" containerID="cri-o://4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8" gracePeriod=30
Apr 06 12:07:52 crc kubenswrapper[4790]: E0406 12:07:52.747301 4790 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-11-crc_openshift-kube-apiserver_ae18ba0c-922e-43b1-8a32-85e498925b20_0(d31c2b7d890843f14e05d3bc5b92be55c297ba3d69b02aae5d598cc1abdf53ac): error adding pod openshift-kube-apiserver_installer-11-crc to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}"
Apr 06 12:07:52 crc kubenswrapper[4790]: E0406 12:07:52.747454 4790 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-11-crc_openshift-kube-apiserver_ae18ba0c-922e-43b1-8a32-85e498925b20_0(d31c2b7d890843f14e05d3bc5b92be55c297ba3d69b02aae5d598cc1abdf53ac): error adding pod openshift-kube-apiserver_installer-11-crc to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" pod="openshift-kube-apiserver/installer-11-crc"
Apr 06 12:07:52 crc kubenswrapper[4790]: E0406 12:07:52.747524 4790 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-11-crc_openshift-kube-apiserver_ae18ba0c-922e-43b1-8a32-85e498925b20_0(d31c2b7d890843f14e05d3bc5b92be55c297ba3d69b02aae5d598cc1abdf53ac): error adding pod openshift-kube-apiserver_installer-11-crc to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd
(shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" pod="openshift-kube-apiserver/installer-11-crc" Apr 06 12:07:52 crc kubenswrapper[4790]: E0406 12:07:52.747622 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"installer-11-crc_openshift-kube-apiserver(ae18ba0c-922e-43b1-8a32-85e498925b20)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"installer-11-crc_openshift-kube-apiserver(ae18ba0c-922e-43b1-8a32-85e498925b20)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-11-crc_openshift-kube-apiserver_ae18ba0c-922e-43b1-8a32-85e498925b20_0(d31c2b7d890843f14e05d3bc5b92be55c297ba3d69b02aae5d598cc1abdf53ac): error adding pod openshift-kube-apiserver_installer-11-crc to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): failed to send CNI request: Post \\\"http://dummy/cni\\\": EOF: StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-kube-apiserver/installer-11-crc" podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" Apr 06 
12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.943532 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-tm4jb" Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.946489 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5c5h9_20b621ff-e3f4-40ce-9c77-2292304e36af/ovn-acl-logging/0.log" Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.947098 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5c5h9_20b621ff-e3f4-40ce-9c77-2292304e36af/ovn-controller/0.log" Apr 06 12:07:52 crc kubenswrapper[4790]: I0406 12:07:52.948139 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.013798 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-f4lsb"] Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.014058 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="northd" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014070 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="northd" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.014078 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="kube-rbac-proxy-ovn-metrics" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014086 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="kube-rbac-proxy-ovn-metrics" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.014099 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" 
containerName="kubecfg-setup" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014106 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="kubecfg-setup" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.014115 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="nbdb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014120 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="nbdb" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.014130 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="ovn-acl-logging" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014137 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="ovn-acl-logging" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.014147 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="sbdb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014153 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="sbdb" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.014162 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="ovnkube-controller" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014168 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="ovnkube-controller" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.014179 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="ovn-controller" Apr 06 12:07:53 crc 
kubenswrapper[4790]: I0406 12:07:53.014185 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="ovn-controller" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.014193 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="kube-rbac-proxy-node" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014198 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="kube-rbac-proxy-node" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014284 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="ovn-acl-logging" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014297 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="kube-rbac-proxy-node" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014307 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="sbdb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014316 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="kube-rbac-proxy-ovn-metrics" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014323 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="ovnkube-controller" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014331 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="northd" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014340 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="ovn-controller" 
Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.014349 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerName="nbdb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.016038 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052461 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-kubelet\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052516 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-ovnkube-script-lib\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052548 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk5cd\" (UniqueName: \"kubernetes.io/projected/20b621ff-e3f4-40ce-9c77-2292304e36af-kube-api-access-rk5cd\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052550 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052569 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-slash\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052624 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-slash" (OuterVolumeSpecName: "host-slash") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052647 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-env-overrides\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052676 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-run-netns\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052699 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-var-lib-openvswitch\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052723 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-ovnkube-config\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052746 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-openvswitch\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052793 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-ovn\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052866 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-cni-bin\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052898 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-cni-netd\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052924 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-node-log\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " 
Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052942 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-systemd\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052971 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-etc-openvswitch\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052995 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053006 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052986 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053041 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053051 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-node-log" (OuterVolumeSpecName: "node-log") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053028 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053063 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.052999 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-var-lib-cni-networks-ovn-kubernetes\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053080 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053112 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053101 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053169 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-systemd-units\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053192 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053200 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-run-ovn-kubernetes\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053224 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053231 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20b621ff-e3f4-40ce-9c77-2292304e36af-ovn-node-metrics-cert\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053253 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-log-socket\") pod \"20b621ff-e3f4-40ce-9c77-2292304e36af\" (UID: \"20b621ff-e3f4-40ce-9c77-2292304e36af\") " Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053371 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-log-socket" (OuterVolumeSpecName: "log-socket") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053469 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053501 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). 
InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053761 4790 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-cni-netd\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053780 4790 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-node-log\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053791 4790 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053804 4790 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053818 4790 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-systemd-units\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053834 4790 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053862 4790 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-log-socket\") on node 
\"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053873 4790 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-kubelet\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053884 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053894 4790 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-slash\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053905 4790 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-env-overrides\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053915 4790 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-run-netns\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053927 4790 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053938 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20b621ff-e3f4-40ce-9c77-2292304e36af-ovnkube-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053948 4790 reconciler_common.go:293] 
"Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-openvswitch\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053958 4790 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-ovn\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.053967 4790 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-host-cni-bin\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.057447 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b621ff-e3f4-40ce-9c77-2292304e36af-kube-api-access-rk5cd" (OuterVolumeSpecName: "kube-api-access-rk5cd") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "kube-api-access-rk5cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.057480 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b621ff-e3f4-40ce-9c77-2292304e36af-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.065022 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "20b621ff-e3f4-40ce-9c77-2292304e36af" (UID: "20b621ff-e3f4-40ce-9c77-2292304e36af"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.155707 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-slash\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.155797 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-run-openvswitch\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.155902 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8qb6\" (UniqueName: \"kubernetes.io/projected/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-kube-api-access-c8qb6\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.155971 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-run-ovn\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156072 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156188 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-ovn-node-metrics-cert\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156250 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-log-socket\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156289 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-cni-netd\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156311 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-ovnkube-config\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156394 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-run-systemd\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156434 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-kubelet\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156498 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-node-log\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156521 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-cni-bin\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156562 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-systemd-units\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156625 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-etc-openvswitch\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156671 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-var-lib-openvswitch\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156710 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-env-overrides\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156732 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-run-ovn-kubernetes\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156753 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-run-netns\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156785 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-ovnkube-script-lib\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156873 4790 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/20b621ff-e3f4-40ce-9c77-2292304e36af-run-systemd\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156887 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20b621ff-e3f4-40ce-9c77-2292304e36af-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.156896 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk5cd\" (UniqueName: \"kubernetes.io/projected/20b621ff-e3f4-40ce-9c77-2292304e36af-kube-api-access-rk5cd\") on node \"crc\" DevicePath \"\"" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258345 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-log-socket\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258439 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-cni-netd\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258475 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-ovnkube-config\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258469 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-log-socket\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258520 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-run-systemd\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258583 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-kubelet\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258588 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-run-systemd\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258621 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-node-log\") pod 
\"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258624 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-cni-netd\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258655 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-node-log\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258662 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-cni-bin\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258696 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-cni-bin\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258705 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-systemd-units\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 
06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258738 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-systemd-units\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258778 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-etc-openvswitch\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258754 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-etc-openvswitch\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258692 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-kubelet\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258829 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-var-lib-openvswitch\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258893 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-var-lib-openvswitch\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258899 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-env-overrides\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258931 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-run-netns\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258952 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-run-ovn-kubernetes\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.258983 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-ovnkube-script-lib\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.259005 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-run-netns\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.259044 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-slash\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.259047 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-run-ovn-kubernetes\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.259073 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-run-openvswitch\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.259094 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-slash\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.259097 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8qb6\" (UniqueName: 
\"kubernetes.io/projected/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-kube-api-access-c8qb6\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.259149 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-run-ovn\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.259176 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-run-openvswitch\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.259186 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.259219 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.259262 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-run-ovn\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.259292 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-ovn-node-metrics-cert\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.259697 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-env-overrides\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.259961 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-ovnkube-script-lib\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.260276 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-ovnkube-config\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.264835 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-ovn-node-metrics-cert\") pod 
\"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.279018 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8qb6\" (UniqueName: \"kubernetes.io/projected/3a9a867b-2a6c-49b5-a81c-5c1bd2918da8-kube-api-access-c8qb6\") pod \"ovnkube-node-f4lsb\" (UID: \"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8\") " pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.329469 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:53 crc kubenswrapper[4790]: W0406 12:07:53.347275 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a9a867b_2a6c_49b5_a81c_5c1bd2918da8.slice/crio-eb3404829a406c8f827a906607542ae55bcd3c222539fa229381255c2dba8a25 WatchSource:0}: Error finding container eb3404829a406c8f827a906607542ae55bcd3c222539fa229381255c2dba8a25: Status 404 returned error can't find the container with id eb3404829a406c8f827a906607542ae55bcd3c222539fa229381255c2dba8a25 Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.524532 4790 generic.go:334] "Generic (PLEG): container finished" podID="3a9a867b-2a6c-49b5-a81c-5c1bd2918da8" containerID="8a2741813cd8d7805762181a5d7cfd14c8587a1219b6971938f580f01acecc26" exitCode=0 Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.524650 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" event={"ID":"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8","Type":"ContainerDied","Data":"8a2741813cd8d7805762181a5d7cfd14c8587a1219b6971938f580f01acecc26"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.524693 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" 
event={"ID":"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8","Type":"ContainerStarted","Data":"eb3404829a406c8f827a906607542ae55bcd3c222539fa229381255c2dba8a25"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.530834 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dskdf_d912ce2d-76e2-4f0a-ae77-91adf71ddfc0/kube-multus/0.log" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.530909 4790 generic.go:334] "Generic (PLEG): container finished" podID="d912ce2d-76e2-4f0a-ae77-91adf71ddfc0" containerID="85983ff10b47fb8052c360f6676528d359e3f3bc9376a17b6137543728ee190b" exitCode=2 Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.530975 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dskdf" event={"ID":"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0","Type":"ContainerDied","Data":"85983ff10b47fb8052c360f6676528d359e3f3bc9376a17b6137543728ee190b"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.531327 4790 scope.go:117] "RemoveContainer" containerID="85983ff10b47fb8052c360f6676528d359e3f3bc9376a17b6137543728ee190b" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.538418 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5c5h9_20b621ff-e3f4-40ce-9c77-2292304e36af/ovn-acl-logging/0.log" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.539171 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5c5h9_20b621ff-e3f4-40ce-9c77-2292304e36af/ovn-controller/0.log" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.539910 4790 generic.go:334] "Generic (PLEG): container finished" podID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerID="4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8" exitCode=0 Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.539942 4790 generic.go:334] "Generic (PLEG): container finished" podID="20b621ff-e3f4-40ce-9c77-2292304e36af" 
containerID="086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a" exitCode=0 Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.539957 4790 generic.go:334] "Generic (PLEG): container finished" podID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerID="1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d" exitCode=0 Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.539978 4790 generic.go:334] "Generic (PLEG): container finished" podID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerID="a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455" exitCode=0 Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.539991 4790 generic.go:334] "Generic (PLEG): container finished" podID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerID="5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319" exitCode=0 Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.540005 4790 generic.go:334] "Generic (PLEG): container finished" podID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerID="2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3" exitCode=0 Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.540018 4790 generic.go:334] "Generic (PLEG): container finished" podID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerID="cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7" exitCode=143 Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.540154 4790 generic.go:334] "Generic (PLEG): container finished" podID="20b621ff-e3f4-40ce-9c77-2292304e36af" containerID="6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93" exitCode=143 Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.540227 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-11-crc" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.540814 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-11-crc" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.541231 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543556 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerDied","Data":"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543622 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerDied","Data":"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543636 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerDied","Data":"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543647 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerDied","Data":"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543657 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerDied","Data":"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543667 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerDied","Data":"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543677 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543687 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543693 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543699 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerDied","Data":"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543707 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543713 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543720 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543725 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543730 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543735 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543742 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543747 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543752 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543758 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerDied","Data":"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543766 4790 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543773 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543777 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543782 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543788 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543793 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543799 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543804 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543810 4790 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543816 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5c5h9" event={"ID":"20b621ff-e3f4-40ce-9c77-2292304e36af","Type":"ContainerDied","Data":"dd68895988583af3c2840fed35c366a402a23711879ff2077e1804c4643d70b7"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543828 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543846 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543852 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543858 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543862 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543868 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3"} Apr 06 
12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543873 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543878 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543882 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d"} Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.543896 4790 scope.go:117] "RemoveContainer" containerID="4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.604007 4790 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-11-crc_openshift-kube-apiserver_ae18ba0c-922e-43b1-8a32-85e498925b20_0(0d74bdfe54338458d415e8b736c05ff39cd812520e72b551a0b9dbbb98d4e3c0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.604346 4790 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-11-crc_openshift-kube-apiserver_ae18ba0c-922e-43b1-8a32-85e498925b20_0(0d74bdfe54338458d415e8b736c05ff39cd812520e72b551a0b9dbbb98d4e3c0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-kube-apiserver/installer-11-crc" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.604369 4790 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-11-crc_openshift-kube-apiserver_ae18ba0c-922e-43b1-8a32-85e498925b20_0(0d74bdfe54338458d415e8b736c05ff39cd812520e72b551a0b9dbbb98d4e3c0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-kube-apiserver/installer-11-crc" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.604417 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"installer-11-crc_openshift-kube-apiserver(ae18ba0c-922e-43b1-8a32-85e498925b20)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"installer-11-crc_openshift-kube-apiserver(ae18ba0c-922e-43b1-8a32-85e498925b20)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-11-crc_openshift-kube-apiserver_ae18ba0c-922e-43b1-8a32-85e498925b20_0(0d74bdfe54338458d415e8b736c05ff39cd812520e72b551a0b9dbbb98d4e3c0): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-kube-apiserver/installer-11-crc" podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.629809 4790 scope.go:117] "RemoveContainer" containerID="086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.668379 4790 scope.go:117] "RemoveContainer" containerID="1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.689528 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5c5h9"] Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.690187 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5c5h9"] Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.702301 4790 scope.go:117] "RemoveContainer" containerID="a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.718012 4790 scope.go:117] "RemoveContainer" containerID="5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.735665 4790 scope.go:117] "RemoveContainer" containerID="2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.759164 4790 scope.go:117] "RemoveContainer" containerID="cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.803480 4790 scope.go:117] "RemoveContainer" containerID="6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.829364 4790 scope.go:117] "RemoveContainer" containerID="3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.853543 4790 scope.go:117] 
"RemoveContainer" containerID="4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.855947 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8\": container with ID starting with 4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8 not found: ID does not exist" containerID="4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.855994 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8"} err="failed to get container status \"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8\": rpc error: code = NotFound desc = could not find container \"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8\": container with ID starting with 4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.856020 4790 scope.go:117] "RemoveContainer" containerID="086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.856387 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a\": container with ID starting with 086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a not found: ID does not exist" containerID="086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.856423 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a"} err="failed to get container status \"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a\": rpc error: code = NotFound desc = could not find container \"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a\": container with ID starting with 086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.856446 4790 scope.go:117] "RemoveContainer" containerID="1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.856724 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d\": container with ID starting with 1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d not found: ID does not exist" containerID="1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.856755 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d"} err="failed to get container status \"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d\": rpc error: code = NotFound desc = could not find container \"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d\": container with ID starting with 1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.856775 4790 scope.go:117] "RemoveContainer" containerID="a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.857524 4790 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455\": container with ID starting with a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455 not found: ID does not exist" containerID="a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.857553 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455"} err="failed to get container status \"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455\": rpc error: code = NotFound desc = could not find container \"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455\": container with ID starting with a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.857567 4790 scope.go:117] "RemoveContainer" containerID="5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.857821 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319\": container with ID starting with 5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319 not found: ID does not exist" containerID="5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.857861 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319"} err="failed to get container status \"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319\": rpc error: code = NotFound desc = could not find container 
\"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319\": container with ID starting with 5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.857875 4790 scope.go:117] "RemoveContainer" containerID="2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.858262 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3\": container with ID starting with 2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3 not found: ID does not exist" containerID="2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.858293 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3"} err="failed to get container status \"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3\": rpc error: code = NotFound desc = could not find container \"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3\": container with ID starting with 2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.858308 4790 scope.go:117] "RemoveContainer" containerID="cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.858575 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7\": container with ID starting with cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7 not found: ID does not exist" 
containerID="cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.858598 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7"} err="failed to get container status \"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7\": rpc error: code = NotFound desc = could not find container \"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7\": container with ID starting with cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.858618 4790 scope.go:117] "RemoveContainer" containerID="6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.859035 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93\": container with ID starting with 6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93 not found: ID does not exist" containerID="6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.859064 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93"} err="failed to get container status \"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93\": rpc error: code = NotFound desc = could not find container \"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93\": container with ID starting with 6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.859080 4790 scope.go:117] 
"RemoveContainer" containerID="3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d" Apr 06 12:07:53 crc kubenswrapper[4790]: E0406 12:07:53.859373 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d\": container with ID starting with 3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d not found: ID does not exist" containerID="3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.859395 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d"} err="failed to get container status \"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d\": rpc error: code = NotFound desc = could not find container \"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d\": container with ID starting with 3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.859409 4790 scope.go:117] "RemoveContainer" containerID="4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.859683 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8"} err="failed to get container status \"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8\": rpc error: code = NotFound desc = could not find container \"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8\": container with ID starting with 4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.859706 4790 
scope.go:117] "RemoveContainer" containerID="086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.859945 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a"} err="failed to get container status \"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a\": rpc error: code = NotFound desc = could not find container \"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a\": container with ID starting with 086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.859963 4790 scope.go:117] "RemoveContainer" containerID="1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.860259 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d"} err="failed to get container status \"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d\": rpc error: code = NotFound desc = could not find container \"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d\": container with ID starting with 1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.860280 4790 scope.go:117] "RemoveContainer" containerID="a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.860712 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455"} err="failed to get container status \"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455\": rpc 
error: code = NotFound desc = could not find container \"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455\": container with ID starting with a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.860733 4790 scope.go:117] "RemoveContainer" containerID="5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.861094 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319"} err="failed to get container status \"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319\": rpc error: code = NotFound desc = could not find container \"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319\": container with ID starting with 5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.861115 4790 scope.go:117] "RemoveContainer" containerID="2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.861386 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3"} err="failed to get container status \"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3\": rpc error: code = NotFound desc = could not find container \"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3\": container with ID starting with 2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.861405 4790 scope.go:117] "RemoveContainer" containerID="cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7" Apr 06 12:07:53 crc 
kubenswrapper[4790]: I0406 12:07:53.861690 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7"} err="failed to get container status \"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7\": rpc error: code = NotFound desc = could not find container \"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7\": container with ID starting with cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.861716 4790 scope.go:117] "RemoveContainer" containerID="6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.861989 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93"} err="failed to get container status \"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93\": rpc error: code = NotFound desc = could not find container \"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93\": container with ID starting with 6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.862011 4790 scope.go:117] "RemoveContainer" containerID="3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.862263 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d"} err="failed to get container status \"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d\": rpc error: code = NotFound desc = could not find container \"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d\": container 
with ID starting with 3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.862282 4790 scope.go:117] "RemoveContainer" containerID="4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.862557 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8"} err="failed to get container status \"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8\": rpc error: code = NotFound desc = could not find container \"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8\": container with ID starting with 4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.862579 4790 scope.go:117] "RemoveContainer" containerID="086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.862879 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a"} err="failed to get container status \"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a\": rpc error: code = NotFound desc = could not find container \"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a\": container with ID starting with 086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.862897 4790 scope.go:117] "RemoveContainer" containerID="1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.863118 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d"} err="failed to get container status \"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d\": rpc error: code = NotFound desc = could not find container \"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d\": container with ID starting with 1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.863134 4790 scope.go:117] "RemoveContainer" containerID="a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.863503 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455"} err="failed to get container status \"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455\": rpc error: code = NotFound desc = could not find container \"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455\": container with ID starting with a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.863522 4790 scope.go:117] "RemoveContainer" containerID="5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.863830 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319"} err="failed to get container status \"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319\": rpc error: code = NotFound desc = could not find container \"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319\": container with ID starting with 5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319 not found: ID does not 
exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.863886 4790 scope.go:117] "RemoveContainer" containerID="2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.864325 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3"} err="failed to get container status \"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3\": rpc error: code = NotFound desc = could not find container \"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3\": container with ID starting with 2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.864348 4790 scope.go:117] "RemoveContainer" containerID="cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.864550 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7"} err="failed to get container status \"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7\": rpc error: code = NotFound desc = could not find container \"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7\": container with ID starting with cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.864568 4790 scope.go:117] "RemoveContainer" containerID="6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.864740 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93"} err="failed to get container status 
\"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93\": rpc error: code = NotFound desc = could not find container \"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93\": container with ID starting with 6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.864867 4790 scope.go:117] "RemoveContainer" containerID="3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.865249 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d"} err="failed to get container status \"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d\": rpc error: code = NotFound desc = could not find container \"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d\": container with ID starting with 3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.865275 4790 scope.go:117] "RemoveContainer" containerID="4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.865516 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8"} err="failed to get container status \"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8\": rpc error: code = NotFound desc = could not find container \"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8\": container with ID starting with 4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.865547 4790 scope.go:117] "RemoveContainer" 
containerID="086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.866053 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a"} err="failed to get container status \"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a\": rpc error: code = NotFound desc = could not find container \"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a\": container with ID starting with 086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.866074 4790 scope.go:117] "RemoveContainer" containerID="1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.867442 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d"} err="failed to get container status \"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d\": rpc error: code = NotFound desc = could not find container \"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d\": container with ID starting with 1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.867471 4790 scope.go:117] "RemoveContainer" containerID="a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.868589 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455"} err="failed to get container status \"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455\": rpc error: code = NotFound desc = could 
not find container \"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455\": container with ID starting with a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.868636 4790 scope.go:117] "RemoveContainer" containerID="5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.869058 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319"} err="failed to get container status \"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319\": rpc error: code = NotFound desc = could not find container \"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319\": container with ID starting with 5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.869081 4790 scope.go:117] "RemoveContainer" containerID="2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.869465 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3"} err="failed to get container status \"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3\": rpc error: code = NotFound desc = could not find container \"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3\": container with ID starting with 2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.869487 4790 scope.go:117] "RemoveContainer" containerID="cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 
12:07:53.869846 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7"} err="failed to get container status \"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7\": rpc error: code = NotFound desc = could not find container \"cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7\": container with ID starting with cbf3db1d2f5285c947fa836df378a468f9f49e93a9c89d865e76f2c02ace72f7 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.869868 4790 scope.go:117] "RemoveContainer" containerID="6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.870222 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93"} err="failed to get container status \"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93\": rpc error: code = NotFound desc = could not find container \"6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93\": container with ID starting with 6607ace28beffcd3015c9d6aacac86facb4227434209eff15699e22985891a93 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.870244 4790 scope.go:117] "RemoveContainer" containerID="3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.870573 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d"} err="failed to get container status \"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d\": rpc error: code = NotFound desc = could not find container \"3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d\": container with ID starting with 
3a90d8de1577353467fe6f4535922584ccd8b0f692294836cec452edc467982d not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.870593 4790 scope.go:117] "RemoveContainer" containerID="4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.870910 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8"} err="failed to get container status \"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8\": rpc error: code = NotFound desc = could not find container \"4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8\": container with ID starting with 4dec304116694cd671bb7e6e86e2ae9bac6499d67a95ea1e2eec6cc603842cc8 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.870932 4790 scope.go:117] "RemoveContainer" containerID="086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.871241 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a"} err="failed to get container status \"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a\": rpc error: code = NotFound desc = could not find container \"086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a\": container with ID starting with 086b96a34ba93f5badb85ef9281ca6c8cacc68d51011002f3e327618618c285a not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.871260 4790 scope.go:117] "RemoveContainer" containerID="1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.871638 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d"} err="failed to get container status \"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d\": rpc error: code = NotFound desc = could not find container \"1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d\": container with ID starting with 1a36328526e405b180d39ff432fffcadbcf0cdee52bd2cd9cdfc472a12e5026d not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.871672 4790 scope.go:117] "RemoveContainer" containerID="a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.871923 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455"} err="failed to get container status \"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455\": rpc error: code = NotFound desc = could not find container \"a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455\": container with ID starting with a7c91a4800442677d0feecbfc33cedec4e283673a416a35538df242301cda455 not found: ID does not exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.871938 4790 scope.go:117] "RemoveContainer" containerID="5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.872538 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319"} err="failed to get container status \"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319\": rpc error: code = NotFound desc = could not find container \"5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319\": container with ID starting with 5f5c7e7d3b27d92aadaa1de37dfad729d15c4d7307bc1dc74dc6eab719f39319 not found: ID does not 
exist" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.872580 4790 scope.go:117] "RemoveContainer" containerID="2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3" Apr 06 12:07:53 crc kubenswrapper[4790]: I0406 12:07:53.873071 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3"} err="failed to get container status \"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3\": rpc error: code = NotFound desc = could not find container \"2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3\": container with ID starting with 2e8ba36099c7d4dcde9de1921a854deffb657a09d0efa74cfef4c6e14ab1c9e3 not found: ID does not exist" Apr 06 12:07:54 crc kubenswrapper[4790]: I0406 12:07:54.553706 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" event={"ID":"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8","Type":"ContainerStarted","Data":"24a097551c754a39aed0e4a84fddf06b998666861dcbd72c5109b23932fa4224"} Apr 06 12:07:54 crc kubenswrapper[4790]: I0406 12:07:54.555171 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" event={"ID":"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8","Type":"ContainerStarted","Data":"5deeefea5449531f46fce91bdd7c576eab4f1840be3b23689d58a67afa076100"} Apr 06 12:07:54 crc kubenswrapper[4790]: I0406 12:07:54.555269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" event={"ID":"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8","Type":"ContainerStarted","Data":"c36c5d9ffe19e22fbeef7e44ef9c5cbd2453ff4bf6466a7826a330109545bac8"} Apr 06 12:07:54 crc kubenswrapper[4790]: I0406 12:07:54.555373 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" 
event={"ID":"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8","Type":"ContainerStarted","Data":"d7fbf2bc74adf88247ba80b323439e76c4f80f3d1965ab2d5b1744870a0156a5"} Apr 06 12:07:54 crc kubenswrapper[4790]: I0406 12:07:54.555502 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" event={"ID":"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8","Type":"ContainerStarted","Data":"6f7bb802dcaa226c5f6f70031260e13a29fc884426173a16d93d7130233b3013"} Apr 06 12:07:54 crc kubenswrapper[4790]: I0406 12:07:54.555620 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" event={"ID":"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8","Type":"ContainerStarted","Data":"ee0d7e8172fe80bb9ae579b4d3d1a4499152483b037723e3ca3261680fd01a84"} Apr 06 12:07:54 crc kubenswrapper[4790]: I0406 12:07:54.557264 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dskdf_d912ce2d-76e2-4f0a-ae77-91adf71ddfc0/kube-multus/0.log" Apr 06 12:07:54 crc kubenswrapper[4790]: I0406 12:07:54.557352 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dskdf" event={"ID":"d912ce2d-76e2-4f0a-ae77-91adf71ddfc0","Type":"ContainerStarted","Data":"8b72cb69fb97c605fd06d781cc9fe4d1871868c76375138c28999bfee013e0da"} Apr 06 12:07:55 crc kubenswrapper[4790]: I0406 12:07:55.689261 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b621ff-e3f4-40ce-9c77-2292304e36af" path="/var/lib/kubelet/pods/20b621ff-e3f4-40ce-9c77-2292304e36af/volumes" Apr 06 12:07:56 crc kubenswrapper[4790]: I0406 12:07:56.578353 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" event={"ID":"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8","Type":"ContainerStarted","Data":"ef7ca42c378b25c62ae621547789ce52ae3619145bcd9a49a844b3dad83a2ea5"} Apr 06 12:07:59 crc kubenswrapper[4790]: I0406 12:07:59.602969 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" event={"ID":"3a9a867b-2a6c-49b5-a81c-5c1bd2918da8","Type":"ContainerStarted","Data":"1ed03812c2cf35235e4266415c419804d7a3df9ce07b6800b6cf8ca44eaa7e8f"} Apr 06 12:07:59 crc kubenswrapper[4790]: I0406 12:07:59.604024 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:07:59 crc kubenswrapper[4790]: I0406 12:07:59.634100 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" podStartSLOduration=7.634075569 podStartE2EDuration="7.634075569s" podCreationTimestamp="2026-04-06 12:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:07:59.629993708 +0000 UTC m=+658.617736614" watchObservedRunningTime="2026-04-06 12:07:59.634075569 +0000 UTC m=+658.621818475" Apr 06 12:07:59 crc kubenswrapper[4790]: I0406 12:07:59.636379 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.176470 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591288-k92zx"] Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.177530 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.180114 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.180198 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.180222 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.280030 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjxh\" (UniqueName: \"kubernetes.io/projected/ebfb6550-5136-437e-addd-e7f853434761-kube-api-access-9jjxh\") pod \"auto-csr-approver-29591288-k92zx\" (UID: \"ebfb6550-5136-437e-addd-e7f853434761\") " pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.381528 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjxh\" (UniqueName: \"kubernetes.io/projected/ebfb6550-5136-437e-addd-e7f853434761-kube-api-access-9jjxh\") pod \"auto-csr-approver-29591288-k92zx\" (UID: \"ebfb6550-5136-437e-addd-e7f853434761\") " pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.407593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjxh\" (UniqueName: \"kubernetes.io/projected/ebfb6550-5136-437e-addd-e7f853434761-kube-api-access-9jjxh\") pod \"auto-csr-approver-29591288-k92zx\" (UID: \"ebfb6550-5136-437e-addd-e7f853434761\") " pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.497041 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:00 crc kubenswrapper[4790]: E0406 12:08:00.540979 4790 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29591288-k92zx_openshift-infra_ebfb6550-5136-437e-addd-e7f853434761_0(41a9d8dac4d382e4e23b85c5de12d115639177e7a409c87a6782e5012ccde388): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 06 12:08:00 crc kubenswrapper[4790]: E0406 12:08:00.541050 4790 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29591288-k92zx_openshift-infra_ebfb6550-5136-437e-addd-e7f853434761_0(41a9d8dac4d382e4e23b85c5de12d115639177e7a409c87a6782e5012ccde388): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:00 crc kubenswrapper[4790]: E0406 12:08:00.541071 4790 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29591288-k92zx_openshift-infra_ebfb6550-5136-437e-addd-e7f853434761_0(41a9d8dac4d382e4e23b85c5de12d115639177e7a409c87a6782e5012ccde388): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:00 crc kubenswrapper[4790]: E0406 12:08:00.541117 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29591288-k92zx_openshift-infra(ebfb6550-5136-437e-addd-e7f853434761)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29591288-k92zx_openshift-infra(ebfb6550-5136-437e-addd-e7f853434761)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29591288-k92zx_openshift-infra_ebfb6550-5136-437e-addd-e7f853434761_0(41a9d8dac4d382e4e23b85c5de12d115639177e7a409c87a6782e5012ccde388): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29591288-k92zx" podUID="ebfb6550-5136-437e-addd-e7f853434761" Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.610740 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.610787 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.644689 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.812820 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591288-k92zx"] Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.813082 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:00 crc kubenswrapper[4790]: I0406 12:08:00.814725 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:00 crc kubenswrapper[4790]: E0406 12:08:00.844638 4790 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29591288-k92zx_openshift-infra_ebfb6550-5136-437e-addd-e7f853434761_0(8752299610ddc90df711fca40895d857d01ad7a6ed3de9d2422b5d8495fae48f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 06 12:08:00 crc kubenswrapper[4790]: E0406 12:08:00.844697 4790 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29591288-k92zx_openshift-infra_ebfb6550-5136-437e-addd-e7f853434761_0(8752299610ddc90df711fca40895d857d01ad7a6ed3de9d2422b5d8495fae48f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:00 crc kubenswrapper[4790]: E0406 12:08:00.844717 4790 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29591288-k92zx_openshift-infra_ebfb6550-5136-437e-addd-e7f853434761_0(8752299610ddc90df711fca40895d857d01ad7a6ed3de9d2422b5d8495fae48f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:00 crc kubenswrapper[4790]: E0406 12:08:00.844761 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29591288-k92zx_openshift-infra(ebfb6550-5136-437e-addd-e7f853434761)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29591288-k92zx_openshift-infra(ebfb6550-5136-437e-addd-e7f853434761)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29591288-k92zx_openshift-infra_ebfb6550-5136-437e-addd-e7f853434761_0(8752299610ddc90df711fca40895d857d01ad7a6ed3de9d2422b5d8495fae48f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29591288-k92zx" podUID="ebfb6550-5136-437e-addd-e7f853434761" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.691638 4790 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.692163 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler" containerID="cri-o://642a8714da2ce927c59666c6a55e69c883f9283af786c866dcf1219b7c295bf8" gracePeriod=30 Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.692179 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-recovery-controller" containerID="cri-o://efb99e3c83fa8b086e34268cf2ed4f95342b9e349fd28152e399c332bc18f5ff" gracePeriod=30 Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.692231 4790 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-cert-syncer" containerID="cri-o://2a0bda650d85589544b66809493d1f094798ba4128af4572a641b923de6a1128" gracePeriod=30 Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.692562 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 06 12:08:02 crc kubenswrapper[4790]: E0406 12:08:02.692732 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815516d0756bb9282f4d0a28cef72670" containerName="wait-for-host-port" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.692743 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="815516d0756bb9282f4d0a28cef72670" containerName="wait-for-host-port" Apr 06 12:08:02 crc kubenswrapper[4790]: E0406 12:08:02.692753 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-cert-syncer" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.692759 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-cert-syncer" Apr 06 12:08:02 crc kubenswrapper[4790]: E0406 12:08:02.692770 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.692775 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler" Apr 06 12:08:02 crc kubenswrapper[4790]: E0406 12:08:02.692797 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-recovery-controller" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.692802 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="815516d0756bb9282f4d0a28cef72670" 
containerName="kube-scheduler-recovery-controller" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.692920 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-recovery-controller" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.692933 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.692948 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="815516d0756bb9282f4d0a28cef72670" containerName="kube-scheduler-cert-syncer" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.813012 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d8fd3797d07faa04d98c33c6c96ee09f-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"d8fd3797d07faa04d98c33c6c96ee09f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.813541 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d8fd3797d07faa04d98c33c6c96ee09f-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"d8fd3797d07faa04d98c33c6c96ee09f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.859980 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-crc_815516d0756bb9282f4d0a28cef72670/kube-scheduler-cert-syncer/0.log" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.860798 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.863783 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" oldPodUID="815516d0756bb9282f4d0a28cef72670" podUID="d8fd3797d07faa04d98c33c6c96ee09f" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.914904 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d8fd3797d07faa04d98c33c6c96ee09f-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"d8fd3797d07faa04d98c33c6c96ee09f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.915005 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d8fd3797d07faa04d98c33c6c96ee09f-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"d8fd3797d07faa04d98c33c6c96ee09f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.915090 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d8fd3797d07faa04d98c33c6c96ee09f-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"d8fd3797d07faa04d98c33c6c96ee09f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:08:02 crc kubenswrapper[4790]: I0406 12:08:02.915162 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d8fd3797d07faa04d98c33c6c96ee09f-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"d8fd3797d07faa04d98c33c6c96ee09f\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.015534 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-resource-dir\") pod \"815516d0756bb9282f4d0a28cef72670\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.015566 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-cert-dir\") pod \"815516d0756bb9282f4d0a28cef72670\" (UID: \"815516d0756bb9282f4d0a28cef72670\") " Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.015667 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "815516d0756bb9282f4d0a28cef72670" (UID: "815516d0756bb9282f4d0a28cef72670"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.015696 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "815516d0756bb9282f4d0a28cef72670" (UID: "815516d0756bb9282f4d0a28cef72670"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.016044 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.016074 4790 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/815516d0756bb9282f4d0a28cef72670-cert-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.645080 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-crc_815516d0756bb9282f4d0a28cef72670/kube-scheduler-cert-syncer/0.log" Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.645939 4790 generic.go:334] "Generic (PLEG): container finished" podID="815516d0756bb9282f4d0a28cef72670" containerID="efb99e3c83fa8b086e34268cf2ed4f95342b9e349fd28152e399c332bc18f5ff" exitCode=0 Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.645971 4790 generic.go:334] "Generic (PLEG): container finished" podID="815516d0756bb9282f4d0a28cef72670" containerID="2a0bda650d85589544b66809493d1f094798ba4128af4572a641b923de6a1128" exitCode=2 Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.645980 4790 generic.go:334] "Generic (PLEG): container finished" podID="815516d0756bb9282f4d0a28cef72670" containerID="642a8714da2ce927c59666c6a55e69c883f9283af786c866dcf1219b7c295bf8" exitCode=0 Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.646060 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d9079c674d04974dfecf9717c58fab50170503deb7b33c8e7d8e10d80c10e38" Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.646053 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.648779 4790 generic.go:334] "Generic (PLEG): container finished" podID="6fc4654d-bb99-4a95-83c7-0769baa96461" containerID="1feff926f638a5c01012b16744470c633e45676a15883b966c139f19d2df7df8" exitCode=0 Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.648811 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-8-crc" event={"ID":"6fc4654d-bb99-4a95-83c7-0769baa96461","Type":"ContainerDied","Data":"1feff926f638a5c01012b16744470c633e45676a15883b966c139f19d2df7df8"} Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.650087 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" oldPodUID="815516d0756bb9282f4d0a28cef72670" podUID="d8fd3797d07faa04d98c33c6c96ee09f" Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.669964 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" oldPodUID="815516d0756bb9282f4d0a28cef72670" podUID="d8fd3797d07faa04d98c33c6c96ee09f" Apr 06 12:08:03 crc kubenswrapper[4790]: I0406 12:08:03.685783 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815516d0756bb9282f4d0a28cef72670" path="/var/lib/kubelet/pods/815516d0756bb9282f4d0a28cef72670/volumes" Apr 06 12:08:04 crc kubenswrapper[4790]: I0406 12:08:04.674963 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-11-crc" Apr 06 12:08:04 crc kubenswrapper[4790]: I0406 12:08:04.675848 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-11-crc" Apr 06 12:08:04 crc kubenswrapper[4790]: I0406 12:08:04.901965 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-8-crc" Apr 06 12:08:04 crc kubenswrapper[4790]: I0406 12:08:04.932135 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-11-crc"] Apr 06 12:08:04 crc kubenswrapper[4790]: W0406 12:08:04.943218 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podae18ba0c_922e_43b1_8a32_85e498925b20.slice/crio-e652b7b86dfca95743e576756eb4d7dc71fe6a09a617c625a12bb887a9c265db WatchSource:0}: Error finding container e652b7b86dfca95743e576756eb4d7dc71fe6a09a617c625a12bb887a9c265db: Status 404 returned error can't find the container with id e652b7b86dfca95743e576756eb4d7dc71fe6a09a617c625a12bb887a9c265db Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.046301 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fc4654d-bb99-4a95-83c7-0769baa96461-kube-api-access\") pod \"6fc4654d-bb99-4a95-83c7-0769baa96461\" (UID: \"6fc4654d-bb99-4a95-83c7-0769baa96461\") " Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.046518 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fc4654d-bb99-4a95-83c7-0769baa96461-kubelet-dir\") pod \"6fc4654d-bb99-4a95-83c7-0769baa96461\" (UID: \"6fc4654d-bb99-4a95-83c7-0769baa96461\") " Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.046546 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6fc4654d-bb99-4a95-83c7-0769baa96461-var-lock\") pod \"6fc4654d-bb99-4a95-83c7-0769baa96461\" (UID: \"6fc4654d-bb99-4a95-83c7-0769baa96461\") " Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.046986 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/6fc4654d-bb99-4a95-83c7-0769baa96461-var-lock" (OuterVolumeSpecName: "var-lock") pod "6fc4654d-bb99-4a95-83c7-0769baa96461" (UID: "6fc4654d-bb99-4a95-83c7-0769baa96461"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.047040 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fc4654d-bb99-4a95-83c7-0769baa96461-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6fc4654d-bb99-4a95-83c7-0769baa96461" (UID: "6fc4654d-bb99-4a95-83c7-0769baa96461"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.057564 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fc4654d-bb99-4a95-83c7-0769baa96461-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6fc4654d-bb99-4a95-83c7-0769baa96461" (UID: "6fc4654d-bb99-4a95-83c7-0769baa96461"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.148059 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fc4654d-bb99-4a95-83c7-0769baa96461-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.148115 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6fc4654d-bb99-4a95-83c7-0769baa96461-var-lock\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.148128 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fc4654d-bb99-4a95-83c7-0769baa96461-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.664623 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-8-crc" event={"ID":"6fc4654d-bb99-4a95-83c7-0769baa96461","Type":"ContainerDied","Data":"48577e7977621b0e3ecbf622e90819d0e6eaf9fb3761e138f40f28f37e9da451"} Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.664664 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48577e7977621b0e3ecbf622e90819d0e6eaf9fb3761e138f40f28f37e9da451" Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.664713 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-8-crc" Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.668135 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-11-crc" event={"ID":"ae18ba0c-922e-43b1-8a32-85e498925b20","Type":"ContainerStarted","Data":"c8df8fbe7a1f578b66ea6ed11ca12c75a5be5a651b8fb283ee808bb04a0a6c41"} Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.668205 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-11-crc" event={"ID":"ae18ba0c-922e-43b1-8a32-85e498925b20","Type":"ContainerStarted","Data":"e652b7b86dfca95743e576756eb4d7dc71fe6a09a617c625a12bb887a9c265db"} Apr 06 12:08:05 crc kubenswrapper[4790]: I0406 12:08:05.696395 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-11-crc" podStartSLOduration=13.696355572 podStartE2EDuration="13.696355572s" podCreationTimestamp="2026-04-06 12:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:08:05.690986766 +0000 UTC m=+664.678729632" watchObservedRunningTime="2026-04-06 12:08:05.696355572 +0000 UTC m=+664.684098478" Apr 06 12:08:09 crc kubenswrapper[4790]: I0406 12:08:09.753465 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:08:09 crc kubenswrapper[4790]: I0406 12:08:09.754050 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Apr 06 12:08:09 crc kubenswrapper[4790]: I0406 12:08:09.754094 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 12:08:09 crc kubenswrapper[4790]: I0406 12:08:09.754606 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b2a44c53b178b9bcd5b5bef115e30badb21f6cf04ef10c42dd5c6e3679e16df"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 12:08:09 crc kubenswrapper[4790]: I0406 12:08:09.754657 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://4b2a44c53b178b9bcd5b5bef115e30badb21f6cf04ef10c42dd5c6e3679e16df" gracePeriod=600 Apr 06 12:08:10 crc kubenswrapper[4790]: I0406 12:08:10.699751 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="4b2a44c53b178b9bcd5b5bef115e30badb21f6cf04ef10c42dd5c6e3679e16df" exitCode=0 Apr 06 12:08:10 crc kubenswrapper[4790]: I0406 12:08:10.699799 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"4b2a44c53b178b9bcd5b5bef115e30badb21f6cf04ef10c42dd5c6e3679e16df"} Apr 06 12:08:10 crc kubenswrapper[4790]: I0406 12:08:10.700059 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"a600eeb7976d392fa3a056d87383315f86d322ed123278808846f0172ac67622"} Apr 06 
12:08:10 crc kubenswrapper[4790]: I0406 12:08:10.700078 4790 scope.go:117] "RemoveContainer" containerID="f074a0dbf2caed5f314b385c985d923b4f08f77e264f1213a2b7855601973ce2" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.672592 4790 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.672949 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager" containerID="cri-o://fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14" gracePeriod=30 Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.673039 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247" gracePeriod=30 Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.673099 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="235e9295064844132a05dc40ef3a886a" containerName="cluster-policy-controller" containerID="cri-o://37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c" gracePeriod=30 Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.673184 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5" gracePeriod=30 Apr 06 12:08:11 crc kubenswrapper[4790]: 
I0406 12:08:11.692370 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 06 12:08:11 crc kubenswrapper[4790]: E0406 12:08:11.693046 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager-recovery-controller" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.693078 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager-recovery-controller" Apr 06 12:08:11 crc kubenswrapper[4790]: E0406 12:08:11.693096 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fc4654d-bb99-4a95-83c7-0769baa96461" containerName="installer" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.693109 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc4654d-bb99-4a95-83c7-0769baa96461" containerName="installer" Apr 06 12:08:11 crc kubenswrapper[4790]: E0406 12:08:11.693138 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235e9295064844132a05dc40ef3a886a" containerName="cluster-policy-controller" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.693151 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="235e9295064844132a05dc40ef3a886a" containerName="cluster-policy-controller" Apr 06 12:08:11 crc kubenswrapper[4790]: E0406 12:08:11.693172 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.693184 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager" Apr 06 12:08:11 crc kubenswrapper[4790]: E0406 12:08:11.693202 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235e9295064844132a05dc40ef3a886a" 
containerName="kube-controller-manager-cert-syncer" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.693215 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager-cert-syncer" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.693391 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager-cert-syncer" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.693418 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="235e9295064844132a05dc40ef3a886a" containerName="cluster-policy-controller" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.693434 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.693452 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="235e9295064844132a05dc40ef3a886a" containerName="kube-controller-manager-recovery-controller" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.693468 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fc4654d-bb99-4a95-83c7-0769baa96461" containerName="installer" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.833887 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c32a96981201f35bdc64ba062620676a-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"c32a96981201f35bdc64ba062620676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.834049 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c32a96981201f35bdc64ba062620676a-cert-dir\") 
pod \"kube-controller-manager-crc\" (UID: \"c32a96981201f35bdc64ba062620676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.864315 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_235e9295064844132a05dc40ef3a886a/kube-controller-manager-cert-syncer/0.log" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.866583 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.869707 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-crc" oldPodUID="235e9295064844132a05dc40ef3a886a" podUID="c32a96981201f35bdc64ba062620676a" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.934946 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c32a96981201f35bdc64ba062620676a-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"c32a96981201f35bdc64ba062620676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.935047 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c32a96981201f35bdc64ba062620676a-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"c32a96981201f35bdc64ba062620676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.935049 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c32a96981201f35bdc64ba062620676a-resource-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"c32a96981201f35bdc64ba062620676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:11 crc kubenswrapper[4790]: I0406 12:08:11.935121 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c32a96981201f35bdc64ba062620676a-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"c32a96981201f35bdc64ba062620676a\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.035788 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-cert-dir\") pod \"235e9295064844132a05dc40ef3a886a\" (UID: \"235e9295064844132a05dc40ef3a886a\") " Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.035877 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-resource-dir\") pod \"235e9295064844132a05dc40ef3a886a\" (UID: \"235e9295064844132a05dc40ef3a886a\") " Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.035973 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "235e9295064844132a05dc40ef3a886a" (UID: "235e9295064844132a05dc40ef3a886a"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.036053 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "235e9295064844132a05dc40ef3a886a" (UID: "235e9295064844132a05dc40ef3a886a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.036166 4790 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-cert-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.036181 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/235e9295064844132a05dc40ef3a886a-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.732031 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_235e9295064844132a05dc40ef3a886a/kube-controller-manager-cert-syncer/0.log" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.734406 4790 generic.go:334] "Generic (PLEG): container finished" podID="235e9295064844132a05dc40ef3a886a" containerID="2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5" exitCode=0 Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.734431 4790 generic.go:334] "Generic (PLEG): container finished" podID="235e9295064844132a05dc40ef3a886a" containerID="99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247" exitCode=2 Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.734438 4790 generic.go:334] "Generic (PLEG): container finished" podID="235e9295064844132a05dc40ef3a886a" containerID="37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c" exitCode=0 Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.734445 4790 generic.go:334] "Generic (PLEG): container finished" podID="235e9295064844132a05dc40ef3a886a" containerID="fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14" exitCode=0 Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.734536 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.734608 4790 scope.go:117] "RemoveContainer" containerID="2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.741241 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-crc" oldPodUID="235e9295064844132a05dc40ef3a886a" podUID="c32a96981201f35bdc64ba062620676a" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.741713 4790 generic.go:334] "Generic (PLEG): container finished" podID="7b10f590-77e2-4257-8c4a-0ef40347ddc9" containerID="d2e04a9e44862dff785782bb2d076adab360972dd651708174bf793e9e1f5bc6" exitCode=0 Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.741751 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-12-crc" event={"ID":"7b10f590-77e2-4257-8c4a-0ef40347ddc9","Type":"ContainerDied","Data":"d2e04a9e44862dff785782bb2d076adab360972dd651708174bf793e9e1f5bc6"} Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.775407 4790 scope.go:117] "RemoveContainer" containerID="99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.775971 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-crc" oldPodUID="235e9295064844132a05dc40ef3a886a" podUID="c32a96981201f35bdc64ba062620676a" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.794008 4790 scope.go:117] "RemoveContainer" containerID="37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.810539 4790 scope.go:117] "RemoveContainer" 
containerID="fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.824655 4790 scope.go:117] "RemoveContainer" containerID="2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5" Apr 06 12:08:12 crc kubenswrapper[4790]: E0406 12:08:12.825303 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5\": container with ID starting with 2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5 not found: ID does not exist" containerID="2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.825363 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5"} err="failed to get container status \"2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5\": rpc error: code = NotFound desc = could not find container \"2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5\": container with ID starting with 2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5 not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.825403 4790 scope.go:117] "RemoveContainer" containerID="99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247" Apr 06 12:08:12 crc kubenswrapper[4790]: E0406 12:08:12.825803 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247\": container with ID starting with 99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247 not found: ID does not exist" containerID="99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247" Apr 06 12:08:12 crc 
kubenswrapper[4790]: I0406 12:08:12.825854 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247"} err="failed to get container status \"99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247\": rpc error: code = NotFound desc = could not find container \"99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247\": container with ID starting with 99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247 not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.825873 4790 scope.go:117] "RemoveContainer" containerID="37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c" Apr 06 12:08:12 crc kubenswrapper[4790]: E0406 12:08:12.826091 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c\": container with ID starting with 37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c not found: ID does not exist" containerID="37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.826119 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c"} err="failed to get container status \"37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c\": rpc error: code = NotFound desc = could not find container \"37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c\": container with ID starting with 37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.826137 4790 scope.go:117] "RemoveContainer" containerID="fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14" Apr 06 
12:08:12 crc kubenswrapper[4790]: E0406 12:08:12.826439 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14\": container with ID starting with fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14 not found: ID does not exist" containerID="fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.826468 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14"} err="failed to get container status \"fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14\": rpc error: code = NotFound desc = could not find container \"fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14\": container with ID starting with fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14 not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.826485 4790 scope.go:117] "RemoveContainer" containerID="2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.826715 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5"} err="failed to get container status \"2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5\": rpc error: code = NotFound desc = could not find container \"2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5\": container with ID starting with 2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5 not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.826748 4790 scope.go:117] "RemoveContainer" 
containerID="99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.826990 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247"} err="failed to get container status \"99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247\": rpc error: code = NotFound desc = could not find container \"99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247\": container with ID starting with 99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247 not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.827019 4790 scope.go:117] "RemoveContainer" containerID="37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.827235 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c"} err="failed to get container status \"37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c\": rpc error: code = NotFound desc = could not find container \"37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c\": container with ID starting with 37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.827303 4790 scope.go:117] "RemoveContainer" containerID="fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.827547 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14"} err="failed to get container status \"fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14\": rpc error: code = NotFound desc = could 
not find container \"fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14\": container with ID starting with fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14 not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.827573 4790 scope.go:117] "RemoveContainer" containerID="2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.827843 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5"} err="failed to get container status \"2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5\": rpc error: code = NotFound desc = could not find container \"2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5\": container with ID starting with 2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5 not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.827869 4790 scope.go:117] "RemoveContainer" containerID="99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.828190 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247"} err="failed to get container status \"99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247\": rpc error: code = NotFound desc = could not find container \"99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247\": container with ID starting with 99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247 not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.828225 4790 scope.go:117] "RemoveContainer" containerID="37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 
12:08:12.828470 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c"} err="failed to get container status \"37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c\": rpc error: code = NotFound desc = could not find container \"37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c\": container with ID starting with 37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.828498 4790 scope.go:117] "RemoveContainer" containerID="fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.828845 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14"} err="failed to get container status \"fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14\": rpc error: code = NotFound desc = could not find container \"fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14\": container with ID starting with fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14 not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.828961 4790 scope.go:117] "RemoveContainer" containerID="2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.829250 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5"} err="failed to get container status \"2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5\": rpc error: code = NotFound desc = could not find container \"2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5\": container with ID starting with 
2d6404c30bee2f2352e7b6b92c13e94b9ea62e518452a5169d89768373c24ae5 not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.829281 4790 scope.go:117] "RemoveContainer" containerID="99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.829494 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247"} err="failed to get container status \"99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247\": rpc error: code = NotFound desc = could not find container \"99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247\": container with ID starting with 99721156a055960df0fd93455fbee5acc656766e21ef84e601ee8cc4d1ce1247 not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.829515 4790 scope.go:117] "RemoveContainer" containerID="37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.829752 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c"} err="failed to get container status \"37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c\": rpc error: code = NotFound desc = could not find container \"37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c\": container with ID starting with 37bb0673e3ae453ce04880e0fe8110e54b1af81b90091b77e390fc97afd6e08c not found: ID does not exist" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.829769 4790 scope.go:117] "RemoveContainer" containerID="fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14" Apr 06 12:08:12 crc kubenswrapper[4790]: I0406 12:08:12.829966 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14"} err="failed to get container status \"fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14\": rpc error: code = NotFound desc = could not find container \"fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14\": container with ID starting with fca03506c364ae461d141606e8205126504b45257bd6663d2121b737ce065a14 not found: ID does not exist" Apr 06 12:08:13 crc kubenswrapper[4790]: I0406 12:08:13.674730 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:13 crc kubenswrapper[4790]: I0406 12:08:13.675908 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:13 crc kubenswrapper[4790]: I0406 12:08:13.688200 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="235e9295064844132a05dc40ef3a886a" path="/var/lib/kubelet/pods/235e9295064844132a05dc40ef3a886a/volumes" Apr 06 12:08:13 crc kubenswrapper[4790]: I0406 12:08:13.973608 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591288-k92zx"] Apr 06 12:08:13 crc kubenswrapper[4790]: W0406 12:08:13.978559 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebfb6550_5136_437e_addd_e7f853434761.slice/crio-56142bad534c992b6249d18a73755cc3e773201790d118419e10f571cc6a2426 WatchSource:0}: Error finding container 56142bad534c992b6249d18a73755cc3e773201790d118419e10f571cc6a2426: Status 404 returned error can't find the container with id 56142bad534c992b6249d18a73755cc3e773201790d118419e10f571cc6a2426 Apr 06 12:08:14 crc kubenswrapper[4790]: I0406 12:08:14.038192 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-12-crc" Apr 06 12:08:14 crc kubenswrapper[4790]: I0406 12:08:14.165345 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b10f590-77e2-4257-8c4a-0ef40347ddc9-kubelet-dir\") pod \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\" (UID: \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\") " Apr 06 12:08:14 crc kubenswrapper[4790]: I0406 12:08:14.165506 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b10f590-77e2-4257-8c4a-0ef40347ddc9-kube-api-access\") pod \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\" (UID: \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\") " Apr 06 12:08:14 crc kubenswrapper[4790]: I0406 12:08:14.165637 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7b10f590-77e2-4257-8c4a-0ef40347ddc9-var-lock\") pod \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\" (UID: \"7b10f590-77e2-4257-8c4a-0ef40347ddc9\") " Apr 06 12:08:14 crc kubenswrapper[4790]: I0406 12:08:14.166199 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b10f590-77e2-4257-8c4a-0ef40347ddc9-var-lock" (OuterVolumeSpecName: "var-lock") pod "7b10f590-77e2-4257-8c4a-0ef40347ddc9" (UID: "7b10f590-77e2-4257-8c4a-0ef40347ddc9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:08:14 crc kubenswrapper[4790]: I0406 12:08:14.166280 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b10f590-77e2-4257-8c4a-0ef40347ddc9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7b10f590-77e2-4257-8c4a-0ef40347ddc9" (UID: "7b10f590-77e2-4257-8c4a-0ef40347ddc9"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:08:14 crc kubenswrapper[4790]: I0406 12:08:14.173754 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b10f590-77e2-4257-8c4a-0ef40347ddc9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7b10f590-77e2-4257-8c4a-0ef40347ddc9" (UID: "7b10f590-77e2-4257-8c4a-0ef40347ddc9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:08:14 crc kubenswrapper[4790]: I0406 12:08:14.267554 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7b10f590-77e2-4257-8c4a-0ef40347ddc9-var-lock\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:14 crc kubenswrapper[4790]: I0406 12:08:14.267583 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b10f590-77e2-4257-8c4a-0ef40347ddc9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:14 crc kubenswrapper[4790]: I0406 12:08:14.267592 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b10f590-77e2-4257-8c4a-0ef40347ddc9-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:14 crc kubenswrapper[4790]: I0406 12:08:14.759605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-12-crc" event={"ID":"7b10f590-77e2-4257-8c4a-0ef40347ddc9","Type":"ContainerDied","Data":"2e0fdba1759fcddf10bf102b4fcf7eb82d797a81c0daa7fb31c970c924b15673"} Apr 06 12:08:14 crc kubenswrapper[4790]: I0406 12:08:14.759959 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e0fdba1759fcddf10bf102b4fcf7eb82d797a81c0daa7fb31c970c924b15673" Apr 06 12:08:14 crc kubenswrapper[4790]: I0406 12:08:14.759641 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-12-crc" Apr 06 12:08:14 crc kubenswrapper[4790]: I0406 12:08:14.761513 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591288-k92zx" event={"ID":"ebfb6550-5136-437e-addd-e7f853434761","Type":"ContainerStarted","Data":"56142bad534c992b6249d18a73755cc3e773201790d118419e10f571cc6a2426"} Apr 06 12:08:15 crc kubenswrapper[4790]: I0406 12:08:15.770414 4790 generic.go:334] "Generic (PLEG): container finished" podID="ebfb6550-5136-437e-addd-e7f853434761" containerID="9456a15313b46c7fb1911b9e91f811cf90ae1425ab434e20d38deab25e3bd0e4" exitCode=0 Apr 06 12:08:15 crc kubenswrapper[4790]: I0406 12:08:15.770453 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591288-k92zx" event={"ID":"ebfb6550-5136-437e-addd-e7f853434761","Type":"ContainerDied","Data":"9456a15313b46c7fb1911b9e91f811cf90ae1425ab434e20d38deab25e3bd0e4"} Apr 06 12:08:16 crc kubenswrapper[4790]: I0406 12:08:16.674475 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:08:16 crc kubenswrapper[4790]: I0406 12:08:16.692393 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3a8e73ed-0fe6-4428-be57-9a2f6167deca" Apr 06 12:08:16 crc kubenswrapper[4790]: I0406 12:08:16.692428 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3a8e73ed-0fe6-4428-be57-9a2f6167deca" Apr 06 12:08:16 crc kubenswrapper[4790]: I0406 12:08:16.703644 4790 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:08:16 crc kubenswrapper[4790]: I0406 12:08:16.703864 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 06 12:08:16 crc kubenswrapper[4790]: I0406 12:08:16.709215 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 06 12:08:16 crc kubenswrapper[4790]: I0406 12:08:16.720366 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:08:16 crc kubenswrapper[4790]: I0406 12:08:16.724983 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Apr 06 12:08:16 crc kubenswrapper[4790]: W0406 12:08:16.739228 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8fd3797d07faa04d98c33c6c96ee09f.slice/crio-efa809609c056ec6c299ae320358f967da9f084caa468ac3357ad58b10ad2e03 WatchSource:0}: Error finding container efa809609c056ec6c299ae320358f967da9f084caa468ac3357ad58b10ad2e03: Status 404 returned error can't find the container with id efa809609c056ec6c299ae320358f967da9f084caa468ac3357ad58b10ad2e03 Apr 06 12:08:16 crc kubenswrapper[4790]: I0406 12:08:16.790430 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"d8fd3797d07faa04d98c33c6c96ee09f","Type":"ContainerStarted","Data":"efa809609c056ec6c299ae320358f967da9f084caa468ac3357ad58b10ad2e03"} Apr 06 12:08:17 crc kubenswrapper[4790]: I0406 12:08:17.027772 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:17 crc kubenswrapper[4790]: I0406 12:08:17.199455 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jjxh\" (UniqueName: \"kubernetes.io/projected/ebfb6550-5136-437e-addd-e7f853434761-kube-api-access-9jjxh\") pod \"ebfb6550-5136-437e-addd-e7f853434761\" (UID: \"ebfb6550-5136-437e-addd-e7f853434761\") " Apr 06 12:08:17 crc kubenswrapper[4790]: I0406 12:08:17.206173 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebfb6550-5136-437e-addd-e7f853434761-kube-api-access-9jjxh" (OuterVolumeSpecName: "kube-api-access-9jjxh") pod "ebfb6550-5136-437e-addd-e7f853434761" (UID: "ebfb6550-5136-437e-addd-e7f853434761"). InnerVolumeSpecName "kube-api-access-9jjxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:08:17 crc kubenswrapper[4790]: I0406 12:08:17.301566 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jjxh\" (UniqueName: \"kubernetes.io/projected/ebfb6550-5136-437e-addd-e7f853434761-kube-api-access-9jjxh\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:17 crc kubenswrapper[4790]: I0406 12:08:17.797503 4790 generic.go:334] "Generic (PLEG): container finished" podID="d8fd3797d07faa04d98c33c6c96ee09f" containerID="18e0b7d16156a444f0e1b9122dac03af0bbb15f3a8167253a06916b439834767" exitCode=0 Apr 06 12:08:17 crc kubenswrapper[4790]: I0406 12:08:17.797553 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"d8fd3797d07faa04d98c33c6c96ee09f","Type":"ContainerDied","Data":"18e0b7d16156a444f0e1b9122dac03af0bbb15f3a8167253a06916b439834767"} Apr 06 12:08:17 crc kubenswrapper[4790]: I0406 12:08:17.800889 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591288-k92zx" 
event={"ID":"ebfb6550-5136-437e-addd-e7f853434761","Type":"ContainerDied","Data":"56142bad534c992b6249d18a73755cc3e773201790d118419e10f571cc6a2426"} Apr 06 12:08:17 crc kubenswrapper[4790]: I0406 12:08:17.800922 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56142bad534c992b6249d18a73755cc3e773201790d118419e10f571cc6a2426" Apr 06 12:08:17 crc kubenswrapper[4790]: I0406 12:08:17.800957 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591288-k92zx" Apr 06 12:08:18 crc kubenswrapper[4790]: I0406 12:08:18.809580 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"d8fd3797d07faa04d98c33c6c96ee09f","Type":"ContainerStarted","Data":"bdf0bb53118c0180b1e07eb60f3eeef289f51aacc5b1c8a1f8e826743d3f95b9"} Apr 06 12:08:18 crc kubenswrapper[4790]: I0406 12:08:18.809997 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:08:18 crc kubenswrapper[4790]: I0406 12:08:18.810013 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"d8fd3797d07faa04d98c33c6c96ee09f","Type":"ContainerStarted","Data":"ab81a3202c3e474d0f5ed67cca7ac747aef1bc6bf52773277d4a9b53f2cba662"} Apr 06 12:08:18 crc kubenswrapper[4790]: I0406 12:08:18.810022 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"d8fd3797d07faa04d98c33c6c96ee09f","Type":"ContainerStarted","Data":"fc16309233f50d0b38d3b62c57d56b8e92ac2ba8b125e41e945518800d1891e5"} Apr 06 12:08:18 crc kubenswrapper[4790]: I0406 12:08:18.824814 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=2.824797467 podStartE2EDuration="2.824797467s" 
podCreationTimestamp="2026-04-06 12:08:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:08:18.821434425 +0000 UTC m=+677.809177301" watchObservedRunningTime="2026-04-06 12:08:18.824797467 +0000 UTC m=+677.812540333" Apr 06 12:08:23 crc kubenswrapper[4790]: I0406 12:08:23.366462 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f4lsb" Apr 06 12:08:25 crc kubenswrapper[4790]: I0406 12:08:25.675525 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:25 crc kubenswrapper[4790]: I0406 12:08:25.694141 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="60e524eb-aafd-4242-a8e9-af1945990db1" Apr 06 12:08:25 crc kubenswrapper[4790]: I0406 12:08:25.694192 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="60e524eb-aafd-4242-a8e9-af1945990db1" Apr 06 12:08:25 crc kubenswrapper[4790]: I0406 12:08:25.722768 4790 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:25 crc kubenswrapper[4790]: I0406 12:08:25.728528 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 06 12:08:25 crc kubenswrapper[4790]: I0406 12:08:25.744222 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:25 crc kubenswrapper[4790]: I0406 12:08:25.746922 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 06 12:08:25 crc kubenswrapper[4790]: I0406 12:08:25.756006 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Apr 06 12:08:25 crc kubenswrapper[4790]: I0406 12:08:25.880924 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"c32a96981201f35bdc64ba062620676a","Type":"ContainerStarted","Data":"a3c451803f540c3623b8acdc791c1142aa3facb081ffa5e2acfbb7001d94a275"} Apr 06 12:08:26 crc kubenswrapper[4790]: I0406 12:08:26.892974 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"c32a96981201f35bdc64ba062620676a","Type":"ContainerStarted","Data":"7e4ad3cde881bce8fe5301e525a449a7e5992d8f1ad5a9db3f6b620473cacce3"} Apr 06 12:08:26 crc kubenswrapper[4790]: I0406 12:08:26.893430 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"c32a96981201f35bdc64ba062620676a","Type":"ContainerStarted","Data":"ac93985a08c5317db777a2f3d7d00aab83a570e3bb7c0513a028eaf8a1cf6e98"} Apr 06 12:08:26 crc kubenswrapper[4790]: I0406 12:08:26.893448 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"c32a96981201f35bdc64ba062620676a","Type":"ContainerStarted","Data":"80d9db9518f59ffc3216332fc3302fd42a71cdb6311602297ed02e15a18c7765"} Apr 06 12:08:26 crc kubenswrapper[4790]: I0406 12:08:26.893459 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"c32a96981201f35bdc64ba062620676a","Type":"ContainerStarted","Data":"5adf08c402cd80394738951403b03779b8303ba45572cdac4893ef112f4ffd56"} Apr 06 12:08:26 crc kubenswrapper[4790]: I0406 12:08:26.921618 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=1.9215935229999999 podStartE2EDuration="1.921593523s" podCreationTimestamp="2026-04-06 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:08:26.91708219 +0000 UTC m=+685.904825096" watchObservedRunningTime="2026-04-06 12:08:26.921593523 +0000 UTC m=+685.909336399" Apr 06 12:08:35 crc kubenswrapper[4790]: I0406 12:08:35.744895 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:35 crc kubenswrapper[4790]: I0406 12:08:35.745404 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:35 crc kubenswrapper[4790]: I0406 12:08:35.745417 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:35 crc kubenswrapper[4790]: I0406 12:08:35.745427 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:35 crc kubenswrapper[4790]: I0406 12:08:35.748841 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:35 crc kubenswrapper[4790]: I0406 12:08:35.752201 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:36 crc 
kubenswrapper[4790]: I0406 12:08:36.968626 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:36 crc kubenswrapper[4790]: I0406 12:08:36.969479 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:08:42 crc kubenswrapper[4790]: I0406 12:08:42.909601 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw"] Apr 06 12:08:42 crc kubenswrapper[4790]: E0406 12:08:42.910132 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b10f590-77e2-4257-8c4a-0ef40347ddc9" containerName="installer" Apr 06 12:08:42 crc kubenswrapper[4790]: I0406 12:08:42.910152 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b10f590-77e2-4257-8c4a-0ef40347ddc9" containerName="installer" Apr 06 12:08:42 crc kubenswrapper[4790]: E0406 12:08:42.910170 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebfb6550-5136-437e-addd-e7f853434761" containerName="oc" Apr 06 12:08:42 crc kubenswrapper[4790]: I0406 12:08:42.910179 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfb6550-5136-437e-addd-e7f853434761" containerName="oc" Apr 06 12:08:42 crc kubenswrapper[4790]: I0406 12:08:42.910301 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebfb6550-5136-437e-addd-e7f853434761" containerName="oc" Apr 06 12:08:42 crc kubenswrapper[4790]: I0406 12:08:42.910319 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b10f590-77e2-4257-8c4a-0ef40347ddc9" containerName="installer" Apr 06 12:08:42 crc kubenswrapper[4790]: I0406 12:08:42.911291 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:42 crc kubenswrapper[4790]: I0406 12:08:42.912680 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Apr 06 12:08:42 crc kubenswrapper[4790]: I0406 12:08:42.926214 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw"] Apr 06 12:08:42 crc kubenswrapper[4790]: I0406 12:08:42.982381 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-util\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw\" (UID: \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:42 crc kubenswrapper[4790]: I0406 12:08:42.982447 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2w4r\" (UniqueName: \"kubernetes.io/projected/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-kube-api-access-f2w4r\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw\" (UID: \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:42 crc kubenswrapper[4790]: I0406 12:08:42.982494 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-bundle\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw\" (UID: \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:43 crc kubenswrapper[4790]: 
I0406 12:08:43.083921 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-bundle\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw\" (UID: \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.084000 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-util\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw\" (UID: \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.084058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2w4r\" (UniqueName: \"kubernetes.io/projected/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-kube-api-access-f2w4r\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw\" (UID: \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.084680 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-bundle\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw\" (UID: \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.084741 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-util\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw\" (UID: \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.100797 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2w4r\" (UniqueName: \"kubernetes.io/projected/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-kube-api-access-f2w4r\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw\" (UID: \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.166402 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.167457 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.167639 4790 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.167936 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver" containerID="cri-o://1386b4a7a4223963a98ff32658dbda33704c080c77a93c86c60859a6e0bf200d" gracePeriod=15 Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.167989 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://2033e0a7c43adde363594c5168fd759013579ed56bcca861cdec532a9b7acf46" gracePeriod=15 Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.168044 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://4cec3d13e8b3ebc7becdbcb4b79e0bddba7c26246573098d12e4c96cdbb4e695" gracePeriod=15 Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.168037 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.168116 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-syncer" containerID="cri-o://47cb04e520560af2626a886e1e08a8850cb01428a1e90ecbfa97476fc01f5a07" gracePeriod=15 Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.167963 
4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-check-endpoints" containerID="cri-o://dd0273b3420682908a04589817853572ca76b405c0bffc2bffef6eb7906e6527" gracePeriod=15 Apr 06 12:08:43 crc kubenswrapper[4790]: E0406 12:08:43.168342 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-regeneration-controller" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.168354 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-regeneration-controller" Apr 06 12:08:43 crc kubenswrapper[4790]: E0406 12:08:43.168364 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="setup" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.168370 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="setup" Apr 06 12:08:43 crc kubenswrapper[4790]: E0406 12:08:43.168381 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-check-endpoints" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.168389 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-check-endpoints" Apr 06 12:08:43 crc kubenswrapper[4790]: E0406 12:08:43.168401 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-insecure-readyz" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.168408 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-insecure-readyz" Apr 06 12:08:43 crc 
kubenswrapper[4790]: E0406 12:08:43.168415 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-syncer" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.168421 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-syncer" Apr 06 12:08:43 crc kubenswrapper[4790]: E0406 12:08:43.168431 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.168437 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.168536 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-syncer" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.168550 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-check-endpoints" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.168558 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.168566 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-insecure-readyz" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.168573 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerName="kube-apiserver-cert-regeneration-controller" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.186226 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.186298 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.186321 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.186345 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.186375 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.186423 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.186449 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.186467 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.209420 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.235635 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.287502 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.287558 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.287577 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.287625 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.287655 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.287672 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.287689 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.287718 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.287802 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.287871 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 
12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.287910 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.287940 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.287985 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.288012 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.288041 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3f04c31653fd2d52d145a959c922a0d3-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3f04c31653fd2d52d145a959c922a0d3\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.288071 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: I0406 12:08:43.505695 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:08:43 crc kubenswrapper[4790]: W0406 12:08:43.528370 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaec8d0ffd277c0e93001246672220ba.slice/crio-0f2ed2ab19113af2845720da6157b2a40e8c35e9a2d04cc7cad6710c21239534 WatchSource:0}: Error finding container 0f2ed2ab19113af2845720da6157b2a40e8c35e9a2d04cc7cad6710c21239534: Status 404 returned error can't find the container with id 0f2ed2ab19113af2845720da6157b2a40e8c35e9a2d04cc7cad6710c21239534 Apr 06 12:08:43 crc kubenswrapper[4790]: E0406 12:08:43.532740 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18a3c33807ddbf9d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:aaec8d0ffd277c0e93001246672220ba,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 
12:08:43.531665309 +0000 UTC m=+702.519408175,LastTimestamp:2026-04-06 12:08:43.531665309 +0000 UTC m=+702.519408175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 12:08:43 crc kubenswrapper[4790]: E0406 12:08:43.844425 4790 log.go:32] "RunPodSandbox from runtime service failed" err=< Apr 06 12:08:43 crc kubenswrapper[4790]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c_0(b901f16b276efd487a5da3505e46143dcbbb0d1b6e4e683a6790305307c59ec2): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b901f16b276efd487a5da3505e46143dcbbb0d1b6e4e683a6790305307c59ec2" Netns:"/var/run/netns/5b8d61b1-db9b-4e3b-9d37-813c8387c1e3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw;K8S_POD_INFRA_CONTAINER_ID=b901f16b276efd487a5da3505e46143dcbbb0d1b6e4e683a6790305307c59ec2;K8S_POD_UID=56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" Path:"" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw] networking: Multus: [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: SetNetworkStatus: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: status update failed for 
pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw?timeout=1m0s": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 12:08:43 crc kubenswrapper[4790]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Apr 06 12:08:43 crc kubenswrapper[4790]: > Apr 06 12:08:43 crc kubenswrapper[4790]: E0406 12:08:43.844734 4790 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Apr 06 12:08:43 crc kubenswrapper[4790]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c_0(b901f16b276efd487a5da3505e46143dcbbb0d1b6e4e683a6790305307c59ec2): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b901f16b276efd487a5da3505e46143dcbbb0d1b6e4e683a6790305307c59ec2" Netns:"/var/run/netns/5b8d61b1-db9b-4e3b-9d37-813c8387c1e3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw;K8S_POD_INFRA_CONTAINER_ID=b901f16b276efd487a5da3505e46143dcbbb0d1b6e4e683a6790305307c59ec2;K8S_POD_UID=56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" Path:"" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw] networking: Multus: 
[openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: SetNetworkStatus: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw?timeout=1m0s": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 12:08:43 crc kubenswrapper[4790]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Apr 06 12:08:43 crc kubenswrapper[4790]: > pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:43 crc kubenswrapper[4790]: E0406 12:08:43.844754 4790 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Apr 06 12:08:43 crc kubenswrapper[4790]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c_0(b901f16b276efd487a5da3505e46143dcbbb0d1b6e4e683a6790305307c59ec2): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"b901f16b276efd487a5da3505e46143dcbbb0d1b6e4e683a6790305307c59ec2" Netns:"/var/run/netns/5b8d61b1-db9b-4e3b-9d37-813c8387c1e3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw;K8S_POD_INFRA_CONTAINER_ID=b901f16b276efd487a5da3505e46143dcbbb0d1b6e4e683a6790305307c59ec2;K8S_POD_UID=56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" Path:"" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw] networking: Multus: [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: SetNetworkStatus: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw?timeout=1m0s": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 12:08:43 crc kubenswrapper[4790]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Apr 06 12:08:43 crc kubenswrapper[4790]: > pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:43 crc kubenswrapper[4790]: E0406 12:08:43.844816 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace(56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace(56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c_0(b901f16b276efd487a5da3505e46143dcbbb0d1b6e4e683a6790305307c59ec2): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"b901f16b276efd487a5da3505e46143dcbbb0d1b6e4e683a6790305307c59ec2\\\" Netns:\\\"/var/run/netns/5b8d61b1-db9b-4e3b-9d37-813c8387c1e3\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw;K8S_POD_INFRA_CONTAINER_ID=b901f16b276efd487a5da3505e46143dcbbb0d1b6e4e683a6790305307c59ec2;K8S_POD_UID=56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw] networking: Multus: [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: SetNetworkStatus: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: status update failed for pod /: Get 
\\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw?timeout=1m0s\\\": dial tcp 38.102.83.146:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" podUID="56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.009354 4790 generic.go:334] "Generic (PLEG): container finished" podID="ae18ba0c-922e-43b1-8a32-85e498925b20" containerID="c8df8fbe7a1f578b66ea6ed11ca12c75a5be5a651b8fb283ee808bb04a0a6c41" exitCode=0 Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.009444 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-11-crc" event={"ID":"ae18ba0c-922e-43b1-8a32-85e498925b20","Type":"ContainerDied","Data":"c8df8fbe7a1f578b66ea6ed11ca12c75a5be5a651b8fb283ee808bb04a0a6c41"} Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.010176 4790 status_manager.go:851] "Failed to get status for pod" podUID="aaec8d0ffd277c0e93001246672220ba" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.010545 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.014267 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_4e6039c7a12c5a0c0ef5917dc7ee5582/kube-apiserver-cert-syncer/0.log" Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.015007 4790 generic.go:334] "Generic (PLEG): container finished" podID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerID="dd0273b3420682908a04589817853572ca76b405c0bffc2bffef6eb7906e6527" exitCode=0 Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.015030 4790 generic.go:334] "Generic (PLEG): container finished" podID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerID="4cec3d13e8b3ebc7becdbcb4b79e0bddba7c26246573098d12e4c96cdbb4e695" exitCode=0 Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.015039 4790 generic.go:334] "Generic (PLEG): container finished" podID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerID="2033e0a7c43adde363594c5168fd759013579ed56bcca861cdec532a9b7acf46" exitCode=0 Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.015046 4790 generic.go:334] "Generic (PLEG): container finished" podID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerID="47cb04e520560af2626a886e1e08a8850cb01428a1e90ecbfa97476fc01f5a07" exitCode=2 Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.017711 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"aaec8d0ffd277c0e93001246672220ba","Type":"ContainerStarted","Data":"1c7d7cd2d4fac7c93d3413ae3e517975ee62020f57c1c145839b2f5daf2f8ccc"} Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.017762 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"aaec8d0ffd277c0e93001246672220ba","Type":"ContainerStarted","Data":"0f2ed2ab19113af2845720da6157b2a40e8c35e9a2d04cc7cad6710c21239534"} Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.017724 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.018417 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.018427 4790 status_manager.go:851] "Failed to get status for pod" podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:44 crc kubenswrapper[4790]: I0406 12:08:44.018979 4790 status_manager.go:851] "Failed to get status for pod" podUID="aaec8d0ffd277c0e93001246672220ba" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:44 crc kubenswrapper[4790]: E0406 12:08:44.327619 4790 log.go:32] "RunPodSandbox from runtime service failed" err=< Apr 06 12:08:44 crc kubenswrapper[4790]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c_0(1b4a21dceb2e3e881855dfb52ed1b68e18cc6459f6ebec39bfb20f8e09b08e8a): error adding pod 
openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1b4a21dceb2e3e881855dfb52ed1b68e18cc6459f6ebec39bfb20f8e09b08e8a" Netns:"/var/run/netns/26e32f4d-25e6-4ba6-a930-cc7ffb21faa5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw;K8S_POD_INFRA_CONTAINER_ID=1b4a21dceb2e3e881855dfb52ed1b68e18cc6459f6ebec39bfb20f8e09b08e8a;K8S_POD_UID=56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" Path:"" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw] networking: Multus: [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: SetNetworkStatus: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw?timeout=1m0s": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 12:08:44 crc kubenswrapper[4790]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Apr 06 12:08:44 crc kubenswrapper[4790]: > Apr 06 12:08:44 crc 
kubenswrapper[4790]: E0406 12:08:44.328020 4790 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Apr 06 12:08:44 crc kubenswrapper[4790]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c_0(1b4a21dceb2e3e881855dfb52ed1b68e18cc6459f6ebec39bfb20f8e09b08e8a): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1b4a21dceb2e3e881855dfb52ed1b68e18cc6459f6ebec39bfb20f8e09b08e8a" Netns:"/var/run/netns/26e32f4d-25e6-4ba6-a930-cc7ffb21faa5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw;K8S_POD_INFRA_CONTAINER_ID=1b4a21dceb2e3e881855dfb52ed1b68e18cc6459f6ebec39bfb20f8e09b08e8a;K8S_POD_UID=56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" Path:"" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw] networking: Multus: [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: SetNetworkStatus: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw?timeout=1m0s": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 12:08:44 crc kubenswrapper[4790]: ': 
StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Apr 06 12:08:44 crc kubenswrapper[4790]: > pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:44 crc kubenswrapper[4790]: E0406 12:08:44.328130 4790 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Apr 06 12:08:44 crc kubenswrapper[4790]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c_0(1b4a21dceb2e3e881855dfb52ed1b68e18cc6459f6ebec39bfb20f8e09b08e8a): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1b4a21dceb2e3e881855dfb52ed1b68e18cc6459f6ebec39bfb20f8e09b08e8a" Netns:"/var/run/netns/26e32f4d-25e6-4ba6-a930-cc7ffb21faa5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw;K8S_POD_INFRA_CONTAINER_ID=1b4a21dceb2e3e881855dfb52ed1b68e18cc6459f6ebec39bfb20f8e09b08e8a;K8S_POD_UID=56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" Path:"" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw] networking: Multus: [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to 
update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: SetNetworkStatus: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw?timeout=1m0s": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 12:08:44 crc kubenswrapper[4790]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Apr 06 12:08:44 crc kubenswrapper[4790]: > pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:44 crc kubenswrapper[4790]: E0406 12:08:44.328234 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace(56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace(56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c_0(1b4a21dceb2e3e881855dfb52ed1b68e18cc6459f6ebec39bfb20f8e09b08e8a): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" 
name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"1b4a21dceb2e3e881855dfb52ed1b68e18cc6459f6ebec39bfb20f8e09b08e8a\\\" Netns:\\\"/var/run/netns/26e32f4d-25e6-4ba6-a930-cc7ffb21faa5\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw;K8S_POD_INFRA_CONTAINER_ID=1b4a21dceb2e3e881855dfb52ed1b68e18cc6459f6ebec39bfb20f8e09b08e8a;K8S_POD_UID=56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw] networking: Multus: [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: SetNetworkStatus: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw?timeout=1m0s\\\": dial tcp 38.102.83.146:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" 
podUID="56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.225173 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-11-crc" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.225719 4790 status_manager.go:851] "Failed to get status for pod" podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.226098 4790 status_manager.go:851] "Failed to get status for pod" podUID="aaec8d0ffd277c0e93001246672220ba" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.310216 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae18ba0c-922e-43b1-8a32-85e498925b20-var-lock\") pod \"ae18ba0c-922e-43b1-8a32-85e498925b20\" (UID: \"ae18ba0c-922e-43b1-8a32-85e498925b20\") " Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.310617 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae18ba0c-922e-43b1-8a32-85e498925b20-kube-api-access\") pod \"ae18ba0c-922e-43b1-8a32-85e498925b20\" (UID: \"ae18ba0c-922e-43b1-8a32-85e498925b20\") " Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.310613 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae18ba0c-922e-43b1-8a32-85e498925b20-var-lock" (OuterVolumeSpecName: "var-lock") pod 
"ae18ba0c-922e-43b1-8a32-85e498925b20" (UID: "ae18ba0c-922e-43b1-8a32-85e498925b20"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.310663 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae18ba0c-922e-43b1-8a32-85e498925b20-kubelet-dir\") pod \"ae18ba0c-922e-43b1-8a32-85e498925b20\" (UID: \"ae18ba0c-922e-43b1-8a32-85e498925b20\") " Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.310799 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae18ba0c-922e-43b1-8a32-85e498925b20-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ae18ba0c-922e-43b1-8a32-85e498925b20" (UID: "ae18ba0c-922e-43b1-8a32-85e498925b20"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.310998 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae18ba0c-922e-43b1-8a32-85e498925b20-var-lock\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.311021 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae18ba0c-922e-43b1-8a32-85e498925b20-kubelet-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.317538 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae18ba0c-922e-43b1-8a32-85e498925b20-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ae18ba0c-922e-43b1-8a32-85e498925b20" (UID: "ae18ba0c-922e-43b1-8a32-85e498925b20"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.413716 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae18ba0c-922e-43b1-8a32-85e498925b20-kube-api-access\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.582404 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_4e6039c7a12c5a0c0ef5917dc7ee5582/kube-apiserver-cert-syncer/0.log" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.583466 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.584326 4790 status_manager.go:851] "Failed to get status for pod" podUID="aaec8d0ffd277c0e93001246672220ba" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.584635 4790 status_manager.go:851] "Failed to get status for pod" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.584983 4790 status_manager.go:851] "Failed to get status for pod" podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 
12:08:45.615707 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-resource-dir\") pod \"4e6039c7a12c5a0c0ef5917dc7ee5582\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.615759 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-audit-dir\") pod \"4e6039c7a12c5a0c0ef5917dc7ee5582\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.615781 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "4e6039c7a12c5a0c0ef5917dc7ee5582" (UID: "4e6039c7a12c5a0c0ef5917dc7ee5582"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.615846 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4e6039c7a12c5a0c0ef5917dc7ee5582" (UID: "4e6039c7a12c5a0c0ef5917dc7ee5582"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.615823 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-cert-dir\") pod \"4e6039c7a12c5a0c0ef5917dc7ee5582\" (UID: \"4e6039c7a12c5a0c0ef5917dc7ee5582\") " Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.615934 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "4e6039c7a12c5a0c0ef5917dc7ee5582" (UID: "4e6039c7a12c5a0c0ef5917dc7ee5582"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.616199 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.616220 4790 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-audit-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.616228 4790 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4e6039c7a12c5a0c0ef5917dc7ee5582-cert-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:08:45 crc kubenswrapper[4790]: I0406 12:08:45.686938 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" path="/var/lib/kubelet/pods/4e6039c7a12c5a0c0ef5917dc7ee5582/volumes" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.034356 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-11-crc" 
event={"ID":"ae18ba0c-922e-43b1-8a32-85e498925b20","Type":"ContainerDied","Data":"e652b7b86dfca95743e576756eb4d7dc71fe6a09a617c625a12bb887a9c265db"} Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.034669 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e652b7b86dfca95743e576756eb4d7dc71fe6a09a617c625a12bb887a9c265db" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.034435 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-11-crc" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.039660 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_4e6039c7a12c5a0c0ef5917dc7ee5582/kube-apiserver-cert-syncer/0.log" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.041059 4790 generic.go:334] "Generic (PLEG): container finished" podID="4e6039c7a12c5a0c0ef5917dc7ee5582" containerID="1386b4a7a4223963a98ff32658dbda33704c080c77a93c86c60859a6e0bf200d" exitCode=0 Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.041184 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.041207 4790 scope.go:117] "RemoveContainer" containerID="dd0273b3420682908a04589817853572ca76b405c0bffc2bffef6eb7906e6527" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.041536 4790 status_manager.go:851] "Failed to get status for pod" podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.042127 4790 status_manager.go:851] "Failed to get status for pod" podUID="aaec8d0ffd277c0e93001246672220ba" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.042987 4790 status_manager.go:851] "Failed to get status for pod" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.043419 4790 status_manager.go:851] "Failed to get status for pod" podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.044544 4790 status_manager.go:851] "Failed to get status for pod" podUID="aaec8d0ffd277c0e93001246672220ba" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.046254 4790 status_manager.go:851] "Failed to get status for pod" podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.046762 4790 status_manager.go:851] "Failed to get status for pod" podUID="aaec8d0ffd277c0e93001246672220ba" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.047315 4790 status_manager.go:851] "Failed to get status for pod" podUID="4e6039c7a12c5a0c0ef5917dc7ee5582" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.064528 4790 scope.go:117] "RemoveContainer" containerID="4cec3d13e8b3ebc7becdbcb4b79e0bddba7c26246573098d12e4c96cdbb4e695" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.084374 4790 scope.go:117] "RemoveContainer" containerID="2033e0a7c43adde363594c5168fd759013579ed56bcca861cdec532a9b7acf46" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.104696 4790 scope.go:117] "RemoveContainer" containerID="47cb04e520560af2626a886e1e08a8850cb01428a1e90ecbfa97476fc01f5a07" Apr 06 12:08:46 
crc kubenswrapper[4790]: I0406 12:08:46.134570 4790 scope.go:117] "RemoveContainer" containerID="1386b4a7a4223963a98ff32658dbda33704c080c77a93c86c60859a6e0bf200d" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.166341 4790 scope.go:117] "RemoveContainer" containerID="8354ab4eb427f5f09234d4a718180d4f3284c8916b048b16c810cd384e2ffab0" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.191614 4790 scope.go:117] "RemoveContainer" containerID="dd0273b3420682908a04589817853572ca76b405c0bffc2bffef6eb7906e6527" Apr 06 12:08:46 crc kubenswrapper[4790]: E0406 12:08:46.192132 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd0273b3420682908a04589817853572ca76b405c0bffc2bffef6eb7906e6527\": container with ID starting with dd0273b3420682908a04589817853572ca76b405c0bffc2bffef6eb7906e6527 not found: ID does not exist" containerID="dd0273b3420682908a04589817853572ca76b405c0bffc2bffef6eb7906e6527" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.192191 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd0273b3420682908a04589817853572ca76b405c0bffc2bffef6eb7906e6527"} err="failed to get container status \"dd0273b3420682908a04589817853572ca76b405c0bffc2bffef6eb7906e6527\": rpc error: code = NotFound desc = could not find container \"dd0273b3420682908a04589817853572ca76b405c0bffc2bffef6eb7906e6527\": container with ID starting with dd0273b3420682908a04589817853572ca76b405c0bffc2bffef6eb7906e6527 not found: ID does not exist" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.192220 4790 scope.go:117] "RemoveContainer" containerID="4cec3d13e8b3ebc7becdbcb4b79e0bddba7c26246573098d12e4c96cdbb4e695" Apr 06 12:08:46 crc kubenswrapper[4790]: E0406 12:08:46.193097 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4cec3d13e8b3ebc7becdbcb4b79e0bddba7c26246573098d12e4c96cdbb4e695\": container with ID starting with 4cec3d13e8b3ebc7becdbcb4b79e0bddba7c26246573098d12e4c96cdbb4e695 not found: ID does not exist" containerID="4cec3d13e8b3ebc7becdbcb4b79e0bddba7c26246573098d12e4c96cdbb4e695" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.193155 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cec3d13e8b3ebc7becdbcb4b79e0bddba7c26246573098d12e4c96cdbb4e695"} err="failed to get container status \"4cec3d13e8b3ebc7becdbcb4b79e0bddba7c26246573098d12e4c96cdbb4e695\": rpc error: code = NotFound desc = could not find container \"4cec3d13e8b3ebc7becdbcb4b79e0bddba7c26246573098d12e4c96cdbb4e695\": container with ID starting with 4cec3d13e8b3ebc7becdbcb4b79e0bddba7c26246573098d12e4c96cdbb4e695 not found: ID does not exist" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.193192 4790 scope.go:117] "RemoveContainer" containerID="2033e0a7c43adde363594c5168fd759013579ed56bcca861cdec532a9b7acf46" Apr 06 12:08:46 crc kubenswrapper[4790]: E0406 12:08:46.193534 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2033e0a7c43adde363594c5168fd759013579ed56bcca861cdec532a9b7acf46\": container with ID starting with 2033e0a7c43adde363594c5168fd759013579ed56bcca861cdec532a9b7acf46 not found: ID does not exist" containerID="2033e0a7c43adde363594c5168fd759013579ed56bcca861cdec532a9b7acf46" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.193654 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2033e0a7c43adde363594c5168fd759013579ed56bcca861cdec532a9b7acf46"} err="failed to get container status \"2033e0a7c43adde363594c5168fd759013579ed56bcca861cdec532a9b7acf46\": rpc error: code = NotFound desc = could not find container \"2033e0a7c43adde363594c5168fd759013579ed56bcca861cdec532a9b7acf46\": container with ID 
starting with 2033e0a7c43adde363594c5168fd759013579ed56bcca861cdec532a9b7acf46 not found: ID does not exist" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.193673 4790 scope.go:117] "RemoveContainer" containerID="47cb04e520560af2626a886e1e08a8850cb01428a1e90ecbfa97476fc01f5a07" Apr 06 12:08:46 crc kubenswrapper[4790]: E0406 12:08:46.193993 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47cb04e520560af2626a886e1e08a8850cb01428a1e90ecbfa97476fc01f5a07\": container with ID starting with 47cb04e520560af2626a886e1e08a8850cb01428a1e90ecbfa97476fc01f5a07 not found: ID does not exist" containerID="47cb04e520560af2626a886e1e08a8850cb01428a1e90ecbfa97476fc01f5a07" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.194049 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47cb04e520560af2626a886e1e08a8850cb01428a1e90ecbfa97476fc01f5a07"} err="failed to get container status \"47cb04e520560af2626a886e1e08a8850cb01428a1e90ecbfa97476fc01f5a07\": rpc error: code = NotFound desc = could not find container \"47cb04e520560af2626a886e1e08a8850cb01428a1e90ecbfa97476fc01f5a07\": container with ID starting with 47cb04e520560af2626a886e1e08a8850cb01428a1e90ecbfa97476fc01f5a07 not found: ID does not exist" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.194065 4790 scope.go:117] "RemoveContainer" containerID="1386b4a7a4223963a98ff32658dbda33704c080c77a93c86c60859a6e0bf200d" Apr 06 12:08:46 crc kubenswrapper[4790]: E0406 12:08:46.194401 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1386b4a7a4223963a98ff32658dbda33704c080c77a93c86c60859a6e0bf200d\": container with ID starting with 1386b4a7a4223963a98ff32658dbda33704c080c77a93c86c60859a6e0bf200d not found: ID does not exist" containerID="1386b4a7a4223963a98ff32658dbda33704c080c77a93c86c60859a6e0bf200d" Apr 06 
12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.194420 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1386b4a7a4223963a98ff32658dbda33704c080c77a93c86c60859a6e0bf200d"} err="failed to get container status \"1386b4a7a4223963a98ff32658dbda33704c080c77a93c86c60859a6e0bf200d\": rpc error: code = NotFound desc = could not find container \"1386b4a7a4223963a98ff32658dbda33704c080c77a93c86c60859a6e0bf200d\": container with ID starting with 1386b4a7a4223963a98ff32658dbda33704c080c77a93c86c60859a6e0bf200d not found: ID does not exist" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.194434 4790 scope.go:117] "RemoveContainer" containerID="8354ab4eb427f5f09234d4a718180d4f3284c8916b048b16c810cd384e2ffab0" Apr 06 12:08:46 crc kubenswrapper[4790]: E0406 12:08:46.194764 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8354ab4eb427f5f09234d4a718180d4f3284c8916b048b16c810cd384e2ffab0\": container with ID starting with 8354ab4eb427f5f09234d4a718180d4f3284c8916b048b16c810cd384e2ffab0 not found: ID does not exist" containerID="8354ab4eb427f5f09234d4a718180d4f3284c8916b048b16c810cd384e2ffab0" Apr 06 12:08:46 crc kubenswrapper[4790]: I0406 12:08:46.194793 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8354ab4eb427f5f09234d4a718180d4f3284c8916b048b16c810cd384e2ffab0"} err="failed to get container status \"8354ab4eb427f5f09234d4a718180d4f3284c8916b048b16c810cd384e2ffab0\": rpc error: code = NotFound desc = could not find container \"8354ab4eb427f5f09234d4a718180d4f3284c8916b048b16c810cd384e2ffab0\": container with ID starting with 8354ab4eb427f5f09234d4a718180d4f3284c8916b048b16c810cd384e2ffab0 not found: ID does not exist" Apr 06 12:08:47 crc kubenswrapper[4790]: E0406 12:08:47.585711 4790 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:47 crc kubenswrapper[4790]: E0406 12:08:47.586124 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:47 crc kubenswrapper[4790]: E0406 12:08:47.586530 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:47 crc kubenswrapper[4790]: E0406 12:08:47.586962 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:47 crc kubenswrapper[4790]: E0406 12:08:47.587330 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:47 crc kubenswrapper[4790]: I0406 12:08:47.587381 4790 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Apr 06 12:08:47 crc kubenswrapper[4790]: E0406 12:08:47.587699 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="200ms" Apr 06 12:08:47 crc kubenswrapper[4790]: E0406 12:08:47.788731 4790 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="400ms" Apr 06 12:08:48 crc kubenswrapper[4790]: E0406 12:08:48.189756 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="800ms" Apr 06 12:08:48 crc kubenswrapper[4790]: E0406 12:08:48.991279 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="1.6s" Apr 06 12:08:50 crc kubenswrapper[4790]: E0406 12:08:50.550291 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18a3c33807ddbf9d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:aaec8d0ffd277c0e93001246672220ba,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-04-06 12:08:43.531665309 +0000 UTC m=+702.519408175,LastTimestamp:2026-04-06 12:08:43.531665309 +0000 UTC m=+702.519408175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Apr 06 12:08:50 crc kubenswrapper[4790]: E0406 12:08:50.592512 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="3.2s" Apr 06 12:08:51 crc kubenswrapper[4790]: I0406 12:08:51.678094 4790 status_manager.go:851] "Failed to get status for pod" podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:51 crc kubenswrapper[4790]: I0406 12:08:51.678389 4790 status_manager.go:851] "Failed to get status for pod" podUID="aaec8d0ffd277c0e93001246672220ba" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:53 crc kubenswrapper[4790]: E0406 12:08:53.793766 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="6.4s" Apr 06 12:08:54 crc kubenswrapper[4790]: E0406 12:08:54.469008 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T12:08:54Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T12:08:54Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T12:08:54Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-04-06T12:08:54Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:54 crc kubenswrapper[4790]: E0406 12:08:54.469402 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:54 crc kubenswrapper[4790]: E0406 12:08:54.469657 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:54 crc kubenswrapper[4790]: E0406 12:08:54.469957 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 
12:08:54 crc kubenswrapper[4790]: E0406 12:08:54.470197 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:54 crc kubenswrapper[4790]: E0406 12:08:54.470216 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Apr 06 12:08:55 crc kubenswrapper[4790]: I0406 12:08:55.675376 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:55 crc kubenswrapper[4790]: I0406 12:08:55.675382 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:55 crc kubenswrapper[4790]: I0406 12:08:55.676271 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:55 crc kubenswrapper[4790]: I0406 12:08:55.677260 4790 status_manager.go:851] "Failed to get status for pod" podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:55 crc kubenswrapper[4790]: I0406 12:08:55.677875 4790 status_manager.go:851] "Failed to get status for pod" podUID="aaec8d0ffd277c0e93001246672220ba" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:55 crc kubenswrapper[4790]: I0406 12:08:55.755121 4790 
kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:08:55 crc kubenswrapper[4790]: I0406 12:08:55.755580 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:08:55 crc kubenswrapper[4790]: E0406 12:08:55.755940 4790 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:55 crc kubenswrapper[4790]: I0406 12:08:55.756421 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:56 crc kubenswrapper[4790]: E0406 12:08:56.061076 4790 log.go:32] "RunPodSandbox from runtime service failed" err=< Apr 06 12:08:56 crc kubenswrapper[4790]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c_0(a8f4d440c71f5adee459e4d4815af81a6b334b8d3b6efa8440c15ba549ae19a5): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a8f4d440c71f5adee459e4d4815af81a6b334b8d3b6efa8440c15ba549ae19a5" Netns:"/var/run/netns/fb64e4c3-ec70-4af5-8c20-c528375a6297" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw;K8S_POD_INFRA_CONTAINER_ID=a8f4d440c71f5adee459e4d4815af81a6b334b8d3b6efa8440c15ba549ae19a5;K8S_POD_UID=56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" Path:"" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw] networking: Multus: [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: SetNetworkStatus: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw?timeout=1m0s": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 12:08:56 crc kubenswrapper[4790]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Apr 06 12:08:56 crc kubenswrapper[4790]: > Apr 06 12:08:56 crc kubenswrapper[4790]: E0406 12:08:56.061668 4790 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Apr 06 12:08:56 crc kubenswrapper[4790]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c_0(a8f4d440c71f5adee459e4d4815af81a6b334b8d3b6efa8440c15ba549ae19a5): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a8f4d440c71f5adee459e4d4815af81a6b334b8d3b6efa8440c15ba549ae19a5" Netns:"/var/run/netns/fb64e4c3-ec70-4af5-8c20-c528375a6297" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw;K8S_POD_INFRA_CONTAINER_ID=a8f4d440c71f5adee459e4d4815af81a6b334b8d3b6efa8440c15ba549ae19a5;K8S_POD_UID=56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" Path:"" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw] networking: Multus: [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: SetNetworkStatus: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw?timeout=1m0s": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 12:08:56 crc kubenswrapper[4790]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Apr 06 12:08:56 crc kubenswrapper[4790]: > pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:56 crc kubenswrapper[4790]: E0406 12:08:56.061719 4790 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Apr 06 12:08:56 crc kubenswrapper[4790]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c_0(a8f4d440c71f5adee459e4d4815af81a6b334b8d3b6efa8440c15ba549ae19a5): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a8f4d440c71f5adee459e4d4815af81a6b334b8d3b6efa8440c15ba549ae19a5" Netns:"/var/run/netns/fb64e4c3-ec70-4af5-8c20-c528375a6297" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw;K8S_POD_INFRA_CONTAINER_ID=a8f4d440c71f5adee459e4d4815af81a6b334b8d3b6efa8440c15ba549ae19a5;K8S_POD_UID=56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" Path:"" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw] networking: Multus: [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the 
pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: SetNetworkStatus: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw?timeout=1m0s": dial tcp 38.102.83.146:6443: connect: connection refused Apr 06 12:08:56 crc kubenswrapper[4790]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Apr 06 12:08:56 crc kubenswrapper[4790]: > pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:08:56 crc kubenswrapper[4790]: E0406 12:08:56.061791 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace(56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace(56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_openshift-marketplace_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c_0(a8f4d440c71f5adee459e4d4815af81a6b334b8d3b6efa8440c15ba549ae19a5): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed 
(add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"a8f4d440c71f5adee459e4d4815af81a6b334b8d3b6efa8440c15ba549ae19a5\\\" Netns:\\\"/var/run/netns/fb64e4c3-ec70-4af5-8c20-c528375a6297\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw;K8S_POD_INFRA_CONTAINER_ID=a8f4d440c71f5adee459e4d4815af81a6b334b8d3b6efa8440c15ba549ae19a5;K8S_POD_UID=56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw] networking: Multus: [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: SetNetworkStatus: failed to update the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw?timeout=1m0s\\\": dial tcp 38.102.83.146:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" podUID="56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" Apr 
06 12:08:56 crc kubenswrapper[4790]: I0406 12:08:56.118632 4790 generic.go:334] "Generic (PLEG): container finished" podID="3f04c31653fd2d52d145a959c922a0d3" containerID="de0adc85463b2811796ea4f6d89fc2b9f09a13e4d0e9ca483b07777d4257cbcb" exitCode=0 Apr 06 12:08:56 crc kubenswrapper[4790]: I0406 12:08:56.118677 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerDied","Data":"de0adc85463b2811796ea4f6d89fc2b9f09a13e4d0e9ca483b07777d4257cbcb"} Apr 06 12:08:56 crc kubenswrapper[4790]: I0406 12:08:56.118703 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerStarted","Data":"47c794fa5ec020ea830f1dc987b7a90a1295b1c4719a5e6ed9d1cc23a1cabe58"} Apr 06 12:08:56 crc kubenswrapper[4790]: I0406 12:08:56.118928 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:08:56 crc kubenswrapper[4790]: I0406 12:08:56.118939 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:08:56 crc kubenswrapper[4790]: E0406 12:08:56.119212 4790 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:56 crc kubenswrapper[4790]: I0406 12:08:56.119251 4790 status_manager.go:851] "Failed to get status for pod" podUID="aaec8d0ffd277c0e93001246672220ba" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": 
dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:56 crc kubenswrapper[4790]: I0406 12:08:56.119478 4790 status_manager.go:851] "Failed to get status for pod" podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" pod="openshift-kube-apiserver/installer-11-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-11-crc\": dial tcp 38.102.83.146:6443: connect: connection refused" Apr 06 12:08:57 crc kubenswrapper[4790]: I0406 12:08:57.130885 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerStarted","Data":"ceaad5b9021cb9b62a3710a5a33051a251bb310fdadcbb0ff03894f43f1a844c"} Apr 06 12:08:57 crc kubenswrapper[4790]: I0406 12:08:57.131543 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerStarted","Data":"91ae64bafc772688362cab69822c251f5563bd83c8d00cc444e128f8bb83a769"} Apr 06 12:08:57 crc kubenswrapper[4790]: I0406 12:08:57.131560 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerStarted","Data":"30fcf7b48d60536743440787a0c551571ea5c420142a3d35dbb52397b6767cfd"} Apr 06 12:08:57 crc kubenswrapper[4790]: I0406 12:08:57.131573 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerStarted","Data":"c8995907f3fa4510a25c29ece96a0603dcb7011f04d068660c52196029cdc9d0"} Apr 06 12:08:58 crc kubenswrapper[4790]: I0406 12:08:58.140695 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_c32a96981201f35bdc64ba062620676a/kube-controller-manager/0.log" Apr 06 12:08:58 crc 
kubenswrapper[4790]: I0406 12:08:58.140752 4790 generic.go:334] "Generic (PLEG): container finished" podID="c32a96981201f35bdc64ba062620676a" containerID="5adf08c402cd80394738951403b03779b8303ba45572cdac4893ef112f4ffd56" exitCode=1 Apr 06 12:08:58 crc kubenswrapper[4790]: I0406 12:08:58.140814 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"c32a96981201f35bdc64ba062620676a","Type":"ContainerDied","Data":"5adf08c402cd80394738951403b03779b8303ba45572cdac4893ef112f4ffd56"} Apr 06 12:08:58 crc kubenswrapper[4790]: I0406 12:08:58.141358 4790 scope.go:117] "RemoveContainer" containerID="5adf08c402cd80394738951403b03779b8303ba45572cdac4893ef112f4ffd56" Apr 06 12:08:58 crc kubenswrapper[4790]: I0406 12:08:58.146091 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerStarted","Data":"f87355fe4942c61cbf824e03627f74e7c4e65141b531adc3de7bb8fc2dc28749"} Apr 06 12:08:58 crc kubenswrapper[4790]: I0406 12:08:58.146258 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:08:58 crc kubenswrapper[4790]: I0406 12:08:58.146314 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:08:58 crc kubenswrapper[4790]: I0406 12:08:58.146332 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:08:59 crc kubenswrapper[4790]: I0406 12:08:59.155539 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_c32a96981201f35bdc64ba062620676a/kube-controller-manager/0.log" Apr 06 12:08:59 crc kubenswrapper[4790]: I0406 12:08:59.155887 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"c32a96981201f35bdc64ba062620676a","Type":"ContainerStarted","Data":"357b66d03ffd7bcc3b123954e86a7745cf42220b041c9077e51573cd82b8d8ec"} Apr 06 12:09:00 crc kubenswrapper[4790]: I0406 12:09:00.756601 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:09:00 crc kubenswrapper[4790]: I0406 12:09:00.756882 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:09:00 crc kubenswrapper[4790]: I0406 12:09:00.762575 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:09:02 crc kubenswrapper[4790]: I0406 12:09:02.355224 4790 scope.go:117] "RemoveContainer" containerID="642a8714da2ce927c59666c6a55e69c883f9283af786c866dcf1219b7c295bf8" Apr 06 12:09:02 crc kubenswrapper[4790]: I0406 12:09:02.376857 4790 scope.go:117] "RemoveContainer" containerID="efb99e3c83fa8b086e34268cf2ed4f95342b9e349fd28152e399c332bc18f5ff" Apr 06 12:09:02 crc kubenswrapper[4790]: I0406 12:09:02.400491 4790 scope.go:117] "RemoveContainer" containerID="504c30baf03eb53a493e674554abcc5229db2f06840a0877d379361429a5cde2" Apr 06 12:09:02 crc kubenswrapper[4790]: I0406 12:09:02.429195 4790 scope.go:117] "RemoveContainer" containerID="2a0bda650d85589544b66809493d1f094798ba4128af4572a641b923de6a1128" Apr 06 12:09:03 crc kubenswrapper[4790]: I0406 12:09:03.153844 4790 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:09:03 crc kubenswrapper[4790]: I0406 12:09:03.192118 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:09:03 crc kubenswrapper[4790]: I0406 12:09:03.192144 4790 mirror_client.go:130] 
"Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:09:03 crc kubenswrapper[4790]: I0406 12:09:03.199015 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:09:03 crc kubenswrapper[4790]: I0406 12:09:03.218618 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="3f04c31653fd2d52d145a959c922a0d3" podUID="c25576a8-16f1-49d2-8b94-6e44ef17dab2" Apr 06 12:09:04 crc kubenswrapper[4790]: I0406 12:09:04.199141 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3f04c31653fd2d52d145a959c922a0d3/kube-apiserver-check-endpoints/0.log" Apr 06 12:09:04 crc kubenswrapper[4790]: I0406 12:09:04.201312 4790 generic.go:334] "Generic (PLEG): container finished" podID="3f04c31653fd2d52d145a959c922a0d3" containerID="f87355fe4942c61cbf824e03627f74e7c4e65141b531adc3de7bb8fc2dc28749" exitCode=255 Apr 06 12:09:04 crc kubenswrapper[4790]: I0406 12:09:04.201347 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerDied","Data":"f87355fe4942c61cbf824e03627f74e7c4e65141b531adc3de7bb8fc2dc28749"} Apr 06 12:09:04 crc kubenswrapper[4790]: I0406 12:09:04.201599 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:09:04 crc kubenswrapper[4790]: I0406 12:09:04.201620 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:09:04 crc kubenswrapper[4790]: I0406 12:09:04.205191 4790 scope.go:117] "RemoveContainer" 
containerID="f87355fe4942c61cbf824e03627f74e7c4e65141b531adc3de7bb8fc2dc28749" Apr 06 12:09:05 crc kubenswrapper[4790]: I0406 12:09:05.209396 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3f04c31653fd2d52d145a959c922a0d3/kube-apiserver-check-endpoints/0.log" Apr 06 12:09:05 crc kubenswrapper[4790]: I0406 12:09:05.211690 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3f04c31653fd2d52d145a959c922a0d3","Type":"ContainerStarted","Data":"f1b55e391f764dc362f03e159c593b32ad577bbaa43655b30d953fac9972774d"} Apr 06 12:09:05 crc kubenswrapper[4790]: I0406 12:09:05.212036 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:09:05 crc kubenswrapper[4790]: I0406 12:09:05.212077 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:09:05 crc kubenswrapper[4790]: I0406 12:09:05.212104 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:09:05 crc kubenswrapper[4790]: I0406 12:09:05.744976 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:09:05 crc kubenswrapper[4790]: I0406 12:09:05.745265 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:09:05 crc kubenswrapper[4790]: I0406 12:09:05.755579 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:09:06 crc kubenswrapper[4790]: I0406 12:09:06.218131 4790 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:09:06 crc kubenswrapper[4790]: I0406 12:09:06.218160 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8b1f842b-90c7-464a-95e9-795f08463c51" Apr 06 12:09:06 crc kubenswrapper[4790]: I0406 12:09:06.674693 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:09:06 crc kubenswrapper[4790]: I0406 12:09:06.676467 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:09:06 crc kubenswrapper[4790]: I0406 12:09:06.731543 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Apr 06 12:09:07 crc kubenswrapper[4790]: I0406 12:09:07.225515 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" event={"ID":"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c","Type":"ContainerStarted","Data":"1fcdec64d2d67c2a74fd5cd714a53d2a66fff0be1a802abe656f2de5afe98663"} Apr 06 12:09:08 crc kubenswrapper[4790]: I0406 12:09:08.239995 4790 generic.go:334] "Generic (PLEG): container finished" podID="56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" containerID="939f2cb30c61f6a728950bba7dceb73916c11837585037b6d3525e6faa8334bb" exitCode=0 Apr 06 12:09:08 crc kubenswrapper[4790]: I0406 12:09:08.240098 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" event={"ID":"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c","Type":"ContainerDied","Data":"939f2cb30c61f6a728950bba7dceb73916c11837585037b6d3525e6faa8334bb"} Apr 06 12:09:10 crc kubenswrapper[4790]: I0406 
12:09:10.255048 4790 generic.go:334] "Generic (PLEG): container finished" podID="56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" containerID="629cd0ed5f16ada337995bc0d2dff43b942a1a4e3bd2432d919a2b1902d6536f" exitCode=0 Apr 06 12:09:10 crc kubenswrapper[4790]: I0406 12:09:10.255081 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" event={"ID":"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c","Type":"ContainerDied","Data":"629cd0ed5f16ada337995bc0d2dff43b942a1a4e3bd2432d919a2b1902d6536f"} Apr 06 12:09:11 crc kubenswrapper[4790]: I0406 12:09:11.262382 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/extract/0.log" Apr 06 12:09:11 crc kubenswrapper[4790]: I0406 12:09:11.263396 4790 generic.go:334] "Generic (PLEG): container finished" podID="56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" containerID="08c08d401aa89543437aedc537fe361416495be727dfc7576decfeaf31da02ae" exitCode=1 Apr 06 12:09:11 crc kubenswrapper[4790]: I0406 12:09:11.263439 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" event={"ID":"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c","Type":"ContainerDied","Data":"08c08d401aa89543437aedc537fe361416495be727dfc7576decfeaf31da02ae"} Apr 06 12:09:11 crc kubenswrapper[4790]: I0406 12:09:11.689367 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="3f04c31653fd2d52d145a959c922a0d3" podUID="c25576a8-16f1-49d2-8b94-6e44ef17dab2" Apr 06 12:09:12 crc kubenswrapper[4790]: I0406 12:09:12.496184 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/extract/0.log" Apr 06 12:09:12 crc kubenswrapper[4790]: I0406 12:09:12.497254 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:09:12 crc kubenswrapper[4790]: I0406 12:09:12.639141 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2w4r\" (UniqueName: \"kubernetes.io/projected/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-kube-api-access-f2w4r\") pod \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\" (UID: \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\") " Apr 06 12:09:12 crc kubenswrapper[4790]: I0406 12:09:12.639243 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-bundle\") pod \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\" (UID: \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\") " Apr 06 12:09:12 crc kubenswrapper[4790]: I0406 12:09:12.639363 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-util\") pod \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\" (UID: \"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c\") " Apr 06 12:09:12 crc kubenswrapper[4790]: I0406 12:09:12.641954 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-bundle" (OuterVolumeSpecName: "bundle") pod "56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" (UID: "56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:09:12 crc kubenswrapper[4790]: I0406 12:09:12.644000 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-kube-api-access-f2w4r" (OuterVolumeSpecName: "kube-api-access-f2w4r") pod "56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" (UID: "56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c"). InnerVolumeSpecName "kube-api-access-f2w4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:09:12 crc kubenswrapper[4790]: I0406 12:09:12.740865 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:09:12 crc kubenswrapper[4790]: I0406 12:09:12.740930 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2w4r\" (UniqueName: \"kubernetes.io/projected/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-kube-api-access-f2w4r\") on node \"crc\" DevicePath \"\"" Apr 06 12:09:12 crc kubenswrapper[4790]: I0406 12:09:12.815444 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-util" (OuterVolumeSpecName: "util") pod "56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" (UID: "56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:09:12 crc kubenswrapper[4790]: I0406 12:09:12.842088 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c-util\") on node \"crc\" DevicePath \"\"" Apr 06 12:09:12 crc kubenswrapper[4790]: I0406 12:09:12.853988 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Apr 06 12:09:13 crc kubenswrapper[4790]: I0406 12:09:13.278093 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/extract/0.log" Apr 06 12:09:13 crc kubenswrapper[4790]: I0406 12:09:13.279414 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" event={"ID":"56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c","Type":"ContainerDied","Data":"1fcdec64d2d67c2a74fd5cd714a53d2a66fff0be1a802abe656f2de5afe98663"} Apr 06 12:09:13 crc kubenswrapper[4790]: I0406 12:09:13.279462 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fcdec64d2d67c2a74fd5cd714a53d2a66fff0be1a802abe656f2de5afe98663" Apr 06 12:09:13 crc kubenswrapper[4790]: I0406 12:09:13.279616 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw" Apr 06 12:09:13 crc kubenswrapper[4790]: I0406 12:09:13.789072 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Apr 06 12:09:14 crc kubenswrapper[4790]: I0406 12:09:14.622854 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Apr 06 12:09:14 crc kubenswrapper[4790]: I0406 12:09:14.971570 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Apr 06 12:09:14 crc kubenswrapper[4790]: I0406 12:09:14.992703 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Apr 06 12:09:15 crc kubenswrapper[4790]: I0406 12:09:15.245954 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Apr 06 12:09:15 crc kubenswrapper[4790]: I0406 12:09:15.389210 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Apr 06 12:09:15 crc kubenswrapper[4790]: I0406 12:09:15.578362 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Apr 06 12:09:15 crc kubenswrapper[4790]: I0406 12:09:15.642820 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Apr 06 12:09:15 crc kubenswrapper[4790]: I0406 12:09:15.718488 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Apr 06 12:09:15 crc kubenswrapper[4790]: I0406 12:09:15.751251 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Apr 06 12:09:15 crc kubenswrapper[4790]: I0406 12:09:15.866215 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 12:09:16.184120 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 12:09:16.206516 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 12:09:16.244023 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 12:09:16.329726 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 12:09:16.419699 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 12:09:16.509288 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 12:09:16.684606 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 12:09:16.713941 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 12:09:16.733146 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 
12:09:16.734002 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 12:09:16.740677 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 12:09:16.823782 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 12:09:16.909563 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 12:09:16.923714 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Apr 06 12:09:16 crc kubenswrapper[4790]: I0406 12:09:16.964104 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.013990 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.125496 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.152620 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.178064 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.283612 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.346816 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.390009 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.417707 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.485355 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.518346 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.573297 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.605179 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.622886 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.642047 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.653945 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Apr 06 12:09:17 crc kubenswrapper[4790]: I0406 12:09:17.992775 4790 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Apr 06 12:09:18 crc kubenswrapper[4790]: I0406 12:09:18.006303 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Apr 06 12:09:18 crc kubenswrapper[4790]: I0406 12:09:18.023900 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Apr 06 12:09:18 crc kubenswrapper[4790]: I0406 12:09:18.026441 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Apr 06 12:09:18 crc kubenswrapper[4790]: I0406 12:09:18.374406 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Apr 06 12:09:18 crc kubenswrapper[4790]: I0406 12:09:18.378646 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Apr 06 12:09:18 crc kubenswrapper[4790]: I0406 12:09:18.533142 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Apr 06 12:09:18 crc kubenswrapper[4790]: I0406 12:09:18.637545 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Apr 06 12:09:18 crc kubenswrapper[4790]: I0406 12:09:18.686577 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Apr 06 12:09:18 crc kubenswrapper[4790]: I0406 12:09:18.761860 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Apr 06 12:09:18 crc kubenswrapper[4790]: I0406 12:09:18.777272 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Apr 06 12:09:18 crc 
kubenswrapper[4790]: I0406 12:09:18.816118 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Apr 06 12:09:18 crc kubenswrapper[4790]: I0406 12:09:18.863537 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Apr 06 12:09:18 crc kubenswrapper[4790]: I0406 12:09:18.908031 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Apr 06 12:09:18 crc kubenswrapper[4790]: I0406 12:09:18.973431 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.139223 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.175504 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.179090 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.197909 4790 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.291744 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.382344 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.417954 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Apr 06 12:09:19 crc 
kubenswrapper[4790]: I0406 12:09:19.475463 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.489106 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.648342 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.677873 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.710232 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.746815 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.768730 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.806946 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.810173 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.841353 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Apr 06 12:09:19 crc kubenswrapper[4790]: I0406 12:09:19.964861 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Apr 06 12:09:20 crc 
kubenswrapper[4790]: I0406 12:09:20.042148 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.084190 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.186097 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.215304 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.381421 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.394879 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.461577 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.463212 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.657749 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.748980 4790 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.752690 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
podStartSLOduration=37.752668214 podStartE2EDuration="37.752668214s" podCreationTimestamp="2026-04-06 12:08:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:09:03.174431209 +0000 UTC m=+722.162174075" watchObservedRunningTime="2026-04-06 12:09:20.752668214 +0000 UTC m=+739.740411090" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.755513 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.755560 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v","openshift-kube-apiserver/kube-apiserver-crc"] Apr 06 12:09:20 crc kubenswrapper[4790]: E0406 12:09:20.755770 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" containerName="installer" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.755788 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" containerName="installer" Apr 06 12:09:20 crc kubenswrapper[4790]: E0406 12:09:20.755806 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" containerName="extract" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.755814 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" containerName="extract" Apr 06 12:09:20 crc kubenswrapper[4790]: E0406 12:09:20.755829 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" containerName="util" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.755837 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" containerName="util" Apr 06 12:09:20 crc kubenswrapper[4790]: E0406 
12:09:20.755848 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" containerName="pull" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.755873 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" containerName="pull" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.756003 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c" containerName="extract" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.756025 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae18ba0c-922e-43b1-8a32-85e498925b20" containerName="installer" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.756959 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw"] Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.757105 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.759984 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.761899 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.771910 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.780764 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.784664 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.784647821 podStartE2EDuration="17.784647821s" podCreationTimestamp="2026-04-06 12:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:09:20.778951728 +0000 UTC m=+739.766694614" watchObservedRunningTime="2026-04-06 12:09:20.784647821 +0000 UTC m=+739.772390697" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.788485 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.832280 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.887087 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Apr 06 12:09:20 crc 
kubenswrapper[4790]: I0406 12:09:20.909649 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.935009 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.936398 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.941455 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9wfs\" (UniqueName: \"kubernetes.io/projected/4855f76c-a247-4c86-846c-ad5ecd18c434-kube-api-access-c9wfs\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v\" (UID: \"4855f76c-a247-4c86-846c-ad5ecd18c434\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.941504 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4855f76c-a247-4c86-846c-ad5ecd18c434-bundle\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v\" (UID: \"4855f76c-a247-4c86-846c-ad5ecd18c434\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.941542 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4855f76c-a247-4c86-846c-ad5ecd18c434-util\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v\" (UID: \"4855f76c-a247-4c86-846c-ad5ecd18c434\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:20 crc 
kubenswrapper[4790]: I0406 12:09:20.993203 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Apr 06 12:09:20 crc kubenswrapper[4790]: I0406 12:09:20.994700 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.043455 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9wfs\" (UniqueName: \"kubernetes.io/projected/4855f76c-a247-4c86-846c-ad5ecd18c434-kube-api-access-c9wfs\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v\" (UID: \"4855f76c-a247-4c86-846c-ad5ecd18c434\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.043568 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4855f76c-a247-4c86-846c-ad5ecd18c434-bundle\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v\" (UID: \"4855f76c-a247-4c86-846c-ad5ecd18c434\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.043631 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4855f76c-a247-4c86-846c-ad5ecd18c434-util\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v\" (UID: \"4855f76c-a247-4c86-846c-ad5ecd18c434\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.044373 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4855f76c-a247-4c86-846c-ad5ecd18c434-bundle\") pod 
\"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v\" (UID: \"4855f76c-a247-4c86-846c-ad5ecd18c434\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.044414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4855f76c-a247-4c86-846c-ad5ecd18c434-util\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v\" (UID: \"4855f76c-a247-4c86-846c-ad5ecd18c434\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.074558 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9wfs\" (UniqueName: \"kubernetes.io/projected/4855f76c-a247-4c86-846c-ad5ecd18c434-kube-api-access-c9wfs\") pod \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v\" (UID: \"4855f76c-a247-4c86-846c-ad5ecd18c434\") " pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.079870 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.080030 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.164492 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.176496 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.239025 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.254094 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.282492 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.331788 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.349859 4790 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.412364 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.489227 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Apr 06 12:09:21 crc 
kubenswrapper[4790]: I0406 12:09:21.513630 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.565162 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.642991 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.658088 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.684044 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.699347 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.702638 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.702754 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.728535 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.745000 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.934151 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Apr 06 
12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.946789 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Apr 06 12:09:21 crc kubenswrapper[4790]: I0406 12:09:21.958790 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.005053 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.035595 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.035751 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.049399 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.102385 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.197035 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.235913 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.249910 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.271122 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.336380 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.349464 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.434073 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.497169 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.542639 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.569063 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.573225 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.694346 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.835711 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Apr 06 12:09:22 crc kubenswrapper[4790]: I0406 12:09:22.922822 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Apr 06 12:09:22 crc 
kubenswrapper[4790]: I0406 12:09:22.926969 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Apr 06 12:09:23 crc kubenswrapper[4790]: I0406 12:09:23.109613 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Apr 06 12:09:23 crc kubenswrapper[4790]: I0406 12:09:23.193245 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Apr 06 12:09:23 crc kubenswrapper[4790]: I0406 12:09:23.201889 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Apr 06 12:09:23 crc kubenswrapper[4790]: I0406 12:09:23.234211 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Apr 06 12:09:23 crc kubenswrapper[4790]: I0406 12:09:23.269215 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Apr 06 12:09:23 crc kubenswrapper[4790]: I0406 12:09:23.473556 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Apr 06 12:09:23 crc kubenswrapper[4790]: I0406 12:09:23.517375 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Apr 06 12:09:23 crc kubenswrapper[4790]: I0406 12:09:23.571400 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Apr 06 12:09:23 crc kubenswrapper[4790]: I0406 12:09:23.725998 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Apr 06 12:09:23 crc kubenswrapper[4790]: I0406 12:09:23.787883 4790 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-cluster-version"/"kube-root-ca.crt" Apr 06 12:09:23 crc kubenswrapper[4790]: I0406 12:09:23.836549 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Apr 06 12:09:23 crc kubenswrapper[4790]: I0406 12:09:23.956241 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.142469 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.197654 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.253120 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.261499 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.308816 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.362001 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.383265 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.401813 4790 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.431642 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.468430 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.530148 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.699067 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.730050 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-z7hj5" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.894041 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.896103 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.985056 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Apr 06 12:09:24 crc kubenswrapper[4790]: I0406 12:09:24.990915 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.157536 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 
12:09:25.170230 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.189095 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.215195 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.243479 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.282680 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.367328 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.371595 4790 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.371972 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="aaec8d0ffd277c0e93001246672220ba" containerName="startup-monitor" containerID="cri-o://1c7d7cd2d4fac7c93d3413ae3e517975ee62020f57c1c145839b2f5daf2f8ccc" gracePeriod=5 Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.403966 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.439988 4790 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.535273 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.585814 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.729073 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.734406 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.751004 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.827877 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.893175 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.918187 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Apr 06 12:09:25 crc kubenswrapper[4790]: I0406 12:09:25.964922 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.005143 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 
12:09:26.011019 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.054692 4790 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.113635 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.153176 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.179194 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.235373 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.269277 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v"] Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.301207 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.332895 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.384801 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.396744 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-config" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.397142 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.419890 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.514269 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.522683 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.577343 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.580180 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.584467 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.647239 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.655248 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.687062 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Apr 06 12:09:26 crc 
kubenswrapper[4790]: E0406 12:09:26.708100 4790 log.go:32] "RunPodSandbox from runtime service failed" err=< Apr 06 12:09:26 crc kubenswrapper[4790]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_openshift-marketplace_4855f76c-a247-4c86-846c-ad5ecd18c434_0(dba4854542ab0f4d972b0e62ce35604d85dc837813ae772aec07fd5fd8a7b51b): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"dba4854542ab0f4d972b0e62ce35604d85dc837813ae772aec07fd5fd8a7b51b" Netns:"/var/run/netns/1470572f-eb18-4c68-8b6a-a9e4c3dab09e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v;K8S_POD_INFRA_CONTAINER_ID=dba4854542ab0f4d972b0e62ce35604d85dc837813ae772aec07fd5fd8a7b51b;K8S_POD_UID=4855f76c-a247-4c86-846c-ad5ecd18c434" Path:"" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v] networking: Multus: [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v/4855f76c-a247-4c86-846c-ad5ecd18c434]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v in out of cluster comm: pod "ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" not found Apr 06 12:09:26 crc kubenswrapper[4790]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Apr 06 12:09:26 crc kubenswrapper[4790]: > Apr 06 12:09:26 crc kubenswrapper[4790]: E0406 12:09:26.708162 4790 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Apr 06 12:09:26 crc kubenswrapper[4790]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_openshift-marketplace_4855f76c-a247-4c86-846c-ad5ecd18c434_0(dba4854542ab0f4d972b0e62ce35604d85dc837813ae772aec07fd5fd8a7b51b): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"dba4854542ab0f4d972b0e62ce35604d85dc837813ae772aec07fd5fd8a7b51b" Netns:"/var/run/netns/1470572f-eb18-4c68-8b6a-a9e4c3dab09e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v;K8S_POD_INFRA_CONTAINER_ID=dba4854542ab0f4d972b0e62ce35604d85dc837813ae772aec07fd5fd8a7b51b;K8S_POD_UID=4855f76c-a247-4c86-846c-ad5ecd18c434" Path:"" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v] networking: Multus: [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v/4855f76c-a247-4c86-846c-ad5ecd18c434]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v in 
out of cluster comm: pod "ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" not found Apr 06 12:09:26 crc kubenswrapper[4790]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Apr 06 12:09:26 crc kubenswrapper[4790]: > pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:26 crc kubenswrapper[4790]: E0406 12:09:26.708184 4790 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Apr 06 12:09:26 crc kubenswrapper[4790]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_openshift-marketplace_4855f76c-a247-4c86-846c-ad5ecd18c434_0(dba4854542ab0f4d972b0e62ce35604d85dc837813ae772aec07fd5fd8a7b51b): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"dba4854542ab0f4d972b0e62ce35604d85dc837813ae772aec07fd5fd8a7b51b" Netns:"/var/run/netns/1470572f-eb18-4c68-8b6a-a9e4c3dab09e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v;K8S_POD_INFRA_CONTAINER_ID=dba4854542ab0f4d972b0e62ce35604d85dc837813ae772aec07fd5fd8a7b51b;K8S_POD_UID=4855f76c-a247-4c86-846c-ad5ecd18c434" Path:"" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v] networking: Multus: 
[openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v/4855f76c-a247-4c86-846c-ad5ecd18c434]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v in out of cluster comm: pod "ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" not found Apr 06 12:09:26 crc kubenswrapper[4790]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Apr 06 12:09:26 crc kubenswrapper[4790]: > pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:26 crc kubenswrapper[4790]: E0406 12:09:26.708243 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_openshift-marketplace(4855f76c-a247-4c86-846c-ad5ecd18c434)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_openshift-marketplace(4855f76c-a247-4c86-846c-ad5ecd18c434)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_openshift-marketplace_4855f76c-a247-4c86-846c-ad5ecd18c434_0(dba4854542ab0f4d972b0e62ce35604d85dc837813ae772aec07fd5fd8a7b51b): error adding pod openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:\\\"dba4854542ab0f4d972b0e62ce35604d85dc837813ae772aec07fd5fd8a7b51b\\\" Netns:\\\"/var/run/netns/1470572f-eb18-4c68-8b6a-a9e4c3dab09e\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v;K8S_POD_INFRA_CONTAINER_ID=dba4854542ab0f4d972b0e62ce35604d85dc837813ae772aec07fd5fd8a7b51b;K8S_POD_UID=4855f76c-a247-4c86-846c-ad5ecd18c434\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v] networking: Multus: [openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v/4855f76c-a247-4c86-846c-ad5ecd18c434]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v in out of cluster comm: pod \\\"ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" podUID="4855f76c-a247-4c86-846c-ad5ecd18c434" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.719469 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.726667 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-login" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.729471 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.751186 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-tqjjd" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.884535 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Apr 06 12:09:26 crc kubenswrapper[4790]: I0406 12:09:26.890796 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.147084 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.201800 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.257174 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.341405 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.392568 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.393200 4790 generic.go:334] "Generic (PLEG): container finished" podID="7b2dea07-951f-4a31-ae96-5465449fbae8" 
containerID="7a8df8ef8fdb7a60b04cfb580ca2143c0e08077cf541f0a0a0ff82dea8cbe9cf" exitCode=1 Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.393346 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.393329 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-jmqjq" event={"ID":"7b2dea07-951f-4a31-ae96-5465449fbae8","Type":"ContainerDied","Data":"7a8df8ef8fdb7a60b04cfb580ca2143c0e08077cf541f0a0a0ff82dea8cbe9cf"} Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.393944 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.394586 4790 scope.go:117] "RemoveContainer" containerID="7a8df8ef8fdb7a60b04cfb580ca2143c0e08077cf541f0a0a0ff82dea8cbe9cf" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.424130 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.454141 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.532085 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.607542 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-gjczx" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.638365 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v"] Apr 06 12:09:27 crc kubenswrapper[4790]: W0406 12:09:27.650091 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4855f76c_a247_4c86_846c_ad5ecd18c434.slice/crio-0206fcf844de499248e1f823d5fff3df84052fc37ce213b240ea0c094e321028 WatchSource:0}: Error finding container 0206fcf844de499248e1f823d5fff3df84052fc37ce213b240ea0c094e321028: Status 404 returned error can't find the container with id 0206fcf844de499248e1f823d5fff3df84052fc37ce213b240ea0c094e321028 Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.693932 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.799062 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.815053 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.856992 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.862513 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.882969 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.928926 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Apr 06 12:09:27 crc kubenswrapper[4790]: I0406 12:09:27.936616 4790 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.008055 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.133418 4790 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.165569 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.231955 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.293691 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.400069 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.402090 4790 generic.go:334] "Generic (PLEG): container finished" podID="ec02624d-3d5a-423d-818a-1422646a42a9" containerID="cfc2179178df0d70055695864cb18ddabe5a00a0afe65809e20d2bafb1cecd61" exitCode=1 Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.402157 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n5t6m" event={"ID":"ec02624d-3d5a-423d-818a-1422646a42a9","Type":"ContainerDied","Data":"cfc2179178df0d70055695864cb18ddabe5a00a0afe65809e20d2bafb1cecd61"} Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.403273 4790 scope.go:117] "RemoveContainer" containerID="cfc2179178df0d70055695864cb18ddabe5a00a0afe65809e20d2bafb1cecd61" Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 
12:09:28.406390 4790 generic.go:334] "Generic (PLEG): container finished" podID="4855f76c-a247-4c86-846c-ad5ecd18c434" containerID="d43b87b5c772fd3b506392e396049271a1a5925b4f294491cfb7e8aee07aeabb" exitCode=0 Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.406473 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" event={"ID":"4855f76c-a247-4c86-846c-ad5ecd18c434","Type":"ContainerDied","Data":"d43b87b5c772fd3b506392e396049271a1a5925b4f294491cfb7e8aee07aeabb"} Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.406507 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" event={"ID":"4855f76c-a247-4c86-846c-ad5ecd18c434","Type":"ContainerStarted","Data":"0206fcf844de499248e1f823d5fff3df84052fc37ce213b240ea0c094e321028"} Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.415388 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-jmqjq" event={"ID":"7b2dea07-951f-4a31-ae96-5465449fbae8","Type":"ContainerStarted","Data":"88d4ba9067cffdfd5ee6d57dfe2b5cd164b32ff66e68be27fbb2a92b45543079"} Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.460063 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.510765 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.554329 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.574056 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Apr 06 12:09:28 crc 
kubenswrapper[4790]: I0406 12:09:28.724867 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Apr 06 12:09:28 crc kubenswrapper[4790]: I0406 12:09:28.840985 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Apr 06 12:09:29 crc kubenswrapper[4790]: I0406 12:09:29.249974 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Apr 06 12:09:29 crc kubenswrapper[4790]: I0406 12:09:29.282633 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Apr 06 12:09:29 crc kubenswrapper[4790]: I0406 12:09:29.290747 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Apr 06 12:09:29 crc kubenswrapper[4790]: I0406 12:09:29.369626 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Apr 06 12:09:29 crc kubenswrapper[4790]: I0406 12:09:29.422194 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n5t6m" event={"ID":"ec02624d-3d5a-423d-818a-1422646a42a9","Type":"ContainerStarted","Data":"4c80f525862d387123816b0a12ef7a12da278258c46acc28d7cf7f8a572f5d25"} Apr 06 12:09:29 crc kubenswrapper[4790]: I0406 12:09:29.426515 4790 generic.go:334] "Generic (PLEG): container finished" podID="4855f76c-a247-4c86-846c-ad5ecd18c434" containerID="68236bc4f53876e55c19324444d282b84e33807542fe80d353daa88c75e89cce" exitCode=0 Apr 06 12:09:29 crc kubenswrapper[4790]: I0406 12:09:29.426549 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" 
event={"ID":"4855f76c-a247-4c86-846c-ad5ecd18c434","Type":"ContainerDied","Data":"68236bc4f53876e55c19324444d282b84e33807542fe80d353daa88c75e89cce"} Apr 06 12:09:29 crc kubenswrapper[4790]: I0406 12:09:29.562660 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Apr 06 12:09:29 crc kubenswrapper[4790]: I0406 12:09:29.563470 4790 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Apr 06 12:09:29 crc kubenswrapper[4790]: I0406 12:09:29.587657 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Apr 06 12:09:29 crc kubenswrapper[4790]: I0406 12:09:29.670445 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Apr 06 12:09:29 crc kubenswrapper[4790]: I0406 12:09:29.751422 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Apr 06 12:09:30 crc kubenswrapper[4790]: I0406 12:09:30.174966 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Apr 06 12:09:30 crc kubenswrapper[4790]: I0406 12:09:30.437737 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_aaec8d0ffd277c0e93001246672220ba/startup-monitor/0.log" Apr 06 12:09:30 crc kubenswrapper[4790]: I0406 12:09:30.437837 4790 generic.go:334] "Generic (PLEG): container finished" podID="aaec8d0ffd277c0e93001246672220ba" containerID="1c7d7cd2d4fac7c93d3413ae3e517975ee62020f57c1c145839b2f5daf2f8ccc" exitCode=137 Apr 06 12:09:30 crc kubenswrapper[4790]: I0406 12:09:30.440619 4790 generic.go:334] "Generic (PLEG): container finished" podID="4855f76c-a247-4c86-846c-ad5ecd18c434" containerID="055e7f546d568ca4072e6a5df2eea5015d2e54dfdab329a85a45cde4e3880aa0" exitCode=0 Apr 06 12:09:30 crc 
kubenswrapper[4790]: I0406 12:09:30.440697 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" event={"ID":"4855f76c-a247-4c86-846c-ad5ecd18c434","Type":"ContainerDied","Data":"055e7f546d568ca4072e6a5df2eea5015d2e54dfdab329a85a45cde4e3880aa0"} Apr 06 12:09:30 crc kubenswrapper[4790]: I0406 12:09:30.481241 4790 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 06 12:09:30 crc kubenswrapper[4790]: I0406 12:09:30.485081 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Apr 06 12:09:30 crc kubenswrapper[4790]: I0406 12:09:30.622691 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Apr 06 12:09:30 crc kubenswrapper[4790]: I0406 12:09:30.946246 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_aaec8d0ffd277c0e93001246672220ba/startup-monitor/0.log" Apr 06 12:09:30 crc kubenswrapper[4790]: I0406 12:09:30.946336 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.100201 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-lock\") pod \"aaec8d0ffd277c0e93001246672220ba\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.100326 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-lock" (OuterVolumeSpecName: "var-lock") pod "aaec8d0ffd277c0e93001246672220ba" (UID: "aaec8d0ffd277c0e93001246672220ba"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.100345 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-log\") pod \"aaec8d0ffd277c0e93001246672220ba\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.100374 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-log" (OuterVolumeSpecName: "var-log") pod "aaec8d0ffd277c0e93001246672220ba" (UID: "aaec8d0ffd277c0e93001246672220ba"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.100468 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-resource-dir\") pod \"aaec8d0ffd277c0e93001246672220ba\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.100533 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "aaec8d0ffd277c0e93001246672220ba" (UID: "aaec8d0ffd277c0e93001246672220ba"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.100583 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-manifests\") pod \"aaec8d0ffd277c0e93001246672220ba\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.100669 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-pod-resource-dir\") pod \"aaec8d0ffd277c0e93001246672220ba\" (UID: \"aaec8d0ffd277c0e93001246672220ba\") " Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.100812 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-manifests" (OuterVolumeSpecName: "manifests") pod "aaec8d0ffd277c0e93001246672220ba" (UID: "aaec8d0ffd277c0e93001246672220ba"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.101208 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.101245 4790 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-manifests\") on node \"crc\" DevicePath \"\"" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.101270 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-lock\") on node \"crc\" DevicePath \"\"" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.101289 4790 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-var-log\") on node \"crc\" DevicePath \"\"" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.113315 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "aaec8d0ffd277c0e93001246672220ba" (UID: "aaec8d0ffd277c0e93001246672220ba"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.160386 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.203587 4790 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/aaec8d0ffd277c0e93001246672220ba-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.446158 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.481703 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_aaec8d0ffd277c0e93001246672220ba/startup-monitor/0.log" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.482983 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.483931 4790 scope.go:117] "RemoveContainer" containerID="1c7d7cd2d4fac7c93d3413ae3e517975ee62020f57c1c145839b2f5daf2f8ccc" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.537730 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.703234 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaec8d0ffd277c0e93001246672220ba" path="/var/lib/kubelet/pods/aaec8d0ffd277c0e93001246672220ba/volumes" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.703482 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.714809 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.714980 4790 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="d3e6c557-f81d-4b84-b09d-0468a7f71f0f" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.719031 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.719060 4790 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="d3e6c557-f81d-4b84-b09d-0468a7f71f0f" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.758541 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.771031 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.931679 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4855f76c-a247-4c86-846c-ad5ecd18c434-bundle\") pod \"4855f76c-a247-4c86-846c-ad5ecd18c434\" (UID: \"4855f76c-a247-4c86-846c-ad5ecd18c434\") " Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.931738 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4855f76c-a247-4c86-846c-ad5ecd18c434-util\") pod \"4855f76c-a247-4c86-846c-ad5ecd18c434\" (UID: \"4855f76c-a247-4c86-846c-ad5ecd18c434\") " Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.931816 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9wfs\" (UniqueName: \"kubernetes.io/projected/4855f76c-a247-4c86-846c-ad5ecd18c434-kube-api-access-c9wfs\") pod \"4855f76c-a247-4c86-846c-ad5ecd18c434\" (UID: \"4855f76c-a247-4c86-846c-ad5ecd18c434\") " Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.935327 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4855f76c-a247-4c86-846c-ad5ecd18c434-kube-api-access-c9wfs" (OuterVolumeSpecName: "kube-api-access-c9wfs") pod "4855f76c-a247-4c86-846c-ad5ecd18c434" (UID: "4855f76c-a247-4c86-846c-ad5ecd18c434"). InnerVolumeSpecName "kube-api-access-c9wfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.937487 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4855f76c-a247-4c86-846c-ad5ecd18c434-bundle" (OuterVolumeSpecName: "bundle") pod "4855f76c-a247-4c86-846c-ad5ecd18c434" (UID: "4855f76c-a247-4c86-846c-ad5ecd18c434"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:09:31 crc kubenswrapper[4790]: I0406 12:09:31.946543 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4855f76c-a247-4c86-846c-ad5ecd18c434-util" (OuterVolumeSpecName: "util") pod "4855f76c-a247-4c86-846c-ad5ecd18c434" (UID: "4855f76c-a247-4c86-846c-ad5ecd18c434"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:09:32 crc kubenswrapper[4790]: I0406 12:09:32.033089 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9wfs\" (UniqueName: \"kubernetes.io/projected/4855f76c-a247-4c86-846c-ad5ecd18c434-kube-api-access-c9wfs\") on node \"crc\" DevicePath \"\"" Apr 06 12:09:32 crc kubenswrapper[4790]: I0406 12:09:32.033125 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4855f76c-a247-4c86-846c-ad5ecd18c434-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:09:32 crc kubenswrapper[4790]: I0406 12:09:32.033138 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4855f76c-a247-4c86-846c-ad5ecd18c434-util\") on node \"crc\" DevicePath \"\"" Apr 06 12:09:32 crc kubenswrapper[4790]: I0406 12:09:32.495660 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" 
event={"ID":"4855f76c-a247-4c86-846c-ad5ecd18c434","Type":"ContainerDied","Data":"0206fcf844de499248e1f823d5fff3df84052fc37ce213b240ea0c094e321028"} Apr 06 12:09:32 crc kubenswrapper[4790]: I0406 12:09:32.496073 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0206fcf844de499248e1f823d5fff3df84052fc37ce213b240ea0c094e321028" Apr 06 12:09:32 crc kubenswrapper[4790]: I0406 12:09:32.495736 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v" Apr 06 12:09:33 crc kubenswrapper[4790]: I0406 12:09:33.117445 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Apr 06 12:09:38 crc kubenswrapper[4790]: I0406 12:09:38.591316 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591282-x9mjb"] Apr 06 12:09:38 crc kubenswrapper[4790]: I0406 12:09:38.595007 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591282-x9mjb"] Apr 06 12:09:39 crc kubenswrapper[4790]: I0406 12:09:39.684569 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="799ced2c-8ccc-4ec8-85c2-c92617b654e8" path="/var/lib/kubelet/pods/799ced2c-8ccc-4ec8-85c2-c92617b654e8/volumes" Apr 06 12:09:59 crc kubenswrapper[4790]: I0406 12:09:59.488528 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sl7ql"] Apr 06 12:09:59 crc kubenswrapper[4790]: I0406 12:09:59.489369 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" containerID="cri-o://4409923626e612ffc7b380fc6de4bd255a919209617bed342959488e801a3b61" gracePeriod=120 Apr 06 12:09:59 crc kubenswrapper[4790]: I0406 
12:09:59.489424 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver-check-endpoints" containerID="cri-o://13ef529b123263722b72b6c938a9f72c16705c3288ab8cbafa95e6fac2a31a24" gracePeriod=120 Apr 06 12:09:59 crc kubenswrapper[4790]: I0406 12:09:59.694566 4790 generic.go:334] "Generic (PLEG): container finished" podID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerID="13ef529b123263722b72b6c938a9f72c16705c3288ab8cbafa95e6fac2a31a24" exitCode=0 Apr 06 12:09:59 crc kubenswrapper[4790]: I0406 12:09:59.694673 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" event={"ID":"426ca9b3-5c58-4b68-a29e-207207bf6897","Type":"ContainerDied","Data":"13ef529b123263722b72b6c938a9f72c16705c3288ab8cbafa95e6fac2a31a24"} Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.138766 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591290-mqqrk"] Apr 06 12:10:00 crc kubenswrapper[4790]: E0406 12:10:00.139235 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaec8d0ffd277c0e93001246672220ba" containerName="startup-monitor" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.139277 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaec8d0ffd277c0e93001246672220ba" containerName="startup-monitor" Apr 06 12:10:00 crc kubenswrapper[4790]: E0406 12:10:00.139293 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4855f76c-a247-4c86-846c-ad5ecd18c434" containerName="pull" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.139300 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4855f76c-a247-4c86-846c-ad5ecd18c434" containerName="pull" Apr 06 12:10:00 crc kubenswrapper[4790]: E0406 12:10:00.139309 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4855f76c-a247-4c86-846c-ad5ecd18c434" containerName="extract" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.139316 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4855f76c-a247-4c86-846c-ad5ecd18c434" containerName="extract" Apr 06 12:10:00 crc kubenswrapper[4790]: E0406 12:10:00.139328 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4855f76c-a247-4c86-846c-ad5ecd18c434" containerName="util" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.139333 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4855f76c-a247-4c86-846c-ad5ecd18c434" containerName="util" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.139425 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4855f76c-a247-4c86-846c-ad5ecd18c434" containerName="extract" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.139436 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaec8d0ffd277c0e93001246672220ba" containerName="startup-monitor" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.139859 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591290-mqqrk" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.143391 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.143411 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.143582 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.144579 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591290-mqqrk"] Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.216724 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw6k9\" (UniqueName: \"kubernetes.io/projected/84ecda30-93ff-4f79-b211-a1e22749a64f-kube-api-access-pw6k9\") pod \"auto-csr-approver-29591290-mqqrk\" (UID: \"84ecda30-93ff-4f79-b211-a1e22749a64f\") " pod="openshift-infra/auto-csr-approver-29591290-mqqrk" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.318410 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw6k9\" (UniqueName: \"kubernetes.io/projected/84ecda30-93ff-4f79-b211-a1e22749a64f-kube-api-access-pw6k9\") pod \"auto-csr-approver-29591290-mqqrk\" (UID: \"84ecda30-93ff-4f79-b211-a1e22749a64f\") " pod="openshift-infra/auto-csr-approver-29591290-mqqrk" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.337401 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw6k9\" (UniqueName: \"kubernetes.io/projected/84ecda30-93ff-4f79-b211-a1e22749a64f-kube-api-access-pw6k9\") pod \"auto-csr-approver-29591290-mqqrk\" (UID: \"84ecda30-93ff-4f79-b211-a1e22749a64f\") " 
pod="openshift-infra/auto-csr-approver-29591290-mqqrk" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.451973 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591290-mqqrk" Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.640030 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591290-mqqrk"] Apr 06 12:10:00 crc kubenswrapper[4790]: I0406 12:10:00.700878 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591290-mqqrk" event={"ID":"84ecda30-93ff-4f79-b211-a1e22749a64f","Type":"ContainerStarted","Data":"f901228b884d8c0a727e3a09c62b3754f1aec01b36fbe7e5ce29bd1145ced765"} Apr 06 12:10:02 crc kubenswrapper[4790]: I0406 12:10:02.513863 4790 scope.go:117] "RemoveContainer" containerID="7b680c55efccb036ee3a6bcfae81939fe913ce7c9bc8fbb565f257d0d8b401da" Apr 06 12:10:02 crc kubenswrapper[4790]: I0406 12:10:02.541995 4790 scope.go:117] "RemoveContainer" containerID="b89b77d040bf23625e045615f350e25ce436be016e21d565b40d53681aa9bc46" Apr 06 12:10:02 crc kubenswrapper[4790]: I0406 12:10:02.715991 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591290-mqqrk" event={"ID":"84ecda30-93ff-4f79-b211-a1e22749a64f","Type":"ContainerDied","Data":"fb1881b537ea1c340d1ef7a2c8bcfbf79af6e760ad216f4829c53d80dcf9f0ce"} Apr 06 12:10:02 crc kubenswrapper[4790]: I0406 12:10:02.716402 4790 generic.go:334] "Generic (PLEG): container finished" podID="84ecda30-93ff-4f79-b211-a1e22749a64f" containerID="fb1881b537ea1c340d1ef7a2c8bcfbf79af6e760ad216f4829c53d80dcf9f0ce" exitCode=0 Apr 06 12:10:04 crc kubenswrapper[4790]: I0406 12:10:04.034114 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591290-mqqrk" Apr 06 12:10:04 crc kubenswrapper[4790]: I0406 12:10:04.167785 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw6k9\" (UniqueName: \"kubernetes.io/projected/84ecda30-93ff-4f79-b211-a1e22749a64f-kube-api-access-pw6k9\") pod \"84ecda30-93ff-4f79-b211-a1e22749a64f\" (UID: \"84ecda30-93ff-4f79-b211-a1e22749a64f\") " Apr 06 12:10:04 crc kubenswrapper[4790]: I0406 12:10:04.179150 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84ecda30-93ff-4f79-b211-a1e22749a64f-kube-api-access-pw6k9" (OuterVolumeSpecName: "kube-api-access-pw6k9") pod "84ecda30-93ff-4f79-b211-a1e22749a64f" (UID: "84ecda30-93ff-4f79-b211-a1e22749a64f"). InnerVolumeSpecName "kube-api-access-pw6k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:10:04 crc kubenswrapper[4790]: I0406 12:10:04.270434 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw6k9\" (UniqueName: \"kubernetes.io/projected/84ecda30-93ff-4f79-b211-a1e22749a64f-kube-api-access-pw6k9\") on node \"crc\" DevicePath \"\"" Apr 06 12:10:04 crc kubenswrapper[4790]: I0406 12:10:04.281865 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]log ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]etcd excluded: ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]etcd-readiness excluded: ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]informer-sync ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Apr 06 12:10:04 crc kubenswrapper[4790]: 
[+]poststarthook/max-in-flight-filter ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Apr 06 12:10:04 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Apr 06 12:10:04 crc kubenswrapper[4790]: [-]shutdown failed: reason withheld Apr 06 12:10:04 crc kubenswrapper[4790]: readyz check failed Apr 06 12:10:04 crc kubenswrapper[4790]: I0406 12:10:04.281961 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 12:10:04 crc kubenswrapper[4790]: I0406 12:10:04.732308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591290-mqqrk" event={"ID":"84ecda30-93ff-4f79-b211-a1e22749a64f","Type":"ContainerDied","Data":"f901228b884d8c0a727e3a09c62b3754f1aec01b36fbe7e5ce29bd1145ced765"} Apr 06 12:10:04 crc kubenswrapper[4790]: I0406 12:10:04.732348 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f901228b884d8c0a727e3a09c62b3754f1aec01b36fbe7e5ce29bd1145ced765" Apr 06 12:10:04 crc kubenswrapper[4790]: I0406 
12:10:04.732347 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591290-mqqrk" Apr 06 12:10:05 crc kubenswrapper[4790]: I0406 12:10:05.090176 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591284-5wlfg"] Apr 06 12:10:05 crc kubenswrapper[4790]: I0406 12:10:05.094388 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591284-5wlfg"] Apr 06 12:10:05 crc kubenswrapper[4790]: I0406 12:10:05.687440 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5500e65f-18d1-4c5d-bf4e-9e3d55d85aca" path="/var/lib/kubelet/pods/5500e65f-18d1-4c5d-bf4e-9e3d55d85aca/volumes" Apr 06 12:10:09 crc kubenswrapper[4790]: I0406 12:10:09.273987 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]log ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]etcd excluded: ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]etcd-readiness excluded: ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]informer-sync ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Apr 06 12:10:09 
crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Apr 06 12:10:09 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Apr 06 12:10:09 crc kubenswrapper[4790]: [-]shutdown failed: reason withheld Apr 06 12:10:09 crc kubenswrapper[4790]: readyz check failed Apr 06 12:10:09 crc kubenswrapper[4790]: I0406 12:10:09.274358 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.562305 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rhjnx"] Apr 06 12:10:10 crc kubenswrapper[4790]: E0406 12:10:10.563646 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ecda30-93ff-4f79-b211-a1e22749a64f" containerName="oc" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.563779 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ecda30-93ff-4f79-b211-a1e22749a64f" containerName="oc" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.564951 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="84ecda30-93ff-4f79-b211-a1e22749a64f" containerName="oc" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.566254 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rhjnx" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.571428 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rhjnx"] Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.753900 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6jr6p"] Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.755181 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6jr6p" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.762309 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-988gl\" (UniqueName: \"kubernetes.io/projected/c3707401-54b1-47be-9b15-87b46677513b-kube-api-access-988gl\") pod \"community-operators-rhjnx\" (UID: \"c3707401-54b1-47be-9b15-87b46677513b\") " pod="openshift-marketplace/community-operators-rhjnx" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.762411 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3707401-54b1-47be-9b15-87b46677513b-utilities\") pod \"community-operators-rhjnx\" (UID: \"c3707401-54b1-47be-9b15-87b46677513b\") " pod="openshift-marketplace/community-operators-rhjnx" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.762482 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3707401-54b1-47be-9b15-87b46677513b-catalog-content\") pod \"community-operators-rhjnx\" (UID: \"c3707401-54b1-47be-9b15-87b46677513b\") " pod="openshift-marketplace/community-operators-rhjnx" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.765642 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-6jr6p"] Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.864800 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3707401-54b1-47be-9b15-87b46677513b-catalog-content\") pod \"community-operators-rhjnx\" (UID: \"c3707401-54b1-47be-9b15-87b46677513b\") " pod="openshift-marketplace/community-operators-rhjnx" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.864896 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-988gl\" (UniqueName: \"kubernetes.io/projected/c3707401-54b1-47be-9b15-87b46677513b-kube-api-access-988gl\") pod \"community-operators-rhjnx\" (UID: \"c3707401-54b1-47be-9b15-87b46677513b\") " pod="openshift-marketplace/community-operators-rhjnx" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.864951 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3707401-54b1-47be-9b15-87b46677513b-utilities\") pod \"community-operators-rhjnx\" (UID: \"c3707401-54b1-47be-9b15-87b46677513b\") " pod="openshift-marketplace/community-operators-rhjnx" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.864982 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tffv2\" (UniqueName: \"kubernetes.io/projected/4efd618c-f882-4443-a843-cc58798d24c6-kube-api-access-tffv2\") pod \"certified-operators-6jr6p\" (UID: \"4efd618c-f882-4443-a843-cc58798d24c6\") " pod="openshift-marketplace/certified-operators-6jr6p" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.865020 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efd618c-f882-4443-a843-cc58798d24c6-catalog-content\") pod \"certified-operators-6jr6p\" (UID: 
\"4efd618c-f882-4443-a843-cc58798d24c6\") " pod="openshift-marketplace/certified-operators-6jr6p" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.865053 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4efd618c-f882-4443-a843-cc58798d24c6-utilities\") pod \"certified-operators-6jr6p\" (UID: \"4efd618c-f882-4443-a843-cc58798d24c6\") " pod="openshift-marketplace/certified-operators-6jr6p" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.865344 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3707401-54b1-47be-9b15-87b46677513b-catalog-content\") pod \"community-operators-rhjnx\" (UID: \"c3707401-54b1-47be-9b15-87b46677513b\") " pod="openshift-marketplace/community-operators-rhjnx" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.865613 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3707401-54b1-47be-9b15-87b46677513b-utilities\") pod \"community-operators-rhjnx\" (UID: \"c3707401-54b1-47be-9b15-87b46677513b\") " pod="openshift-marketplace/community-operators-rhjnx" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.891368 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-988gl\" (UniqueName: \"kubernetes.io/projected/c3707401-54b1-47be-9b15-87b46677513b-kube-api-access-988gl\") pod \"community-operators-rhjnx\" (UID: \"c3707401-54b1-47be-9b15-87b46677513b\") " pod="openshift-marketplace/community-operators-rhjnx" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.966644 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4efd618c-f882-4443-a843-cc58798d24c6-utilities\") pod \"certified-operators-6jr6p\" (UID: \"4efd618c-f882-4443-a843-cc58798d24c6\") " 
pod="openshift-marketplace/certified-operators-6jr6p" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.966878 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tffv2\" (UniqueName: \"kubernetes.io/projected/4efd618c-f882-4443-a843-cc58798d24c6-kube-api-access-tffv2\") pod \"certified-operators-6jr6p\" (UID: \"4efd618c-f882-4443-a843-cc58798d24c6\") " pod="openshift-marketplace/certified-operators-6jr6p" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.966917 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efd618c-f882-4443-a843-cc58798d24c6-catalog-content\") pod \"certified-operators-6jr6p\" (UID: \"4efd618c-f882-4443-a843-cc58798d24c6\") " pod="openshift-marketplace/certified-operators-6jr6p" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.967351 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4efd618c-f882-4443-a843-cc58798d24c6-utilities\") pod \"certified-operators-6jr6p\" (UID: \"4efd618c-f882-4443-a843-cc58798d24c6\") " pod="openshift-marketplace/certified-operators-6jr6p" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.967424 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efd618c-f882-4443-a843-cc58798d24c6-catalog-content\") pod \"certified-operators-6jr6p\" (UID: \"4efd618c-f882-4443-a843-cc58798d24c6\") " pod="openshift-marketplace/certified-operators-6jr6p" Apr 06 12:10:10 crc kubenswrapper[4790]: I0406 12:10:10.983479 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tffv2\" (UniqueName: \"kubernetes.io/projected/4efd618c-f882-4443-a843-cc58798d24c6-kube-api-access-tffv2\") pod \"certified-operators-6jr6p\" (UID: \"4efd618c-f882-4443-a843-cc58798d24c6\") " 
pod="openshift-marketplace/certified-operators-6jr6p" Apr 06 12:10:11 crc kubenswrapper[4790]: I0406 12:10:11.139179 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6jr6p" Apr 06 12:10:11 crc kubenswrapper[4790]: I0406 12:10:11.185877 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rhjnx" Apr 06 12:10:11 crc kubenswrapper[4790]: I0406 12:10:11.428790 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6jr6p"] Apr 06 12:10:11 crc kubenswrapper[4790]: I0406 12:10:11.497562 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rhjnx"] Apr 06 12:10:11 crc kubenswrapper[4790]: W0406 12:10:11.508649 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3707401_54b1_47be_9b15_87b46677513b.slice/crio-f0daeab29f1e8eb97e3ac45c46968095795fefb7ce7b054273a150fd66af355b WatchSource:0}: Error finding container f0daeab29f1e8eb97e3ac45c46968095795fefb7ce7b054273a150fd66af355b: Status 404 returned error can't find the container with id f0daeab29f1e8eb97e3ac45c46968095795fefb7ce7b054273a150fd66af355b Apr 06 12:10:11 crc kubenswrapper[4790]: I0406 12:10:11.805789 4790 generic.go:334] "Generic (PLEG): container finished" podID="4efd618c-f882-4443-a843-cc58798d24c6" containerID="a30dd512339409121cb4760f44ad703caa01260ec380f9b29b4034e3cc0f5e5e" exitCode=0 Apr 06 12:10:11 crc kubenswrapper[4790]: I0406 12:10:11.805863 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jr6p" event={"ID":"4efd618c-f882-4443-a843-cc58798d24c6","Type":"ContainerDied","Data":"a30dd512339409121cb4760f44ad703caa01260ec380f9b29b4034e3cc0f5e5e"} Apr 06 12:10:11 crc kubenswrapper[4790]: I0406 12:10:11.805889 4790 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jr6p" event={"ID":"4efd618c-f882-4443-a843-cc58798d24c6","Type":"ContainerStarted","Data":"001c83959fc96953231bccb66cc982e521ee18749b478e1556b58039834bca79"} Apr 06 12:10:11 crc kubenswrapper[4790]: I0406 12:10:11.807509 4790 generic.go:334] "Generic (PLEG): container finished" podID="c3707401-54b1-47be-9b15-87b46677513b" containerID="fab6cd1c88d0a2e5bba4078a9eb43dcbb85be65923975800b4c3b3b190ae5696" exitCode=0 Apr 06 12:10:11 crc kubenswrapper[4790]: I0406 12:10:11.807540 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhjnx" event={"ID":"c3707401-54b1-47be-9b15-87b46677513b","Type":"ContainerDied","Data":"fab6cd1c88d0a2e5bba4078a9eb43dcbb85be65923975800b4c3b3b190ae5696"} Apr 06 12:10:11 crc kubenswrapper[4790]: I0406 12:10:11.807557 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhjnx" event={"ID":"c3707401-54b1-47be-9b15-87b46677513b","Type":"ContainerStarted","Data":"f0daeab29f1e8eb97e3ac45c46968095795fefb7ce7b054273a150fd66af355b"} Apr 06 12:10:12 crc kubenswrapper[4790]: I0406 12:10:12.816283 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jr6p" event={"ID":"4efd618c-f882-4443-a843-cc58798d24c6","Type":"ContainerStarted","Data":"681f817bfcee797dbfc0f0e70c95cc39f8cc408739b7cc846cd0fcaad38fecd0"} Apr 06 12:10:12 crc kubenswrapper[4790]: I0406 12:10:12.819493 4790 generic.go:334] "Generic (PLEG): container finished" podID="c3707401-54b1-47be-9b15-87b46677513b" containerID="aa8e309426fc38558f3e045326e3118615c10de3164040c1c9c0ee7ea1732f38" exitCode=0 Apr 06 12:10:12 crc kubenswrapper[4790]: I0406 12:10:12.819533 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhjnx" 
event={"ID":"c3707401-54b1-47be-9b15-87b46677513b","Type":"ContainerDied","Data":"aa8e309426fc38558f3e045326e3118615c10de3164040c1c9c0ee7ea1732f38"} Apr 06 12:10:12 crc kubenswrapper[4790]: I0406 12:10:12.965633 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n7gs7"] Apr 06 12:10:12 crc kubenswrapper[4790]: I0406 12:10:12.969106 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7gs7" Apr 06 12:10:12 crc kubenswrapper[4790]: I0406 12:10:12.977633 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7gs7"] Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.104760 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95mdx\" (UniqueName: \"kubernetes.io/projected/2147c3d0-ea51-4143-a242-5cded03e7aa8-kube-api-access-95mdx\") pod \"redhat-marketplace-n7gs7\" (UID: \"2147c3d0-ea51-4143-a242-5cded03e7aa8\") " pod="openshift-marketplace/redhat-marketplace-n7gs7" Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.104911 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2147c3d0-ea51-4143-a242-5cded03e7aa8-catalog-content\") pod \"redhat-marketplace-n7gs7\" (UID: \"2147c3d0-ea51-4143-a242-5cded03e7aa8\") " pod="openshift-marketplace/redhat-marketplace-n7gs7" Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.104970 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2147c3d0-ea51-4143-a242-5cded03e7aa8-utilities\") pod \"redhat-marketplace-n7gs7\" (UID: \"2147c3d0-ea51-4143-a242-5cded03e7aa8\") " pod="openshift-marketplace/redhat-marketplace-n7gs7" Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.155414 4790 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mqhc6"] Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.156664 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mqhc6" Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.179344 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mqhc6"] Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.207016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g9vc\" (UniqueName: \"kubernetes.io/projected/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-kube-api-access-5g9vc\") pod \"redhat-operators-mqhc6\" (UID: \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\") " pod="openshift-marketplace/redhat-operators-mqhc6" Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.207084 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95mdx\" (UniqueName: \"kubernetes.io/projected/2147c3d0-ea51-4143-a242-5cded03e7aa8-kube-api-access-95mdx\") pod \"redhat-marketplace-n7gs7\" (UID: \"2147c3d0-ea51-4143-a242-5cded03e7aa8\") " pod="openshift-marketplace/redhat-marketplace-n7gs7" Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.207133 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-catalog-content\") pod \"redhat-operators-mqhc6\" (UID: \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\") " pod="openshift-marketplace/redhat-operators-mqhc6" Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.207180 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2147c3d0-ea51-4143-a242-5cded03e7aa8-catalog-content\") pod \"redhat-marketplace-n7gs7\" (UID: 
\"2147c3d0-ea51-4143-a242-5cded03e7aa8\") " pod="openshift-marketplace/redhat-marketplace-n7gs7"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.207208 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-utilities\") pod \"redhat-operators-mqhc6\" (UID: \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\") " pod="openshift-marketplace/redhat-operators-mqhc6"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.207238 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2147c3d0-ea51-4143-a242-5cded03e7aa8-utilities\") pod \"redhat-marketplace-n7gs7\" (UID: \"2147c3d0-ea51-4143-a242-5cded03e7aa8\") " pod="openshift-marketplace/redhat-marketplace-n7gs7"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.207948 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2147c3d0-ea51-4143-a242-5cded03e7aa8-utilities\") pod \"redhat-marketplace-n7gs7\" (UID: \"2147c3d0-ea51-4143-a242-5cded03e7aa8\") " pod="openshift-marketplace/redhat-marketplace-n7gs7"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.208081 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2147c3d0-ea51-4143-a242-5cded03e7aa8-catalog-content\") pod \"redhat-marketplace-n7gs7\" (UID: \"2147c3d0-ea51-4143-a242-5cded03e7aa8\") " pod="openshift-marketplace/redhat-marketplace-n7gs7"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.231680 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95mdx\" (UniqueName: \"kubernetes.io/projected/2147c3d0-ea51-4143-a242-5cded03e7aa8-kube-api-access-95mdx\") pod \"redhat-marketplace-n7gs7\" (UID: \"2147c3d0-ea51-4143-a242-5cded03e7aa8\") " pod="openshift-marketplace/redhat-marketplace-n7gs7"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.287287 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7gs7"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.309116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g9vc\" (UniqueName: \"kubernetes.io/projected/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-kube-api-access-5g9vc\") pod \"redhat-operators-mqhc6\" (UID: \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\") " pod="openshift-marketplace/redhat-operators-mqhc6"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.309212 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-catalog-content\") pod \"redhat-operators-mqhc6\" (UID: \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\") " pod="openshift-marketplace/redhat-operators-mqhc6"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.309280 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-utilities\") pod \"redhat-operators-mqhc6\" (UID: \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\") " pod="openshift-marketplace/redhat-operators-mqhc6"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.309950 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-utilities\") pod \"redhat-operators-mqhc6\" (UID: \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\") " pod="openshift-marketplace/redhat-operators-mqhc6"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.310567 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-catalog-content\") pod \"redhat-operators-mqhc6\" (UID: \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\") " pod="openshift-marketplace/redhat-operators-mqhc6"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.325530 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g9vc\" (UniqueName: \"kubernetes.io/projected/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-kube-api-access-5g9vc\") pod \"redhat-operators-mqhc6\" (UID: \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\") " pod="openshift-marketplace/redhat-operators-mqhc6"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.475896 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mqhc6"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.493546 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7gs7"]
Apr 06 12:10:13 crc kubenswrapper[4790]: W0406 12:10:13.507923 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2147c3d0_ea51_4143_a242_5cded03e7aa8.slice/crio-a4ea45729ed5350f6800404b25befb9b9a30443cd0ace9539c9fdcd737698f05 WatchSource:0}: Error finding container a4ea45729ed5350f6800404b25befb9b9a30443cd0ace9539c9fdcd737698f05: Status 404 returned error can't find the container with id a4ea45729ed5350f6800404b25befb9b9a30443cd0ace9539c9fdcd737698f05
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.830285 4790 generic.go:334] "Generic (PLEG): container finished" podID="2147c3d0-ea51-4143-a242-5cded03e7aa8" containerID="36e2aee716adf26bb016189bd4e8acf3662664d07dc7de84253a260da5586ce5" exitCode=0
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.830344 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7gs7" event={"ID":"2147c3d0-ea51-4143-a242-5cded03e7aa8","Type":"ContainerDied","Data":"36e2aee716adf26bb016189bd4e8acf3662664d07dc7de84253a260da5586ce5"}
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.830574 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7gs7" event={"ID":"2147c3d0-ea51-4143-a242-5cded03e7aa8","Type":"ContainerStarted","Data":"a4ea45729ed5350f6800404b25befb9b9a30443cd0ace9539c9fdcd737698f05"}
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.835421 4790 generic.go:334] "Generic (PLEG): container finished" podID="4efd618c-f882-4443-a843-cc58798d24c6" containerID="681f817bfcee797dbfc0f0e70c95cc39f8cc408739b7cc846cd0fcaad38fecd0" exitCode=0
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.835488 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jr6p" event={"ID":"4efd618c-f882-4443-a843-cc58798d24c6","Type":"ContainerDied","Data":"681f817bfcee797dbfc0f0e70c95cc39f8cc408739b7cc846cd0fcaad38fecd0"}
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.838816 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhjnx" event={"ID":"c3707401-54b1-47be-9b15-87b46677513b","Type":"ContainerStarted","Data":"4b111a2ac085caea8e862b861a7162a335216a7c94fa17dd5547e07bd681f9e0"}
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.858950 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rhjnx" podStartSLOduration=2.410294767 podStartE2EDuration="3.858933742s" podCreationTimestamp="2026-04-06 12:10:10 +0000 UTC" firstStartedPulling="2026-04-06 12:10:11.809494821 +0000 UTC m=+790.797237687" lastFinishedPulling="2026-04-06 12:10:13.258133756 +0000 UTC m=+792.245876662" observedRunningTime="2026-04-06 12:10:13.857528204 +0000 UTC m=+792.845271070" watchObservedRunningTime="2026-04-06 12:10:13.858933742 +0000 UTC m=+792.846676608"
Apr 06 12:10:13 crc kubenswrapper[4790]: I0406 12:10:13.877692 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mqhc6"]
Apr 06 12:10:13 crc kubenswrapper[4790]: W0406 12:10:13.881648 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c342bf5_bb1e_43e0_a549_e73112fa4e4a.slice/crio-e9f2cf9e348cdd9f16e9e5a7839a90595e829a1fb1d8f3834c69b02fecd06e48 WatchSource:0}: Error finding container e9f2cf9e348cdd9f16e9e5a7839a90595e829a1fb1d8f3834c69b02fecd06e48: Status 404 returned error can't find the container with id e9f2cf9e348cdd9f16e9e5a7839a90595e829a1fb1d8f3834c69b02fecd06e48
Apr 06 12:10:14 crc kubenswrapper[4790]: I0406 12:10:14.271087 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]log ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]etcd excluded: ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]etcd-readiness excluded: ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]informer-sync ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Apr 06 12:10:14 crc kubenswrapper[4790]: [-]shutdown failed: reason withheld
Apr 06 12:10:14 crc kubenswrapper[4790]: readyz check failed
Apr 06 12:10:14 crc kubenswrapper[4790]: I0406 12:10:14.271158 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 06 12:10:14 crc kubenswrapper[4790]: I0406 12:10:14.271265 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql"
Apr 06 12:10:14 crc kubenswrapper[4790]: I0406 12:10:14.845442 4790 generic.go:334] "Generic (PLEG): container finished" podID="2147c3d0-ea51-4143-a242-5cded03e7aa8" containerID="5039e752e9e54f5677c91b0a5014118c1596eb393a678217f8db86f133c6ba8e" exitCode=0
Apr 06 12:10:14 crc kubenswrapper[4790]: I0406 12:10:14.845526 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7gs7" event={"ID":"2147c3d0-ea51-4143-a242-5cded03e7aa8","Type":"ContainerDied","Data":"5039e752e9e54f5677c91b0a5014118c1596eb393a678217f8db86f133c6ba8e"}
Apr 06 12:10:14 crc kubenswrapper[4790]: I0406 12:10:14.847863 4790 generic.go:334] "Generic (PLEG): container finished" podID="1c342bf5-bb1e-43e0-a549-e73112fa4e4a" containerID="d14c4701541cc7180a487771a8dd572b80dec406bc9596f6c74318e89cb22d71" exitCode=0
Apr 06 12:10:14 crc kubenswrapper[4790]: I0406 12:10:14.847951 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqhc6" event={"ID":"1c342bf5-bb1e-43e0-a549-e73112fa4e4a","Type":"ContainerDied","Data":"d14c4701541cc7180a487771a8dd572b80dec406bc9596f6c74318e89cb22d71"}
Apr 06 12:10:14 crc kubenswrapper[4790]: I0406 12:10:14.848133 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqhc6" event={"ID":"1c342bf5-bb1e-43e0-a549-e73112fa4e4a","Type":"ContainerStarted","Data":"e9f2cf9e348cdd9f16e9e5a7839a90595e829a1fb1d8f3834c69b02fecd06e48"}
Apr 06 12:10:14 crc kubenswrapper[4790]: I0406 12:10:14.851658 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jr6p" event={"ID":"4efd618c-f882-4443-a843-cc58798d24c6","Type":"ContainerStarted","Data":"3660bc4afcc3696de09bf4b9315bc8d0dbde8866be71a5cc8eeaeb302368a319"}
Apr 06 12:10:14 crc kubenswrapper[4790]: I0406 12:10:14.900891 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6jr6p" podStartSLOduration=2.525021617 podStartE2EDuration="4.900873511s" podCreationTimestamp="2026-04-06 12:10:10 +0000 UTC" firstStartedPulling="2026-04-06 12:10:11.807165529 +0000 UTC m=+790.794908395" lastFinishedPulling="2026-04-06 12:10:14.183017423 +0000 UTC m=+793.170760289" observedRunningTime="2026-04-06 12:10:14.899377061 +0000 UTC m=+793.887119947" watchObservedRunningTime="2026-04-06 12:10:14.900873511 +0000 UTC m=+793.888616377"
Apr 06 12:10:15 crc kubenswrapper[4790]: I0406 12:10:15.859752 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqhc6" event={"ID":"1c342bf5-bb1e-43e0-a549-e73112fa4e4a","Type":"ContainerStarted","Data":"806baa36d1c8e0b51c672da8230b56b688cae4e6007e2a02c0115150ded49133"}
Apr 06 12:10:15 crc kubenswrapper[4790]: I0406 12:10:15.862216 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7gs7" event={"ID":"2147c3d0-ea51-4143-a242-5cded03e7aa8","Type":"ContainerStarted","Data":"c4e20f5c8f56f0ff852d036487346ea149f46091693034720b9e6c7d67f4e90e"}
Apr 06 12:10:15 crc kubenswrapper[4790]: I0406 12:10:15.899123 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n7gs7" podStartSLOduration=2.508746146 podStartE2EDuration="3.899101632s" podCreationTimestamp="2026-04-06 12:10:12 +0000 UTC" firstStartedPulling="2026-04-06 12:10:13.831546464 +0000 UTC m=+792.819289330" lastFinishedPulling="2026-04-06 12:10:15.22190195 +0000 UTC m=+794.209644816" observedRunningTime="2026-04-06 12:10:15.895604418 +0000 UTC m=+794.883347294" watchObservedRunningTime="2026-04-06 12:10:15.899101632 +0000 UTC m=+794.886844498"
Apr 06 12:10:16 crc kubenswrapper[4790]: I0406 12:10:16.879032 4790 generic.go:334] "Generic (PLEG): container finished" podID="1c342bf5-bb1e-43e0-a549-e73112fa4e4a" containerID="806baa36d1c8e0b51c672da8230b56b688cae4e6007e2a02c0115150ded49133" exitCode=0
Apr 06 12:10:16 crc kubenswrapper[4790]: I0406 12:10:16.879095 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqhc6" event={"ID":"1c342bf5-bb1e-43e0-a549-e73112fa4e4a","Type":"ContainerDied","Data":"806baa36d1c8e0b51c672da8230b56b688cae4e6007e2a02c0115150ded49133"}
Apr 06 12:10:17 crc kubenswrapper[4790]: I0406 12:10:17.888155 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqhc6" event={"ID":"1c342bf5-bb1e-43e0-a549-e73112fa4e4a","Type":"ContainerStarted","Data":"70be9a2aae72bd7f22a9bf4cdb5dca5bbd95375e28476ebf06842c0f226f06f8"}
Apr 06 12:10:17 crc kubenswrapper[4790]: I0406 12:10:17.904136 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mqhc6" podStartSLOduration=2.404869909 podStartE2EDuration="4.904117397s" podCreationTimestamp="2026-04-06 12:10:13 +0000 UTC" firstStartedPulling="2026-04-06 12:10:14.849988351 +0000 UTC m=+793.837731217" lastFinishedPulling="2026-04-06 12:10:17.349235839 +0000 UTC m=+796.336978705" observedRunningTime="2026-04-06 12:10:17.903959593 +0000 UTC m=+796.891702479" watchObservedRunningTime="2026-04-06 12:10:17.904117397 +0000 UTC m=+796.891860263"
Apr 06 12:10:19 crc kubenswrapper[4790]: I0406 12:10:19.271692 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]log ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]etcd excluded: ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]etcd-readiness excluded: ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]informer-sync ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Apr 06 12:10:19 crc kubenswrapper[4790]: [-]shutdown failed: reason withheld
Apr 06 12:10:19 crc kubenswrapper[4790]: readyz check failed
Apr 06 12:10:19 crc kubenswrapper[4790]: I0406 12:10:19.272200 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 06 12:10:21 crc kubenswrapper[4790]: I0406 12:10:21.140018 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6jr6p"
Apr 06 12:10:21 crc kubenswrapper[4790]: I0406 12:10:21.140120 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6jr6p"
Apr 06 12:10:21 crc kubenswrapper[4790]: I0406 12:10:21.185984 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rhjnx"
Apr 06 12:10:21 crc kubenswrapper[4790]: I0406 12:10:21.186062 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rhjnx"
Apr 06 12:10:21 crc kubenswrapper[4790]: I0406 12:10:21.198960 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6jr6p"
Apr 06 12:10:21 crc kubenswrapper[4790]: I0406 12:10:21.235404 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rhjnx"
Apr 06 12:10:21 crc kubenswrapper[4790]: I0406 12:10:21.961903 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6jr6p"
Apr 06 12:10:21 crc kubenswrapper[4790]: I0406 12:10:21.964680 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rhjnx"
Apr 06 12:10:23 crc kubenswrapper[4790]: I0406 12:10:23.287630 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n7gs7"
Apr 06 12:10:23 crc kubenswrapper[4790]: I0406 12:10:23.287975 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n7gs7"
Apr 06 12:10:23 crc kubenswrapper[4790]: I0406 12:10:23.323879 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n7gs7"
Apr 06 12:10:23 crc kubenswrapper[4790]: I0406 12:10:23.476786 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mqhc6"
Apr 06 12:10:23 crc kubenswrapper[4790]: I0406 12:10:23.476866 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mqhc6"
Apr 06 12:10:23 crc kubenswrapper[4790]: I0406 12:10:23.977504 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n7gs7"
Apr 06 12:10:24 crc kubenswrapper[4790]: I0406 12:10:24.272517 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]log ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]etcd excluded: ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]etcd-readiness excluded: ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]informer-sync ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Apr 06 12:10:24 crc kubenswrapper[4790]: [-]shutdown failed: reason withheld
Apr 06 12:10:24 crc kubenswrapper[4790]: readyz check failed
Apr 06 12:10:24 crc kubenswrapper[4790]: I0406 12:10:24.272599 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 06 12:10:24 crc kubenswrapper[4790]: I0406 12:10:24.516436 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mqhc6" podUID="1c342bf5-bb1e-43e0-a549-e73112fa4e4a" containerName="registry-server" probeResult="failure" output=<
Apr 06 12:10:24 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s
Apr 06 12:10:24 crc kubenswrapper[4790]: >
Apr 06 12:10:29 crc kubenswrapper[4790]: I0406 12:10:29.273524 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]log ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]etcd excluded: ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]etcd-readiness excluded: ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]informer-sync ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Apr 06 12:10:29 crc kubenswrapper[4790]: [-]shutdown failed: reason withheld
Apr 06 12:10:29 crc kubenswrapper[4790]: readyz check failed
Apr 06 12:10:29 crc kubenswrapper[4790]: I0406 12:10:29.274036 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 06 12:10:33 crc kubenswrapper[4790]: I0406 12:10:33.543233 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mqhc6"
Apr 06 12:10:33 crc kubenswrapper[4790]: I0406 12:10:33.616033 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mqhc6"
Apr 06 12:10:34 crc kubenswrapper[4790]: I0406 12:10:34.270796 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]log ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]etcd excluded: ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]etcd-readiness excluded: ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]informer-sync ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Apr 06 12:10:34 crc kubenswrapper[4790]: [-]shutdown failed: reason withheld
Apr 06 12:10:34 crc kubenswrapper[4790]: readyz check failed
Apr 06 12:10:34 crc kubenswrapper[4790]: I0406 12:10:34.270870 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 06 12:10:35 crc kubenswrapper[4790]: I0406 12:10:35.638156 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6rz5v"]
Apr 06 12:10:35 crc kubenswrapper[4790]: I0406 12:10:35.639456 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6rz5v"
Apr 06 12:10:35 crc kubenswrapper[4790]: I0406 12:10:35.650780 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6rz5v"]
Apr 06 12:10:35 crc kubenswrapper[4790]: I0406 12:10:35.807791 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7781c53a-5217-44ce-9a47-671e951b9c7e-utilities\") pod \"redhat-operators-6rz5v\" (UID: \"7781c53a-5217-44ce-9a47-671e951b9c7e\") " pod="openshift-marketplace/redhat-operators-6rz5v"
Apr 06 12:10:35 crc kubenswrapper[4790]: I0406 12:10:35.807873 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54d8l\" (UniqueName: \"kubernetes.io/projected/7781c53a-5217-44ce-9a47-671e951b9c7e-kube-api-access-54d8l\") pod \"redhat-operators-6rz5v\" (UID: \"7781c53a-5217-44ce-9a47-671e951b9c7e\") " pod="openshift-marketplace/redhat-operators-6rz5v"
Apr 06 12:10:35 crc kubenswrapper[4790]: I0406 12:10:35.807890 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7781c53a-5217-44ce-9a47-671e951b9c7e-catalog-content\") pod \"redhat-operators-6rz5v\" (UID: \"7781c53a-5217-44ce-9a47-671e951b9c7e\") " pod="openshift-marketplace/redhat-operators-6rz5v"
Apr 06 12:10:35 crc kubenswrapper[4790]: I0406 12:10:35.909172 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7781c53a-5217-44ce-9a47-671e951b9c7e-utilities\") pod \"redhat-operators-6rz5v\" (UID: \"7781c53a-5217-44ce-9a47-671e951b9c7e\") " pod="openshift-marketplace/redhat-operators-6rz5v"
Apr 06 12:10:35 crc kubenswrapper[4790]: I0406 12:10:35.909235 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54d8l\" (UniqueName: \"kubernetes.io/projected/7781c53a-5217-44ce-9a47-671e951b9c7e-kube-api-access-54d8l\") pod \"redhat-operators-6rz5v\" (UID: \"7781c53a-5217-44ce-9a47-671e951b9c7e\") " pod="openshift-marketplace/redhat-operators-6rz5v"
Apr 06 12:10:35 crc kubenswrapper[4790]: I0406 12:10:35.909257 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7781c53a-5217-44ce-9a47-671e951b9c7e-catalog-content\") pod \"redhat-operators-6rz5v\" (UID: \"7781c53a-5217-44ce-9a47-671e951b9c7e\") " pod="openshift-marketplace/redhat-operators-6rz5v"
Apr 06 12:10:35 crc kubenswrapper[4790]: I0406 12:10:35.909717 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7781c53a-5217-44ce-9a47-671e951b9c7e-utilities\") pod \"redhat-operators-6rz5v\" (UID: \"7781c53a-5217-44ce-9a47-671e951b9c7e\") " pod="openshift-marketplace/redhat-operators-6rz5v"
Apr 06 12:10:35 crc kubenswrapper[4790]: I0406 12:10:35.909861 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7781c53a-5217-44ce-9a47-671e951b9c7e-catalog-content\") pod \"redhat-operators-6rz5v\" (UID: \"7781c53a-5217-44ce-9a47-671e951b9c7e\") " pod="openshift-marketplace/redhat-operators-6rz5v"
Apr 06 12:10:35 crc kubenswrapper[4790]: I0406 12:10:35.930895 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54d8l\" (UniqueName: \"kubernetes.io/projected/7781c53a-5217-44ce-9a47-671e951b9c7e-kube-api-access-54d8l\") pod \"redhat-operators-6rz5v\" (UID: \"7781c53a-5217-44ce-9a47-671e951b9c7e\") " pod="openshift-marketplace/redhat-operators-6rz5v"
Apr 06 12:10:35 crc kubenswrapper[4790]: I0406 12:10:35.954878 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6rz5v"
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.196064 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xwlmr"]
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.197339 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwlmr"
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.209689 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xwlmr"]
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.314980 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c18cf73-84b8-4a06-b942-4ec437124f79-catalog-content\") pod \"redhat-operators-xwlmr\" (UID: \"1c18cf73-84b8-4a06-b942-4ec437124f79\") " pod="openshift-marketplace/redhat-operators-xwlmr"
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.315034 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c18cf73-84b8-4a06-b942-4ec437124f79-utilities\") pod \"redhat-operators-xwlmr\" (UID: \"1c18cf73-84b8-4a06-b942-4ec437124f79\") " pod="openshift-marketplace/redhat-operators-xwlmr"
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.315063 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m65z7\" (UniqueName: \"kubernetes.io/projected/1c18cf73-84b8-4a06-b942-4ec437124f79-kube-api-access-m65z7\") pod \"redhat-operators-xwlmr\" (UID: \"1c18cf73-84b8-4a06-b942-4ec437124f79\") " pod="openshift-marketplace/redhat-operators-xwlmr"
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.358534 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6rz5v"]
Apr 06 12:10:36 crc kubenswrapper[4790]: W0406 12:10:36.360032 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7781c53a_5217_44ce_9a47_671e951b9c7e.slice/crio-6e834f6e7fe6fa4bd024b823ef687544495503871fd60b82b378452777d33172 WatchSource:0}: Error finding container 6e834f6e7fe6fa4bd024b823ef687544495503871fd60b82b378452777d33172: Status 404 returned error can't find the container with id 6e834f6e7fe6fa4bd024b823ef687544495503871fd60b82b378452777d33172
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.416071 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c18cf73-84b8-4a06-b942-4ec437124f79-catalog-content\") pod \"redhat-operators-xwlmr\" (UID: \"1c18cf73-84b8-4a06-b942-4ec437124f79\") " pod="openshift-marketplace/redhat-operators-xwlmr"
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.416132 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c18cf73-84b8-4a06-b942-4ec437124f79-utilities\") pod \"redhat-operators-xwlmr\" (UID: \"1c18cf73-84b8-4a06-b942-4ec437124f79\") " pod="openshift-marketplace/redhat-operators-xwlmr"
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.416162 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m65z7\" (UniqueName: \"kubernetes.io/projected/1c18cf73-84b8-4a06-b942-4ec437124f79-kube-api-access-m65z7\") pod \"redhat-operators-xwlmr\" (UID: \"1c18cf73-84b8-4a06-b942-4ec437124f79\") " pod="openshift-marketplace/redhat-operators-xwlmr"
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.416560 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c18cf73-84b8-4a06-b942-4ec437124f79-utilities\") pod \"redhat-operators-xwlmr\" (UID: \"1c18cf73-84b8-4a06-b942-4ec437124f79\") " pod="openshift-marketplace/redhat-operators-xwlmr"
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.416574 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c18cf73-84b8-4a06-b942-4ec437124f79-catalog-content\") pod \"redhat-operators-xwlmr\" (UID: \"1c18cf73-84b8-4a06-b942-4ec437124f79\") " pod="openshift-marketplace/redhat-operators-xwlmr"
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.432716 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m65z7\" (UniqueName: \"kubernetes.io/projected/1c18cf73-84b8-4a06-b942-4ec437124f79-kube-api-access-m65z7\") pod \"redhat-operators-xwlmr\" (UID: \"1c18cf73-84b8-4a06-b942-4ec437124f79\") " pod="openshift-marketplace/redhat-operators-xwlmr"
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.511956 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwlmr"
Apr 06 12:10:36 crc kubenswrapper[4790]: I0406 12:10:36.908965 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xwlmr"]
Apr 06 12:10:36 crc kubenswrapper[4790]: W0406 12:10:36.916689 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c18cf73_84b8_4a06_b942_4ec437124f79.slice/crio-0ab86dba8b7627f743c582362efdb3de48564272b5bc588bcb3c966b33396005 WatchSource:0}: Error finding container 0ab86dba8b7627f743c582362efdb3de48564272b5bc588bcb3c966b33396005: Status 404 returned error can't find the container with id 0ab86dba8b7627f743c582362efdb3de48564272b5bc588bcb3c966b33396005
Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.021113 4790 generic.go:334] "Generic (PLEG): container finished" podID="7781c53a-5217-44ce-9a47-671e951b9c7e" containerID="9242efd38b17b077246c444c8d03b383c69900f340afad74fa8b003c643e3ba6" exitCode=0
Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.021177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rz5v" event={"ID":"7781c53a-5217-44ce-9a47-671e951b9c7e","Type":"ContainerDied","Data":"9242efd38b17b077246c444c8d03b383c69900f340afad74fa8b003c643e3ba6"}
Apr
06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.021204 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rz5v" event={"ID":"7781c53a-5217-44ce-9a47-671e951b9c7e","Type":"ContainerStarted","Data":"6e834f6e7fe6fa4bd024b823ef687544495503871fd60b82b378452777d33172"} Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.023397 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwlmr" event={"ID":"1c18cf73-84b8-4a06-b942-4ec437124f79","Type":"ContainerStarted","Data":"0ab86dba8b7627f743c582362efdb3de48564272b5bc588bcb3c966b33396005"} Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.398859 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xmsvt"] Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.400753 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.412184 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xmsvt"] Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.531987 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6664\" (UniqueName: \"kubernetes.io/projected/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-kube-api-access-l6664\") pod \"redhat-operators-xmsvt\" (UID: \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\") " pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.532108 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-utilities\") pod \"redhat-operators-xmsvt\" (UID: \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\") " pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 
12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.532147 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-catalog-content\") pod \"redhat-operators-xmsvt\" (UID: \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\") " pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.633273 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-utilities\") pod \"redhat-operators-xmsvt\" (UID: \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\") " pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.633556 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-catalog-content\") pod \"redhat-operators-xmsvt\" (UID: \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\") " pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.633598 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6664\" (UniqueName: \"kubernetes.io/projected/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-kube-api-access-l6664\") pod \"redhat-operators-xmsvt\" (UID: \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\") " pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.633967 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-utilities\") pod \"redhat-operators-xmsvt\" (UID: \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\") " pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 
12:10:37.634040 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-catalog-content\") pod \"redhat-operators-xmsvt\" (UID: \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\") " pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.657074 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6664\" (UniqueName: \"kubernetes.io/projected/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-kube-api-access-l6664\") pod \"redhat-operators-xmsvt\" (UID: \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\") " pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:10:37 crc kubenswrapper[4790]: I0406 12:10:37.728139 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.030369 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rz5v" event={"ID":"7781c53a-5217-44ce-9a47-671e951b9c7e","Type":"ContainerStarted","Data":"303701940f565d768185a2992e94f1917d698b3d496f6d3cdf493e0bdea36c0b"} Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.032895 4790 generic.go:334] "Generic (PLEG): container finished" podID="1c18cf73-84b8-4a06-b942-4ec437124f79" containerID="dbd303faf707006341ed389b80db77bd03f54b81b7c523dd0452a929ea600d52" exitCode=0 Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.032937 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwlmr" event={"ID":"1c18cf73-84b8-4a06-b942-4ec437124f79","Type":"ContainerDied","Data":"dbd303faf707006341ed389b80db77bd03f54b81b7c523dd0452a929ea600d52"} Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.161455 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xmsvt"] Apr 06 
12:10:38 crc kubenswrapper[4790]: W0406 12:10:38.173890 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ff0634b_ac6b_4c16_a72f_9bf81b54b1af.slice/crio-0f33157daa5b1e3311a4a2c0d397d9dc90a1d6e5e690a44245fef040eb219d63 WatchSource:0}: Error finding container 0f33157daa5b1e3311a4a2c0d397d9dc90a1d6e5e690a44245fef040eb219d63: Status 404 returned error can't find the container with id 0f33157daa5b1e3311a4a2c0d397d9dc90a1d6e5e690a44245fef040eb219d63 Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.594600 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f4jjg"] Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.597253 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.607502 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f4jjg"] Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.756491 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fd968f-f454-4241-afa5-e47623a01c84-utilities\") pod \"redhat-operators-f4jjg\" (UID: \"a1fd968f-f454-4241-afa5-e47623a01c84\") " pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.756527 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpwrt\" (UniqueName: \"kubernetes.io/projected/a1fd968f-f454-4241-afa5-e47623a01c84-kube-api-access-lpwrt\") pod \"redhat-operators-f4jjg\" (UID: \"a1fd968f-f454-4241-afa5-e47623a01c84\") " pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.756573 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fd968f-f454-4241-afa5-e47623a01c84-catalog-content\") pod \"redhat-operators-f4jjg\" (UID: \"a1fd968f-f454-4241-afa5-e47623a01c84\") " pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.858028 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fd968f-f454-4241-afa5-e47623a01c84-utilities\") pod \"redhat-operators-f4jjg\" (UID: \"a1fd968f-f454-4241-afa5-e47623a01c84\") " pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.858088 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpwrt\" (UniqueName: \"kubernetes.io/projected/a1fd968f-f454-4241-afa5-e47623a01c84-kube-api-access-lpwrt\") pod \"redhat-operators-f4jjg\" (UID: \"a1fd968f-f454-4241-afa5-e47623a01c84\") " pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.858159 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fd968f-f454-4241-afa5-e47623a01c84-catalog-content\") pod \"redhat-operators-f4jjg\" (UID: \"a1fd968f-f454-4241-afa5-e47623a01c84\") " pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.858708 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fd968f-f454-4241-afa5-e47623a01c84-catalog-content\") pod \"redhat-operators-f4jjg\" (UID: \"a1fd968f-f454-4241-afa5-e47623a01c84\") " pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.859452 4790 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fd968f-f454-4241-afa5-e47623a01c84-utilities\") pod \"redhat-operators-f4jjg\" (UID: \"a1fd968f-f454-4241-afa5-e47623a01c84\") " pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.876994 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpwrt\" (UniqueName: \"kubernetes.io/projected/a1fd968f-f454-4241-afa5-e47623a01c84-kube-api-access-lpwrt\") pod \"redhat-operators-f4jjg\" (UID: \"a1fd968f-f454-4241-afa5-e47623a01c84\") " pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:38 crc kubenswrapper[4790]: I0406 12:10:38.915785 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.063786 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwlmr" event={"ID":"1c18cf73-84b8-4a06-b942-4ec437124f79","Type":"ContainerStarted","Data":"931a7718cb4a70296d74716bd14bdd65afaca615ce9d32eb508419902611e765"} Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.067001 4790 generic.go:334] "Generic (PLEG): container finished" podID="4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" containerID="d0bd7e699ebc34948bccd2003a638d96e1f8fffee6b4dc8a8637914b2a0aa227" exitCode=0 Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.067055 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmsvt" event={"ID":"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af","Type":"ContainerDied","Data":"d0bd7e699ebc34948bccd2003a638d96e1f8fffee6b4dc8a8637914b2a0aa227"} Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.067074 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmsvt" 
event={"ID":"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af","Type":"ContainerStarted","Data":"0f33157daa5b1e3311a4a2c0d397d9dc90a1d6e5e690a44245fef040eb219d63"} Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.072747 4790 generic.go:334] "Generic (PLEG): container finished" podID="7781c53a-5217-44ce-9a47-671e951b9c7e" containerID="303701940f565d768185a2992e94f1917d698b3d496f6d3cdf493e0bdea36c0b" exitCode=0 Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.072805 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rz5v" event={"ID":"7781c53a-5217-44ce-9a47-671e951b9c7e","Type":"ContainerDied","Data":"303701940f565d768185a2992e94f1917d698b3d496f6d3cdf493e0bdea36c0b"} Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.271731 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]log ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]etcd excluded: ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]etcd-readiness excluded: ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]informer-sync ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Apr 06 12:10:39 crc kubenswrapper[4790]: 
[+]poststarthook/project.openshift.io-projectcache ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Apr 06 12:10:39 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Apr 06 12:10:39 crc kubenswrapper[4790]: [-]shutdown failed: reason withheld Apr 06 12:10:39 crc kubenswrapper[4790]: readyz check failed Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.271780 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.368311 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f4jjg"] Apr 06 12:10:39 crc kubenswrapper[4790]: W0406 12:10:39.373141 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1fd968f_f454_4241_afa5_e47623a01c84.slice/crio-9e00577f57f18d7b2285859341cb8b8383696f77832db39714919a94c6e34e32 WatchSource:0}: Error finding container 9e00577f57f18d7b2285859341cb8b8383696f77832db39714919a94c6e34e32: Status 404 returned error can't find the container with id 9e00577f57f18d7b2285859341cb8b8383696f77832db39714919a94c6e34e32 Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.753495 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:10:39 crc 
kubenswrapper[4790]: I0406 12:10:39.753817 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.800969 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wgd2w"] Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.820485 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.832422 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wgd2w"] Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.971036 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42617262-5255-4f88-8afa-9d93a640090e-catalog-content\") pod \"redhat-operators-wgd2w\" (UID: \"42617262-5255-4f88-8afa-9d93a640090e\") " pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.971107 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vl28\" (UniqueName: \"kubernetes.io/projected/42617262-5255-4f88-8afa-9d93a640090e-kube-api-access-5vl28\") pod \"redhat-operators-wgd2w\" (UID: \"42617262-5255-4f88-8afa-9d93a640090e\") " pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:10:39 crc kubenswrapper[4790]: I0406 12:10:39.971252 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42617262-5255-4f88-8afa-9d93a640090e-utilities\") 
pod \"redhat-operators-wgd2w\" (UID: \"42617262-5255-4f88-8afa-9d93a640090e\") " pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.072630 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vl28\" (UniqueName: \"kubernetes.io/projected/42617262-5255-4f88-8afa-9d93a640090e-kube-api-access-5vl28\") pod \"redhat-operators-wgd2w\" (UID: \"42617262-5255-4f88-8afa-9d93a640090e\") " pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.072753 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42617262-5255-4f88-8afa-9d93a640090e-utilities\") pod \"redhat-operators-wgd2w\" (UID: \"42617262-5255-4f88-8afa-9d93a640090e\") " pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.072808 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42617262-5255-4f88-8afa-9d93a640090e-catalog-content\") pod \"redhat-operators-wgd2w\" (UID: \"42617262-5255-4f88-8afa-9d93a640090e\") " pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.073362 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42617262-5255-4f88-8afa-9d93a640090e-catalog-content\") pod \"redhat-operators-wgd2w\" (UID: \"42617262-5255-4f88-8afa-9d93a640090e\") " pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.073407 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42617262-5255-4f88-8afa-9d93a640090e-utilities\") pod \"redhat-operators-wgd2w\" (UID: 
\"42617262-5255-4f88-8afa-9d93a640090e\") " pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.081353 4790 generic.go:334] "Generic (PLEG): container finished" podID="1c18cf73-84b8-4a06-b942-4ec437124f79" containerID="931a7718cb4a70296d74716bd14bdd65afaca615ce9d32eb508419902611e765" exitCode=0 Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.081436 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwlmr" event={"ID":"1c18cf73-84b8-4a06-b942-4ec437124f79","Type":"ContainerDied","Data":"931a7718cb4a70296d74716bd14bdd65afaca615ce9d32eb508419902611e765"} Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.084015 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmsvt" event={"ID":"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af","Type":"ContainerStarted","Data":"7f7bd39f243316c503a0c53221edb8b168b6e99f503d937be1400c0cdacac95e"} Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.086623 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rz5v" event={"ID":"7781c53a-5217-44ce-9a47-671e951b9c7e","Type":"ContainerStarted","Data":"7b94f46ce52178d547da3a4758ec37c0ed786bcf1b9c9adbe0866bf7bd51fed1"} Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.089893 4790 generic.go:334] "Generic (PLEG): container finished" podID="a1fd968f-f454-4241-afa5-e47623a01c84" containerID="cfdd6404e53527425721b0bab9e3aa4b2d876ea8b8d45fbf9775f85791c8660f" exitCode=0 Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.089940 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4jjg" event={"ID":"a1fd968f-f454-4241-afa5-e47623a01c84","Type":"ContainerDied","Data":"cfdd6404e53527425721b0bab9e3aa4b2d876ea8b8d45fbf9775f85791c8660f"} Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.089964 4790 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-f4jjg" event={"ID":"a1fd968f-f454-4241-afa5-e47623a01c84","Type":"ContainerStarted","Data":"9e00577f57f18d7b2285859341cb8b8383696f77832db39714919a94c6e34e32"} Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.102756 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vl28\" (UniqueName: \"kubernetes.io/projected/42617262-5255-4f88-8afa-9d93a640090e-kube-api-access-5vl28\") pod \"redhat-operators-wgd2w\" (UID: \"42617262-5255-4f88-8afa-9d93a640090e\") " pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.126117 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6rz5v" podStartSLOduration=2.677568004 podStartE2EDuration="5.126101316s" podCreationTimestamp="2026-04-06 12:10:35 +0000 UTC" firstStartedPulling="2026-04-06 12:10:37.022808395 +0000 UTC m=+816.010551271" lastFinishedPulling="2026-04-06 12:10:39.471341717 +0000 UTC m=+818.459084583" observedRunningTime="2026-04-06 12:10:40.125210112 +0000 UTC m=+819.112952978" watchObservedRunningTime="2026-04-06 12:10:40.126101316 +0000 UTC m=+819.113844182" Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.161986 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:10:40 crc kubenswrapper[4790]: I0406 12:10:40.395476 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wgd2w"] Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.096604 4790 generic.go:334] "Generic (PLEG): container finished" podID="42617262-5255-4f88-8afa-9d93a640090e" containerID="f720b7549895e95d1e70544146d1972bcf0008de33562c766ca0656c1a1ce270" exitCode=0 Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.096673 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgd2w" event={"ID":"42617262-5255-4f88-8afa-9d93a640090e","Type":"ContainerDied","Data":"f720b7549895e95d1e70544146d1972bcf0008de33562c766ca0656c1a1ce270"} Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.096961 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgd2w" event={"ID":"42617262-5255-4f88-8afa-9d93a640090e","Type":"ContainerStarted","Data":"52455297e83a5f3775ab0b0e267496bcee17a7de69fb9a36004bb4abbb4108f2"} Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.102092 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwlmr" event={"ID":"1c18cf73-84b8-4a06-b942-4ec437124f79","Type":"ContainerStarted","Data":"529c90d0704862c17b4779e503083b35c69e1271229c641b3ec3995adbcaaf14"} Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.104074 4790 generic.go:334] "Generic (PLEG): container finished" podID="4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" containerID="7f7bd39f243316c503a0c53221edb8b168b6e99f503d937be1400c0cdacac95e" exitCode=0 Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.104719 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmsvt" 
event={"ID":"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af","Type":"ContainerDied","Data":"7f7bd39f243316c503a0c53221edb8b168b6e99f503d937be1400c0cdacac95e"} Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.142641 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xwlmr" podStartSLOduration=2.673990506 podStartE2EDuration="5.14262221s" podCreationTimestamp="2026-04-06 12:10:36 +0000 UTC" firstStartedPulling="2026-04-06 12:10:38.03455625 +0000 UTC m=+817.022299116" lastFinishedPulling="2026-04-06 12:10:40.503187954 +0000 UTC m=+819.490930820" observedRunningTime="2026-04-06 12:10:41.13446876 +0000 UTC m=+820.122211666" watchObservedRunningTime="2026-04-06 12:10:41.14262221 +0000 UTC m=+820.130365086" Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.600131 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fxrr2"] Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.601516 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.613256 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxrr2"] Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.695416 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de0f178-000e-4a30-84bc-c1f33e965d0a-utilities\") pod \"redhat-operators-fxrr2\" (UID: \"8de0f178-000e-4a30-84bc-c1f33e965d0a\") " pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.695660 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx6xv\" (UniqueName: \"kubernetes.io/projected/8de0f178-000e-4a30-84bc-c1f33e965d0a-kube-api-access-rx6xv\") pod \"redhat-operators-fxrr2\" (UID: \"8de0f178-000e-4a30-84bc-c1f33e965d0a\") " pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.695777 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de0f178-000e-4a30-84bc-c1f33e965d0a-catalog-content\") pod \"redhat-operators-fxrr2\" (UID: \"8de0f178-000e-4a30-84bc-c1f33e965d0a\") " pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.801434 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de0f178-000e-4a30-84bc-c1f33e965d0a-utilities\") pod \"redhat-operators-fxrr2\" (UID: \"8de0f178-000e-4a30-84bc-c1f33e965d0a\") " pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.801499 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rx6xv\" (UniqueName: \"kubernetes.io/projected/8de0f178-000e-4a30-84bc-c1f33e965d0a-kube-api-access-rx6xv\") pod \"redhat-operators-fxrr2\" (UID: \"8de0f178-000e-4a30-84bc-c1f33e965d0a\") " pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.801535 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de0f178-000e-4a30-84bc-c1f33e965d0a-catalog-content\") pod \"redhat-operators-fxrr2\" (UID: \"8de0f178-000e-4a30-84bc-c1f33e965d0a\") " pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.801950 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de0f178-000e-4a30-84bc-c1f33e965d0a-catalog-content\") pod \"redhat-operators-fxrr2\" (UID: \"8de0f178-000e-4a30-84bc-c1f33e965d0a\") " pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.802290 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de0f178-000e-4a30-84bc-c1f33e965d0a-utilities\") pod \"redhat-operators-fxrr2\" (UID: \"8de0f178-000e-4a30-84bc-c1f33e965d0a\") " pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.832662 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx6xv\" (UniqueName: \"kubernetes.io/projected/8de0f178-000e-4a30-84bc-c1f33e965d0a-kube-api-access-rx6xv\") pod \"redhat-operators-fxrr2\" (UID: \"8de0f178-000e-4a30-84bc-c1f33e965d0a\") " pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:10:41 crc kubenswrapper[4790]: I0406 12:10:41.930242 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:10:42 crc kubenswrapper[4790]: I0406 12:10:42.126163 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmsvt" event={"ID":"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af","Type":"ContainerStarted","Data":"aa862a4532cc1995501c17164ef8be74ee8e467212c363b9201b74ad77f2c941"} Apr 06 12:10:42 crc kubenswrapper[4790]: I0406 12:10:42.133962 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgd2w" event={"ID":"42617262-5255-4f88-8afa-9d93a640090e","Type":"ContainerStarted","Data":"dd6d9624d08cf9e4b71e379501eb980d6b5eae20b3ab1465244989e466c7387f"} Apr 06 12:10:42 crc kubenswrapper[4790]: I0406 12:10:42.136717 4790 generic.go:334] "Generic (PLEG): container finished" podID="a1fd968f-f454-4241-afa5-e47623a01c84" containerID="8b422088fa7a70fbb8cfff5df3f5c0d80243ce4c3e0ce85d9048f0496a571d75" exitCode=0 Apr 06 12:10:42 crc kubenswrapper[4790]: I0406 12:10:42.137613 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4jjg" event={"ID":"a1fd968f-f454-4241-afa5-e47623a01c84","Type":"ContainerDied","Data":"8b422088fa7a70fbb8cfff5df3f5c0d80243ce4c3e0ce85d9048f0496a571d75"} Apr 06 12:10:42 crc kubenswrapper[4790]: I0406 12:10:42.147319 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xmsvt" podStartSLOduration=2.763126367 podStartE2EDuration="5.147304175s" podCreationTimestamp="2026-04-06 12:10:37 +0000 UTC" firstStartedPulling="2026-04-06 12:10:39.068300919 +0000 UTC m=+818.056043785" lastFinishedPulling="2026-04-06 12:10:41.452478727 +0000 UTC m=+820.440221593" observedRunningTime="2026-04-06 12:10:42.145145367 +0000 UTC m=+821.132888233" watchObservedRunningTime="2026-04-06 12:10:42.147304175 +0000 UTC m=+821.135047041" Apr 06 12:10:42 crc kubenswrapper[4790]: I0406 12:10:42.198821 4790 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxrr2"] Apr 06 12:10:43 crc kubenswrapper[4790]: I0406 12:10:43.145739 4790 generic.go:334] "Generic (PLEG): container finished" podID="42617262-5255-4f88-8afa-9d93a640090e" containerID="dd6d9624d08cf9e4b71e379501eb980d6b5eae20b3ab1465244989e466c7387f" exitCode=0 Apr 06 12:10:43 crc kubenswrapper[4790]: I0406 12:10:43.145846 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgd2w" event={"ID":"42617262-5255-4f88-8afa-9d93a640090e","Type":"ContainerDied","Data":"dd6d9624d08cf9e4b71e379501eb980d6b5eae20b3ab1465244989e466c7387f"} Apr 06 12:10:43 crc kubenswrapper[4790]: I0406 12:10:43.150246 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4jjg" event={"ID":"a1fd968f-f454-4241-afa5-e47623a01c84","Type":"ContainerStarted","Data":"ed76fa8292a4bfd378bd64b7ab5e25a405d3a61ea7279296bc6831f0cd8d8f85"} Apr 06 12:10:43 crc kubenswrapper[4790]: I0406 12:10:43.153057 4790 generic.go:334] "Generic (PLEG): container finished" podID="8de0f178-000e-4a30-84bc-c1f33e965d0a" containerID="c940a86829b63ebced7196fbe7153622f52318f5cf928aa187bf379e4f020799" exitCode=0 Apr 06 12:10:43 crc kubenswrapper[4790]: I0406 12:10:43.153131 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxrr2" event={"ID":"8de0f178-000e-4a30-84bc-c1f33e965d0a","Type":"ContainerDied","Data":"c940a86829b63ebced7196fbe7153622f52318f5cf928aa187bf379e4f020799"} Apr 06 12:10:43 crc kubenswrapper[4790]: I0406 12:10:43.153188 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxrr2" event={"ID":"8de0f178-000e-4a30-84bc-c1f33e965d0a","Type":"ContainerStarted","Data":"66e39c64c1fbdecd39f940a17ea09357c6739fee18a0b6387fff9356ff3a2e56"} Apr 06 12:10:43 crc kubenswrapper[4790]: I0406 12:10:43.224856 4790 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f4jjg" podStartSLOduration=2.33968246 podStartE2EDuration="5.224839924s" podCreationTimestamp="2026-04-06 12:10:38 +0000 UTC" firstStartedPulling="2026-04-06 12:10:40.090960839 +0000 UTC m=+819.078703705" lastFinishedPulling="2026-04-06 12:10:42.976118303 +0000 UTC m=+821.963861169" observedRunningTime="2026-04-06 12:10:43.223434166 +0000 UTC m=+822.211177032" watchObservedRunningTime="2026-04-06 12:10:43.224839924 +0000 UTC m=+822.212582790" Apr 06 12:10:43 crc kubenswrapper[4790]: I0406 12:10:43.798775 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kg8bf"] Apr 06 12:10:43 crc kubenswrapper[4790]: I0406 12:10:43.800976 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:10:43 crc kubenswrapper[4790]: I0406 12:10:43.815472 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kg8bf"] Apr 06 12:10:43 crc kubenswrapper[4790]: I0406 12:10:43.934014 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsv4r\" (UniqueName: \"kubernetes.io/projected/1a9985d3-5261-42f1-9db8-30fd5b560d36-kube-api-access-tsv4r\") pod \"community-operators-kg8bf\" (UID: \"1a9985d3-5261-42f1-9db8-30fd5b560d36\") " pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:10:43 crc kubenswrapper[4790]: I0406 12:10:43.934305 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9985d3-5261-42f1-9db8-30fd5b560d36-utilities\") pod \"community-operators-kg8bf\" (UID: \"1a9985d3-5261-42f1-9db8-30fd5b560d36\") " pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:10:43 crc kubenswrapper[4790]: I0406 12:10:43.934492 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9985d3-5261-42f1-9db8-30fd5b560d36-catalog-content\") pod \"community-operators-kg8bf\" (UID: \"1a9985d3-5261-42f1-9db8-30fd5b560d36\") " pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:10:44 crc kubenswrapper[4790]: I0406 12:10:44.035633 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsv4r\" (UniqueName: \"kubernetes.io/projected/1a9985d3-5261-42f1-9db8-30fd5b560d36-kube-api-access-tsv4r\") pod \"community-operators-kg8bf\" (UID: \"1a9985d3-5261-42f1-9db8-30fd5b560d36\") " pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:10:44 crc kubenswrapper[4790]: I0406 12:10:44.036083 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9985d3-5261-42f1-9db8-30fd5b560d36-utilities\") pod \"community-operators-kg8bf\" (UID: \"1a9985d3-5261-42f1-9db8-30fd5b560d36\") " pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:10:44 crc kubenswrapper[4790]: I0406 12:10:44.036245 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9985d3-5261-42f1-9db8-30fd5b560d36-catalog-content\") pod \"community-operators-kg8bf\" (UID: \"1a9985d3-5261-42f1-9db8-30fd5b560d36\") " pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:10:44 crc kubenswrapper[4790]: I0406 12:10:44.036767 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9985d3-5261-42f1-9db8-30fd5b560d36-catalog-content\") pod \"community-operators-kg8bf\" (UID: \"1a9985d3-5261-42f1-9db8-30fd5b560d36\") " pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:10:44 crc kubenswrapper[4790]: I0406 12:10:44.037223 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9985d3-5261-42f1-9db8-30fd5b560d36-utilities\") pod \"community-operators-kg8bf\" (UID: \"1a9985d3-5261-42f1-9db8-30fd5b560d36\") " pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:10:44 crc kubenswrapper[4790]: I0406 12:10:44.063724 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsv4r\" (UniqueName: \"kubernetes.io/projected/1a9985d3-5261-42f1-9db8-30fd5b560d36-kube-api-access-tsv4r\") pod \"community-operators-kg8bf\" (UID: \"1a9985d3-5261-42f1-9db8-30fd5b560d36\") " pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:10:44 crc kubenswrapper[4790]: I0406 12:10:44.119085 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:10:44 crc kubenswrapper[4790]: I0406 12:10:44.164472 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgd2w" event={"ID":"42617262-5255-4f88-8afa-9d93a640090e","Type":"ContainerStarted","Data":"f39af18abc79914046ebf247535ef1eca86e1f9c512f4a348b434ae9a5d6995c"} Apr 06 12:10:44 crc kubenswrapper[4790]: I0406 12:10:44.192360 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wgd2w" podStartSLOduration=2.750950598 podStartE2EDuration="5.192339818s" podCreationTimestamp="2026-04-06 12:10:39 +0000 UTC" firstStartedPulling="2026-04-06 12:10:41.098114451 +0000 UTC m=+820.085857317" lastFinishedPulling="2026-04-06 12:10:43.539503671 +0000 UTC m=+822.527246537" observedRunningTime="2026-04-06 12:10:44.188674229 +0000 UTC m=+823.176417115" watchObservedRunningTime="2026-04-06 12:10:44.192339818 +0000 UTC m=+823.180082684" Apr 06 12:10:44 crc kubenswrapper[4790]: I0406 12:10:44.278077 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver 
namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]log ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]etcd excluded: ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]etcd-readiness excluded: ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]informer-sync ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Apr 06 12:10:44 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Apr 06 12:10:44 crc kubenswrapper[4790]: [-]shutdown failed: reason withheld Apr 06 12:10:44 crc kubenswrapper[4790]: readyz check failed Apr 06 12:10:44 crc kubenswrapper[4790]: I0406 12:10:44.278455 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Apr 06 12:10:44 crc kubenswrapper[4790]: I0406 12:10:44.633773 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kg8bf"] Apr 06 12:10:44 crc kubenswrapper[4790]: W0406 12:10:44.641743 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a9985d3_5261_42f1_9db8_30fd5b560d36.slice/crio-277b6635b4130ba8e73d546bc4ca84d0920a25205b6109d5e5a329d6123c2bf4 WatchSource:0}: Error finding container 277b6635b4130ba8e73d546bc4ca84d0920a25205b6109d5e5a329d6123c2bf4: Status 404 returned error can't find the container with id 277b6635b4130ba8e73d546bc4ca84d0920a25205b6109d5e5a329d6123c2bf4 Apr 06 12:10:45 crc kubenswrapper[4790]: I0406 12:10:45.171101 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxrr2" event={"ID":"8de0f178-000e-4a30-84bc-c1f33e965d0a","Type":"ContainerStarted","Data":"e8bdf3daef6afbc5bef1131711696c9ba59c2e54d0f13d69da3886905c7ee9a1"} Apr 06 12:10:45 crc kubenswrapper[4790]: I0406 12:10:45.172534 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kg8bf" event={"ID":"1a9985d3-5261-42f1-9db8-30fd5b560d36","Type":"ContainerStarted","Data":"ce1be94afdbfbe8d1b03c4c141a5c9bde276174c78ecea964eda5ec37736f3a5"} Apr 06 12:10:45 crc kubenswrapper[4790]: I0406 12:10:45.172575 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kg8bf" event={"ID":"1a9985d3-5261-42f1-9db8-30fd5b560d36","Type":"ContainerStarted","Data":"277b6635b4130ba8e73d546bc4ca84d0920a25205b6109d5e5a329d6123c2bf4"} Apr 06 12:10:45 crc kubenswrapper[4790]: I0406 12:10:45.391373 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6jr6p"] Apr 06 12:10:45 crc kubenswrapper[4790]: I0406 12:10:45.391599 4790 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/certified-operators-6jr6p" podUID="4efd618c-f882-4443-a843-cc58798d24c6" containerName="registry-server" containerID="cri-o://3660bc4afcc3696de09bf4b9315bc8d0dbde8866be71a5cc8eeaeb302368a319" gracePeriod=2 Apr 06 12:10:45 crc kubenswrapper[4790]: I0406 12:10:45.955795 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6rz5v" Apr 06 12:10:45 crc kubenswrapper[4790]: I0406 12:10:45.955870 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6rz5v" Apr 06 12:10:46 crc kubenswrapper[4790]: I0406 12:10:46.512871 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xwlmr" Apr 06 12:10:46 crc kubenswrapper[4790]: I0406 12:10:46.512939 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xwlmr" Apr 06 12:10:47 crc kubenswrapper[4790]: I0406 12:10:47.003276 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6rz5v" podUID="7781c53a-5217-44ce-9a47-671e951b9c7e" containerName="registry-server" probeResult="failure" output=< Apr 06 12:10:47 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 12:10:47 crc kubenswrapper[4790]: > Apr 06 12:10:47 crc kubenswrapper[4790]: I0406 12:10:47.187231 4790 generic.go:334] "Generic (PLEG): container finished" podID="1a9985d3-5261-42f1-9db8-30fd5b560d36" containerID="ce1be94afdbfbe8d1b03c4c141a5c9bde276174c78ecea964eda5ec37736f3a5" exitCode=0 Apr 06 12:10:47 crc kubenswrapper[4790]: I0406 12:10:47.187278 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kg8bf" event={"ID":"1a9985d3-5261-42f1-9db8-30fd5b560d36","Type":"ContainerDied","Data":"ce1be94afdbfbe8d1b03c4c141a5c9bde276174c78ecea964eda5ec37736f3a5"} Apr 06 
12:10:47 crc kubenswrapper[4790]: I0406 12:10:47.553974 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xwlmr" podUID="1c18cf73-84b8-4a06-b942-4ec437124f79" containerName="registry-server" probeResult="failure" output=< Apr 06 12:10:47 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 12:10:47 crc kubenswrapper[4790]: > Apr 06 12:10:47 crc kubenswrapper[4790]: I0406 12:10:47.729375 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:10:47 crc kubenswrapper[4790]: I0406 12:10:47.729422 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:10:48 crc kubenswrapper[4790]: I0406 12:10:48.195145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kg8bf" event={"ID":"1a9985d3-5261-42f1-9db8-30fd5b560d36","Type":"ContainerStarted","Data":"2351d42bea4af5ea2f26e91ad01e35e1213a498fb9755a1fb65daf6e7b2dd4cd"} Apr 06 12:10:48 crc kubenswrapper[4790]: I0406 12:10:48.768084 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xmsvt" podUID="4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" containerName="registry-server" probeResult="failure" output=< Apr 06 12:10:48 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 12:10:48 crc kubenswrapper[4790]: > Apr 06 12:10:48 crc kubenswrapper[4790]: I0406 12:10:48.916795 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:48 crc kubenswrapper[4790]: I0406 12:10:48.916900 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:49 crc kubenswrapper[4790]: I0406 12:10:49.188126 4790 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-n7gs7"] Apr 06 12:10:49 crc kubenswrapper[4790]: I0406 12:10:49.188613 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n7gs7" podUID="2147c3d0-ea51-4143-a242-5cded03e7aa8" containerName="registry-server" containerID="cri-o://c4e20f5c8f56f0ff852d036487346ea149f46091693034720b9e6c7d67f4e90e" gracePeriod=2 Apr 06 12:10:49 crc kubenswrapper[4790]: I0406 12:10:49.203249 4790 generic.go:334] "Generic (PLEG): container finished" podID="8de0f178-000e-4a30-84bc-c1f33e965d0a" containerID="e8bdf3daef6afbc5bef1131711696c9ba59c2e54d0f13d69da3886905c7ee9a1" exitCode=0 Apr 06 12:10:49 crc kubenswrapper[4790]: I0406 12:10:49.203425 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxrr2" event={"ID":"8de0f178-000e-4a30-84bc-c1f33e965d0a","Type":"ContainerDied","Data":"e8bdf3daef6afbc5bef1131711696c9ba59c2e54d0f13d69da3886905c7ee9a1"} Apr 06 12:10:49 crc kubenswrapper[4790]: I0406 12:10:49.272600 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]log ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]etcd excluded: ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]etcd-readiness excluded: ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]informer-sync ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Apr 06 12:10:49 crc kubenswrapper[4790]: 
[+]poststarthook/image.openshift.io-apiserver-caches ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-bootstrapclusterroles ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Apr 06 12:10:49 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Apr 06 12:10:49 crc kubenswrapper[4790]: [-]shutdown failed: reason withheld Apr 06 12:10:49 crc kubenswrapper[4790]: readyz check failed Apr 06 12:10:49 crc kubenswrapper[4790]: I0406 12:10:49.273499 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 06 12:10:49 crc kubenswrapper[4790]: I0406 12:10:49.957577 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f4jjg" podUID="a1fd968f-f454-4241-afa5-e47623a01c84" containerName="registry-server" probeResult="failure" output=< Apr 06 12:10:49 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 12:10:49 crc kubenswrapper[4790]: > Apr 06 12:10:50 crc kubenswrapper[4790]: I0406 12:10:50.162490 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:10:50 crc kubenswrapper[4790]: I0406 12:10:50.162772 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:10:51 crc kubenswrapper[4790]: E0406 12:10:51.141045 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3660bc4afcc3696de09bf4b9315bc8d0dbde8866be71a5cc8eeaeb302368a319 is running failed: container process not found" containerID="3660bc4afcc3696de09bf4b9315bc8d0dbde8866be71a5cc8eeaeb302368a319" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:10:51 crc kubenswrapper[4790]: E0406 12:10:51.141909 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3660bc4afcc3696de09bf4b9315bc8d0dbde8866be71a5cc8eeaeb302368a319 is running failed: container process not found" containerID="3660bc4afcc3696de09bf4b9315bc8d0dbde8866be71a5cc8eeaeb302368a319" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:10:51 crc kubenswrapper[4790]: E0406 12:10:51.142491 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3660bc4afcc3696de09bf4b9315bc8d0dbde8866be71a5cc8eeaeb302368a319 is running failed: container process not found" containerID="3660bc4afcc3696de09bf4b9315bc8d0dbde8866be71a5cc8eeaeb302368a319" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:10:51 crc kubenswrapper[4790]: E0406 12:10:51.142589 4790 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3660bc4afcc3696de09bf4b9315bc8d0dbde8866be71a5cc8eeaeb302368a319 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-6jr6p" podUID="4efd618c-f882-4443-a843-cc58798d24c6" containerName="registry-server" Apr 06 12:10:51 crc kubenswrapper[4790]: I0406 12:10:51.204981 4790 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-wgd2w" podUID="42617262-5255-4f88-8afa-9d93a640090e" containerName="registry-server" probeResult="failure" output=< Apr 06 12:10:51 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 12:10:51 crc kubenswrapper[4790]: > Apr 06 12:10:51 crc kubenswrapper[4790]: I0406 12:10:51.320544 4790 generic.go:334] "Generic (PLEG): container finished" podID="4efd618c-f882-4443-a843-cc58798d24c6" containerID="3660bc4afcc3696de09bf4b9315bc8d0dbde8866be71a5cc8eeaeb302368a319" exitCode=0 Apr 06 12:10:51 crc kubenswrapper[4790]: I0406 12:10:51.320593 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jr6p" event={"ID":"4efd618c-f882-4443-a843-cc58798d24c6","Type":"ContainerDied","Data":"3660bc4afcc3696de09bf4b9315bc8d0dbde8866be71a5cc8eeaeb302368a319"} Apr 06 12:10:51 crc kubenswrapper[4790]: I0406 12:10:51.760565 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6jr6p" Apr 06 12:10:51 crc kubenswrapper[4790]: I0406 12:10:51.938918 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tffv2\" (UniqueName: \"kubernetes.io/projected/4efd618c-f882-4443-a843-cc58798d24c6-kube-api-access-tffv2\") pod \"4efd618c-f882-4443-a843-cc58798d24c6\" (UID: \"4efd618c-f882-4443-a843-cc58798d24c6\") " Apr 06 12:10:51 crc kubenswrapper[4790]: I0406 12:10:51.939565 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efd618c-f882-4443-a843-cc58798d24c6-catalog-content\") pod \"4efd618c-f882-4443-a843-cc58798d24c6\" (UID: \"4efd618c-f882-4443-a843-cc58798d24c6\") " Apr 06 12:10:51 crc kubenswrapper[4790]: I0406 12:10:51.939636 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4efd618c-f882-4443-a843-cc58798d24c6-utilities\") pod \"4efd618c-f882-4443-a843-cc58798d24c6\" (UID: \"4efd618c-f882-4443-a843-cc58798d24c6\") " Apr 06 12:10:51 crc kubenswrapper[4790]: I0406 12:10:51.941125 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4efd618c-f882-4443-a843-cc58798d24c6-utilities" (OuterVolumeSpecName: "utilities") pod "4efd618c-f882-4443-a843-cc58798d24c6" (UID: "4efd618c-f882-4443-a843-cc58798d24c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:10:51 crc kubenswrapper[4790]: I0406 12:10:51.950242 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4efd618c-f882-4443-a843-cc58798d24c6-kube-api-access-tffv2" (OuterVolumeSpecName: "kube-api-access-tffv2") pod "4efd618c-f882-4443-a843-cc58798d24c6" (UID: "4efd618c-f882-4443-a843-cc58798d24c6"). InnerVolumeSpecName "kube-api-access-tffv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:10:52 crc kubenswrapper[4790]: I0406 12:10:52.022930 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4efd618c-f882-4443-a843-cc58798d24c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4efd618c-f882-4443-a843-cc58798d24c6" (UID: "4efd618c-f882-4443-a843-cc58798d24c6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:10:52 crc kubenswrapper[4790]: I0406 12:10:52.041204 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4efd618c-f882-4443-a843-cc58798d24c6-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:10:52 crc kubenswrapper[4790]: I0406 12:10:52.041238 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tffv2\" (UniqueName: \"kubernetes.io/projected/4efd618c-f882-4443-a843-cc58798d24c6-kube-api-access-tffv2\") on node \"crc\" DevicePath \"\"" Apr 06 12:10:52 crc kubenswrapper[4790]: I0406 12:10:52.041252 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efd618c-f882-4443-a843-cc58798d24c6-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:10:52 crc kubenswrapper[4790]: I0406 12:10:52.337462 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6jr6p" event={"ID":"4efd618c-f882-4443-a843-cc58798d24c6","Type":"ContainerDied","Data":"001c83959fc96953231bccb66cc982e521ee18749b478e1556b58039834bca79"} Apr 06 12:10:52 crc kubenswrapper[4790]: I0406 12:10:52.337517 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6jr6p" Apr 06 12:10:52 crc kubenswrapper[4790]: I0406 12:10:52.337535 4790 scope.go:117] "RemoveContainer" containerID="3660bc4afcc3696de09bf4b9315bc8d0dbde8866be71a5cc8eeaeb302368a319" Apr 06 12:10:52 crc kubenswrapper[4790]: I0406 12:10:52.342697 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxrr2" event={"ID":"8de0f178-000e-4a30-84bc-c1f33e965d0a","Type":"ContainerStarted","Data":"a749a76de5f00f48cc87bfa94565b7aa2a10389e87d1bebe3396abf5645f07c2"} Apr 06 12:10:52 crc kubenswrapper[4790]: I0406 12:10:52.346328 4790 generic.go:334] "Generic (PLEG): container finished" podID="1a9985d3-5261-42f1-9db8-30fd5b560d36" containerID="2351d42bea4af5ea2f26e91ad01e35e1213a498fb9755a1fb65daf6e7b2dd4cd" exitCode=0 Apr 06 12:10:52 crc kubenswrapper[4790]: I0406 12:10:52.346391 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kg8bf" event={"ID":"1a9985d3-5261-42f1-9db8-30fd5b560d36","Type":"ContainerDied","Data":"2351d42bea4af5ea2f26e91ad01e35e1213a498fb9755a1fb65daf6e7b2dd4cd"} Apr 06 12:10:52 crc kubenswrapper[4790]: I0406 12:10:52.369179 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fxrr2" podStartSLOduration=2.787648327 podStartE2EDuration="11.369160837s" podCreationTimestamp="2026-04-06 12:10:41 +0000 UTC" firstStartedPulling="2026-04-06 12:10:43.155003173 +0000 UTC m=+822.142746029" lastFinishedPulling="2026-04-06 12:10:51.736515673 +0000 UTC m=+830.724258539" observedRunningTime="2026-04-06 12:10:52.369056674 +0000 UTC m=+831.356799580" watchObservedRunningTime="2026-04-06 12:10:52.369160837 +0000 UTC m=+831.356903693" Apr 06 12:10:52 crc kubenswrapper[4790]: I0406 12:10:52.417299 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6jr6p"] Apr 06 12:10:52 crc kubenswrapper[4790]: 
I0406 12:10:52.422478 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6jr6p"] Apr 06 12:10:53 crc kubenswrapper[4790]: E0406 12:10:53.288644 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c4e20f5c8f56f0ff852d036487346ea149f46091693034720b9e6c7d67f4e90e is running failed: container process not found" containerID="c4e20f5c8f56f0ff852d036487346ea149f46091693034720b9e6c7d67f4e90e" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:10:53 crc kubenswrapper[4790]: E0406 12:10:53.289520 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c4e20f5c8f56f0ff852d036487346ea149f46091693034720b9e6c7d67f4e90e is running failed: container process not found" containerID="c4e20f5c8f56f0ff852d036487346ea149f46091693034720b9e6c7d67f4e90e" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:10:53 crc kubenswrapper[4790]: E0406 12:10:53.289945 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c4e20f5c8f56f0ff852d036487346ea149f46091693034720b9e6c7d67f4e90e is running failed: container process not found" containerID="c4e20f5c8f56f0ff852d036487346ea149f46091693034720b9e6c7d67f4e90e" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:10:53 crc kubenswrapper[4790]: E0406 12:10:53.289977 4790 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c4e20f5c8f56f0ff852d036487346ea149f46091693034720b9e6c7d67f4e90e is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-n7gs7" podUID="2147c3d0-ea51-4143-a242-5cded03e7aa8" containerName="registry-server" Apr 06 12:10:53 crc kubenswrapper[4790]: I0406 
12:10:53.684017 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4efd618c-f882-4443-a843-cc58798d24c6" path="/var/lib/kubelet/pods/4efd618c-f882-4443-a843-cc58798d24c6/volumes" Apr 06 12:10:54 crc kubenswrapper[4790]: I0406 12:10:54.267381 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Apr 06 12:10:54 crc kubenswrapper[4790]: I0406 12:10:54.267720 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" Apr 06 12:10:56 crc kubenswrapper[4790]: I0406 12:10:56.003317 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6rz5v" Apr 06 12:10:56 crc kubenswrapper[4790]: I0406 12:10:56.045291 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6rz5v" Apr 06 12:10:56 crc kubenswrapper[4790]: I0406 12:10:56.076181 4790 generic.go:334] "Generic (PLEG): container finished" podID="2147c3d0-ea51-4143-a242-5cded03e7aa8" containerID="c4e20f5c8f56f0ff852d036487346ea149f46091693034720b9e6c7d67f4e90e" exitCode=0 Apr 06 12:10:56 crc kubenswrapper[4790]: I0406 12:10:56.076224 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7gs7" event={"ID":"2147c3d0-ea51-4143-a242-5cded03e7aa8","Type":"ContainerDied","Data":"c4e20f5c8f56f0ff852d036487346ea149f46091693034720b9e6c7d67f4e90e"} Apr 06 12:10:56 crc kubenswrapper[4790]: I0406 
12:10:56.558426 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xwlmr" Apr 06 12:10:56 crc kubenswrapper[4790]: I0406 12:10:56.603398 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xwlmr" Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.029145 4790 scope.go:117] "RemoveContainer" containerID="681f817bfcee797dbfc0f0e70c95cc39f8cc408739b7cc846cd0fcaad38fecd0" Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.083589 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7gs7" event={"ID":"2147c3d0-ea51-4143-a242-5cded03e7aa8","Type":"ContainerDied","Data":"a4ea45729ed5350f6800404b25befb9b9a30443cd0ace9539c9fdcd737698f05"} Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.083626 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4ea45729ed5350f6800404b25befb9b9a30443cd0ace9539c9fdcd737698f05" Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.093616 4790 scope.go:117] "RemoveContainer" containerID="a30dd512339409121cb4760f44ad703caa01260ec380f9b29b4034e3cc0f5e5e" Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.102269 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7gs7" Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.209619 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95mdx\" (UniqueName: \"kubernetes.io/projected/2147c3d0-ea51-4143-a242-5cded03e7aa8-kube-api-access-95mdx\") pod \"2147c3d0-ea51-4143-a242-5cded03e7aa8\" (UID: \"2147c3d0-ea51-4143-a242-5cded03e7aa8\") " Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.209667 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2147c3d0-ea51-4143-a242-5cded03e7aa8-utilities\") pod \"2147c3d0-ea51-4143-a242-5cded03e7aa8\" (UID: \"2147c3d0-ea51-4143-a242-5cded03e7aa8\") " Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.209710 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2147c3d0-ea51-4143-a242-5cded03e7aa8-catalog-content\") pod \"2147c3d0-ea51-4143-a242-5cded03e7aa8\" (UID: \"2147c3d0-ea51-4143-a242-5cded03e7aa8\") " Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.212677 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2147c3d0-ea51-4143-a242-5cded03e7aa8-utilities" (OuterVolumeSpecName: "utilities") pod "2147c3d0-ea51-4143-a242-5cded03e7aa8" (UID: "2147c3d0-ea51-4143-a242-5cded03e7aa8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.217080 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2147c3d0-ea51-4143-a242-5cded03e7aa8-kube-api-access-95mdx" (OuterVolumeSpecName: "kube-api-access-95mdx") pod "2147c3d0-ea51-4143-a242-5cded03e7aa8" (UID: "2147c3d0-ea51-4143-a242-5cded03e7aa8"). InnerVolumeSpecName "kube-api-access-95mdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.242993 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2147c3d0-ea51-4143-a242-5cded03e7aa8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2147c3d0-ea51-4143-a242-5cded03e7aa8" (UID: "2147c3d0-ea51-4143-a242-5cded03e7aa8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.310979 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2147c3d0-ea51-4143-a242-5cded03e7aa8-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.311027 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95mdx\" (UniqueName: \"kubernetes.io/projected/2147c3d0-ea51-4143-a242-5cded03e7aa8-kube-api-access-95mdx\") on node \"crc\" DevicePath \"\"" Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.311042 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2147c3d0-ea51-4143-a242-5cded03e7aa8-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.778162 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:10:57 crc kubenswrapper[4790]: I0406 12:10:57.819858 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.093604 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kg8bf" 
event={"ID":"1a9985d3-5261-42f1-9db8-30fd5b560d36","Type":"ContainerStarted","Data":"5d42f9dec71478bce4430d5443ff85c6d1f88e6579dd23369a4e1785e7048415"} Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.095700 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7gs7" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.112202 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kg8bf" podStartSLOduration=4.956177274 podStartE2EDuration="15.112181741s" podCreationTimestamp="2026-04-06 12:10:43 +0000 UTC" firstStartedPulling="2026-04-06 12:10:47.18876953 +0000 UTC m=+826.176512396" lastFinishedPulling="2026-04-06 12:10:57.344773987 +0000 UTC m=+836.332516863" observedRunningTime="2026-04-06 12:10:58.111312678 +0000 UTC m=+837.099055564" watchObservedRunningTime="2026-04-06 12:10:58.112181741 +0000 UTC m=+837.099924607" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.128513 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7gs7"] Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.135169 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7gs7"] Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.396008 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m8wfd"] Apr 06 12:10:58 crc kubenswrapper[4790]: E0406 12:10:58.396479 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efd618c-f882-4443-a843-cc58798d24c6" containerName="extract-utilities" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.396598 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efd618c-f882-4443-a843-cc58798d24c6" containerName="extract-utilities" Apr 06 12:10:58 crc kubenswrapper[4790]: E0406 12:10:58.396681 4790 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4efd618c-f882-4443-a843-cc58798d24c6" containerName="registry-server" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.396733 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efd618c-f882-4443-a843-cc58798d24c6" containerName="registry-server" Apr 06 12:10:58 crc kubenswrapper[4790]: E0406 12:10:58.396797 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2147c3d0-ea51-4143-a242-5cded03e7aa8" containerName="extract-utilities" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.396873 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2147c3d0-ea51-4143-a242-5cded03e7aa8" containerName="extract-utilities" Apr 06 12:10:58 crc kubenswrapper[4790]: E0406 12:10:58.396937 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2147c3d0-ea51-4143-a242-5cded03e7aa8" containerName="extract-content" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.396992 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2147c3d0-ea51-4143-a242-5cded03e7aa8" containerName="extract-content" Apr 06 12:10:58 crc kubenswrapper[4790]: E0406 12:10:58.397051 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efd618c-f882-4443-a843-cc58798d24c6" containerName="extract-content" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.397099 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efd618c-f882-4443-a843-cc58798d24c6" containerName="extract-content" Apr 06 12:10:58 crc kubenswrapper[4790]: E0406 12:10:58.397155 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2147c3d0-ea51-4143-a242-5cded03e7aa8" containerName="registry-server" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.397203 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2147c3d0-ea51-4143-a242-5cded03e7aa8" containerName="registry-server" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.397353 4790 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4efd618c-f882-4443-a843-cc58798d24c6" containerName="registry-server" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.397411 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2147c3d0-ea51-4143-a242-5cded03e7aa8" containerName="registry-server" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.398266 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.405349 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m8wfd"] Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.424443 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmsz4\" (UniqueName: \"kubernetes.io/projected/58652e08-5d4e-4a76-b00d-79b1934afeb2-kube-api-access-xmsz4\") pod \"certified-operators-m8wfd\" (UID: \"58652e08-5d4e-4a76-b00d-79b1934afeb2\") " pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.424512 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58652e08-5d4e-4a76-b00d-79b1934afeb2-utilities\") pod \"certified-operators-m8wfd\" (UID: \"58652e08-5d4e-4a76-b00d-79b1934afeb2\") " pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.424533 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58652e08-5d4e-4a76-b00d-79b1934afeb2-catalog-content\") pod \"certified-operators-m8wfd\" (UID: \"58652e08-5d4e-4a76-b00d-79b1934afeb2\") " pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.528155 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmsz4\" (UniqueName: \"kubernetes.io/projected/58652e08-5d4e-4a76-b00d-79b1934afeb2-kube-api-access-xmsz4\") pod \"certified-operators-m8wfd\" (UID: \"58652e08-5d4e-4a76-b00d-79b1934afeb2\") " pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.528277 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58652e08-5d4e-4a76-b00d-79b1934afeb2-catalog-content\") pod \"certified-operators-m8wfd\" (UID: \"58652e08-5d4e-4a76-b00d-79b1934afeb2\") " pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.528300 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58652e08-5d4e-4a76-b00d-79b1934afeb2-utilities\") pod \"certified-operators-m8wfd\" (UID: \"58652e08-5d4e-4a76-b00d-79b1934afeb2\") " pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.528818 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58652e08-5d4e-4a76-b00d-79b1934afeb2-utilities\") pod \"certified-operators-m8wfd\" (UID: \"58652e08-5d4e-4a76-b00d-79b1934afeb2\") " pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.529459 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58652e08-5d4e-4a76-b00d-79b1934afeb2-catalog-content\") pod \"certified-operators-m8wfd\" (UID: \"58652e08-5d4e-4a76-b00d-79b1934afeb2\") " pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.552348 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xmsz4\" (UniqueName: \"kubernetes.io/projected/58652e08-5d4e-4a76-b00d-79b1934afeb2-kube-api-access-xmsz4\") pod \"certified-operators-m8wfd\" (UID: \"58652e08-5d4e-4a76-b00d-79b1934afeb2\") " pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.713544 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:10:58 crc kubenswrapper[4790]: I0406 12:10:58.964593 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:59 crc kubenswrapper[4790]: I0406 12:10:59.014471 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:10:59 crc kubenswrapper[4790]: I0406 12:10:59.152956 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m8wfd"] Apr 06 12:10:59 crc kubenswrapper[4790]: W0406 12:10:59.155215 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58652e08_5d4e_4a76_b00d_79b1934afeb2.slice/crio-9e6fb38775f2b2ea08133adb53d98eb34eaff20e38bb6ec7639c18813c1ce6ec WatchSource:0}: Error finding container 9e6fb38775f2b2ea08133adb53d98eb34eaff20e38bb6ec7639c18813c1ce6ec: Status 404 returned error can't find the container with id 9e6fb38775f2b2ea08133adb53d98eb34eaff20e38bb6ec7639c18813c1ce6ec Apr 06 12:10:59 crc kubenswrapper[4790]: I0406 12:10:59.267104 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Apr 06 12:10:59 crc 
kubenswrapper[4790]: I0406 12:10:59.267153 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" Apr 06 12:10:59 crc kubenswrapper[4790]: I0406 12:10:59.684529 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2147c3d0-ea51-4143-a242-5cded03e7aa8" path="/var/lib/kubelet/pods/2147c3d0-ea51-4143-a242-5cded03e7aa8/volumes" Apr 06 12:11:00 crc kubenswrapper[4790]: I0406 12:11:00.109961 4790 generic.go:334] "Generic (PLEG): container finished" podID="58652e08-5d4e-4a76-b00d-79b1934afeb2" containerID="2374825ead498d23cc2caf030f89c1dc980b99577763f94ec5e83c236d2c0e8f" exitCode=0 Apr 06 12:11:00 crc kubenswrapper[4790]: I0406 12:11:00.110009 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m8wfd" event={"ID":"58652e08-5d4e-4a76-b00d-79b1934afeb2","Type":"ContainerDied","Data":"2374825ead498d23cc2caf030f89c1dc980b99577763f94ec5e83c236d2c0e8f"} Apr 06 12:11:00 crc kubenswrapper[4790]: I0406 12:11:00.110033 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m8wfd" event={"ID":"58652e08-5d4e-4a76-b00d-79b1934afeb2","Type":"ContainerStarted","Data":"9e6fb38775f2b2ea08133adb53d98eb34eaff20e38bb6ec7639c18813c1ce6ec"} Apr 06 12:11:00 crc kubenswrapper[4790]: I0406 12:11:00.218898 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:11:00 crc kubenswrapper[4790]: I0406 12:11:00.317509 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.118610 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m8wfd" event={"ID":"58652e08-5d4e-4a76-b00d-79b1934afeb2","Type":"ContainerStarted","Data":"ef2c129f0eb4cbfb416bbc994b4d9b271f4c6a634c68f683766ecaa34b04938f"} Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.606593 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5kbct"] Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.608185 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.633483 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5kbct"] Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.777159 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aae3cb7-5ff6-4400-a80e-42f87496da8b-utilities\") pod \"redhat-marketplace-5kbct\" (UID: \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\") " pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.777441 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56cxp\" (UniqueName: \"kubernetes.io/projected/1aae3cb7-5ff6-4400-a80e-42f87496da8b-kube-api-access-56cxp\") pod \"redhat-marketplace-5kbct\" (UID: \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\") " pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.777497 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aae3cb7-5ff6-4400-a80e-42f87496da8b-catalog-content\") pod \"redhat-marketplace-5kbct\" (UID: \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\") " 
pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.878545 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56cxp\" (UniqueName: \"kubernetes.io/projected/1aae3cb7-5ff6-4400-a80e-42f87496da8b-kube-api-access-56cxp\") pod \"redhat-marketplace-5kbct\" (UID: \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\") " pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.878615 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aae3cb7-5ff6-4400-a80e-42f87496da8b-catalog-content\") pod \"redhat-marketplace-5kbct\" (UID: \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\") " pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.878653 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aae3cb7-5ff6-4400-a80e-42f87496da8b-utilities\") pod \"redhat-marketplace-5kbct\" (UID: \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\") " pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.879170 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aae3cb7-5ff6-4400-a80e-42f87496da8b-utilities\") pod \"redhat-marketplace-5kbct\" (UID: \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\") " pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.879367 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aae3cb7-5ff6-4400-a80e-42f87496da8b-catalog-content\") pod \"redhat-marketplace-5kbct\" (UID: \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\") " pod="openshift-marketplace/redhat-marketplace-5kbct" 
Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.913872 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56cxp\" (UniqueName: \"kubernetes.io/projected/1aae3cb7-5ff6-4400-a80e-42f87496da8b-kube-api-access-56cxp\") pod \"redhat-marketplace-5kbct\" (UID: \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\") " pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.925529 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.931036 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:11:01 crc kubenswrapper[4790]: I0406 12:11:01.931319 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:11:02 crc kubenswrapper[4790]: I0406 12:11:02.015087 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:11:02 crc kubenswrapper[4790]: I0406 12:11:02.146010 4790 generic.go:334] "Generic (PLEG): container finished" podID="58652e08-5d4e-4a76-b00d-79b1934afeb2" containerID="ef2c129f0eb4cbfb416bbc994b4d9b271f4c6a634c68f683766ecaa34b04938f" exitCode=0 Apr 06 12:11:02 crc kubenswrapper[4790]: I0406 12:11:02.147264 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m8wfd" event={"ID":"58652e08-5d4e-4a76-b00d-79b1934afeb2","Type":"ContainerDied","Data":"ef2c129f0eb4cbfb416bbc994b4d9b271f4c6a634c68f683766ecaa34b04938f"} Apr 06 12:11:02 crc kubenswrapper[4790]: I0406 12:11:02.155730 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 12:11:02 crc kubenswrapper[4790]: I0406 12:11:02.243404 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:11:02 crc kubenswrapper[4790]: I0406 12:11:02.369786 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5kbct"] Apr 06 12:11:02 crc kubenswrapper[4790]: W0406 12:11:02.387790 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aae3cb7_5ff6_4400_a80e_42f87496da8b.slice/crio-aa4e31fd2f89394cd37441d869b34e50d813bba8a7f73336339ad9d5c8cbabb6 WatchSource:0}: Error finding container aa4e31fd2f89394cd37441d869b34e50d813bba8a7f73336339ad9d5c8cbabb6: Status 404 returned error can't find the container with id aa4e31fd2f89394cd37441d869b34e50d813bba8a7f73336339ad9d5c8cbabb6 Apr 06 12:11:02 crc kubenswrapper[4790]: I0406 12:11:02.635906 4790 scope.go:117] "RemoveContainer" containerID="b83253e28152a8b6b68894e72c8011aa97f31b5954d89eddf33dffebef74d1c9" Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.154559 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m8wfd" event={"ID":"58652e08-5d4e-4a76-b00d-79b1934afeb2","Type":"ContainerStarted","Data":"4423aa43e05909e7bb3fc4ac4401ea170db5692005b2f5bd91e286f0925e45a0"} Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.156368 4790 generic.go:334] "Generic (PLEG): container finished" podID="1aae3cb7-5ff6-4400-a80e-42f87496da8b" containerID="bb1bbf5eb0dd117774b710488e1ef80abb88f2e79f5029a571a28f1fc9c49a79" exitCode=0 Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.157363 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kbct" event={"ID":"1aae3cb7-5ff6-4400-a80e-42f87496da8b","Type":"ContainerDied","Data":"bb1bbf5eb0dd117774b710488e1ef80abb88f2e79f5029a571a28f1fc9c49a79"} Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.157386 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-5kbct" event={"ID":"1aae3cb7-5ff6-4400-a80e-42f87496da8b","Type":"ContainerStarted","Data":"aa4e31fd2f89394cd37441d869b34e50d813bba8a7f73336339ad9d5c8cbabb6"} Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.299311 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m8wfd" podStartSLOduration=2.698023753 podStartE2EDuration="5.299288959s" podCreationTimestamp="2026-04-06 12:10:58 +0000 UTC" firstStartedPulling="2026-04-06 12:11:00.112152579 +0000 UTC m=+839.099895455" lastFinishedPulling="2026-04-06 12:11:02.713417795 +0000 UTC m=+841.701160661" observedRunningTime="2026-04-06 12:11:03.224160375 +0000 UTC m=+842.211903251" watchObservedRunningTime="2026-04-06 12:11:03.299288959 +0000 UTC m=+842.287031825" Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.586927 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8"] Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.588489 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.590786 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.646885 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8"] Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.723481 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq4pq\" (UniqueName: \"kubernetes.io/projected/4b74456e-c45b-4efb-9a0f-952b5663e994-kube-api-access-xq4pq\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8\" (UID: \"4b74456e-c45b-4efb-9a0f-952b5663e994\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.723574 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b74456e-c45b-4efb-9a0f-952b5663e994-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8\" (UID: \"4b74456e-c45b-4efb-9a0f-952b5663e994\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.723617 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b74456e-c45b-4efb-9a0f-952b5663e994-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8\" (UID: \"4b74456e-c45b-4efb-9a0f-952b5663e994\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" Apr 06 12:11:03 crc kubenswrapper[4790]: 
I0406 12:11:03.824427 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b74456e-c45b-4efb-9a0f-952b5663e994-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8\" (UID: \"4b74456e-c45b-4efb-9a0f-952b5663e994\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.824492 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b74456e-c45b-4efb-9a0f-952b5663e994-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8\" (UID: \"4b74456e-c45b-4efb-9a0f-952b5663e994\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.824530 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq4pq\" (UniqueName: \"kubernetes.io/projected/4b74456e-c45b-4efb-9a0f-952b5663e994-kube-api-access-xq4pq\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8\" (UID: \"4b74456e-c45b-4efb-9a0f-952b5663e994\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.825276 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b74456e-c45b-4efb-9a0f-952b5663e994-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8\" (UID: \"4b74456e-c45b-4efb-9a0f-952b5663e994\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.825373 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4b74456e-c45b-4efb-9a0f-952b5663e994-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8\" (UID: \"4b74456e-c45b-4efb-9a0f-952b5663e994\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.855878 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq4pq\" (UniqueName: \"kubernetes.io/projected/4b74456e-c45b-4efb-9a0f-952b5663e994-kube-api-access-xq4pq\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8\" (UID: \"4b74456e-c45b-4efb-9a0f-952b5663e994\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" Apr 06 12:11:03 crc kubenswrapper[4790]: I0406 12:11:03.947334 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" Apr 06 12:11:04 crc kubenswrapper[4790]: I0406 12:11:04.120958 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:11:04 crc kubenswrapper[4790]: I0406 12:11:04.122911 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:11:04 crc kubenswrapper[4790]: I0406 12:11:04.212163 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kbct" event={"ID":"1aae3cb7-5ff6-4400-a80e-42f87496da8b","Type":"ContainerStarted","Data":"0c4b252c65a7fc09f5c2e6d5d207255192d49ca8cbf632de76aed097cfd98e67"} Apr 06 12:11:04 crc kubenswrapper[4790]: I0406 12:11:04.226279 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:11:04 crc kubenswrapper[4790]: I0406 12:11:04.268950 4790 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8"] Apr 06 12:11:04 crc kubenswrapper[4790]: I0406 12:11:04.274978 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Apr 06 12:11:04 crc kubenswrapper[4790]: I0406 12:11:04.275041 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.191880 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xwlmr"] Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.192478 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xwlmr" podUID="1c18cf73-84b8-4a06-b942-4ec437124f79" containerName="registry-server" containerID="cri-o://529c90d0704862c17b4779e503083b35c69e1271229c641b3ec3995adbcaaf14" gracePeriod=2 Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.218284 4790 generic.go:334] "Generic (PLEG): container finished" podID="4b74456e-c45b-4efb-9a0f-952b5663e994" containerID="2c50577d98de99d9076a83c2fbf29ba6e6d5e844c00f0c7c24f125b3423f9887" exitCode=0 Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.218347 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" 
event={"ID":"4b74456e-c45b-4efb-9a0f-952b5663e994","Type":"ContainerDied","Data":"2c50577d98de99d9076a83c2fbf29ba6e6d5e844c00f0c7c24f125b3423f9887"} Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.218396 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" event={"ID":"4b74456e-c45b-4efb-9a0f-952b5663e994","Type":"ContainerStarted","Data":"46c4dcaf5fa2cf7967f04cd94bdb8fd5645f0b9e74303a873ebc06f16534ed55"} Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.221070 4790 generic.go:334] "Generic (PLEG): container finished" podID="1aae3cb7-5ff6-4400-a80e-42f87496da8b" containerID="0c4b252c65a7fc09f5c2e6d5d207255192d49ca8cbf632de76aed097cfd98e67" exitCode=0 Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.221162 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kbct" event={"ID":"1aae3cb7-5ff6-4400-a80e-42f87496da8b","Type":"ContainerDied","Data":"0c4b252c65a7fc09f5c2e6d5d207255192d49ca8cbf632de76aed097cfd98e67"} Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.224189 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-86dff4bf76-bm9zd"] Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.225000 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-86dff4bf76-bm9zd" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.230205 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.230343 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.251093 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-86dff4bf76-bm9zd"] Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.285899 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.347507 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzqcv\" (UniqueName: \"kubernetes.io/projected/ae41d528-6f4b-45e5-84f2-5d9eae998759-kube-api-access-pzqcv\") pod \"obo-prometheus-operator-86dff4bf76-bm9zd\" (UID: \"ae41d528-6f4b-45e5-84f2-5d9eae998759\") " pod="openshift-operators/obo-prometheus-operator-86dff4bf76-bm9zd" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.349865 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-pw54j"] Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.350656 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-pw54j" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.354091 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.359025 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw"] Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.359881 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.367781 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-pw54j"] Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.379728 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw"] Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.449540 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea4918c9-2a05-4c75-9d68-662e0a0fc175-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw\" (UID: \"ea4918c9-2a05-4c75-9d68-662e0a0fc175\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.449583 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06f8ee69-3814-40ed-8ed2-5913509658de-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-795cd6b797-pw54j\" (UID: 
\"06f8ee69-3814-40ed-8ed2-5913509658de\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-pw54j" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.449661 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea4918c9-2a05-4c75-9d68-662e0a0fc175-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw\" (UID: \"ea4918c9-2a05-4c75-9d68-662e0a0fc175\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.449680 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06f8ee69-3814-40ed-8ed2-5913509658de-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-795cd6b797-pw54j\" (UID: \"06f8ee69-3814-40ed-8ed2-5913509658de\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-pw54j" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.449704 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzqcv\" (UniqueName: \"kubernetes.io/projected/ae41d528-6f4b-45e5-84f2-5d9eae998759-kube-api-access-pzqcv\") pod \"obo-prometheus-operator-86dff4bf76-bm9zd\" (UID: \"ae41d528-6f4b-45e5-84f2-5d9eae998759\") " pod="openshift-operators/obo-prometheus-operator-86dff4bf76-bm9zd" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.498959 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzqcv\" (UniqueName: \"kubernetes.io/projected/ae41d528-6f4b-45e5-84f2-5d9eae998759-kube-api-access-pzqcv\") pod \"obo-prometheus-operator-86dff4bf76-bm9zd\" (UID: \"ae41d528-6f4b-45e5-84f2-5d9eae998759\") " pod="openshift-operators/obo-prometheus-operator-86dff4bf76-bm9zd" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 
12:11:05.547372 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-86dff4bf76-bm9zd" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.553561 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea4918c9-2a05-4c75-9d68-662e0a0fc175-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw\" (UID: \"ea4918c9-2a05-4c75-9d68-662e0a0fc175\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.553603 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06f8ee69-3814-40ed-8ed2-5913509658de-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-795cd6b797-pw54j\" (UID: \"06f8ee69-3814-40ed-8ed2-5913509658de\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-pw54j" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.553624 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea4918c9-2a05-4c75-9d68-662e0a0fc175-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw\" (UID: \"ea4918c9-2a05-4c75-9d68-662e0a0fc175\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.553642 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06f8ee69-3814-40ed-8ed2-5913509658de-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-795cd6b797-pw54j\" (UID: \"06f8ee69-3814-40ed-8ed2-5913509658de\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-pw54j" Apr 06 12:11:05 crc 
kubenswrapper[4790]: I0406 12:11:05.570901 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ea4918c9-2a05-4c75-9d68-662e0a0fc175-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw\" (UID: \"ea4918c9-2a05-4c75-9d68-662e0a0fc175\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.573521 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06f8ee69-3814-40ed-8ed2-5913509658de-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-795cd6b797-pw54j\" (UID: \"06f8ee69-3814-40ed-8ed2-5913509658de\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-pw54j" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.573529 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06f8ee69-3814-40ed-8ed2-5913509658de-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-795cd6b797-pw54j\" (UID: \"06f8ee69-3814-40ed-8ed2-5913509658de\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-pw54j" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.576377 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ea4918c9-2a05-4c75-9d68-662e0a0fc175-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw\" (UID: \"ea4918c9-2a05-4c75-9d68-662e0a0fc175\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.615206 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-dd944d769-7fscl"] Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 
12:11:05.615997 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-dd944d769-7fscl" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.625357 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.654326 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d990eb66-396e-4b05-acab-eaa30a6fbd34-observability-operator-tls\") pod \"observability-operator-dd944d769-7fscl\" (UID: \"d990eb66-396e-4b05-acab-eaa30a6fbd34\") " pod="openshift-operators/observability-operator-dd944d769-7fscl" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.654420 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x62dx\" (UniqueName: \"kubernetes.io/projected/d990eb66-396e-4b05-acab-eaa30a6fbd34-kube-api-access-x62dx\") pod \"observability-operator-dd944d769-7fscl\" (UID: \"d990eb66-396e-4b05-acab-eaa30a6fbd34\") " pod="openshift-operators/observability-operator-dd944d769-7fscl" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.662866 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-dd944d769-7fscl"] Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.673756 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-pw54j" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.690099 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.747233 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwlmr" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.760550 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x62dx\" (UniqueName: \"kubernetes.io/projected/d990eb66-396e-4b05-acab-eaa30a6fbd34-kube-api-access-x62dx\") pod \"observability-operator-dd944d769-7fscl\" (UID: \"d990eb66-396e-4b05-acab-eaa30a6fbd34\") " pod="openshift-operators/observability-operator-dd944d769-7fscl" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.760633 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d990eb66-396e-4b05-acab-eaa30a6fbd34-observability-operator-tls\") pod \"observability-operator-dd944d769-7fscl\" (UID: \"d990eb66-396e-4b05-acab-eaa30a6fbd34\") " pod="openshift-operators/observability-operator-dd944d769-7fscl" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.776496 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d990eb66-396e-4b05-acab-eaa30a6fbd34-observability-operator-tls\") pod \"observability-operator-dd944d769-7fscl\" (UID: \"d990eb66-396e-4b05-acab-eaa30a6fbd34\") " pod="openshift-operators/observability-operator-dd944d769-7fscl" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.795114 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6rz5v"] Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.795346 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6rz5v" 
podUID="7781c53a-5217-44ce-9a47-671e951b9c7e" containerName="registry-server" containerID="cri-o://7b94f46ce52178d547da3a4758ec37c0ed786bcf1b9c9adbe0866bf7bd51fed1" gracePeriod=2 Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.809946 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x62dx\" (UniqueName: \"kubernetes.io/projected/d990eb66-396e-4b05-acab-eaa30a6fbd34-kube-api-access-x62dx\") pod \"observability-operator-dd944d769-7fscl\" (UID: \"d990eb66-396e-4b05-acab-eaa30a6fbd34\") " pod="openshift-operators/observability-operator-dd944d769-7fscl" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.865105 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m65z7\" (UniqueName: \"kubernetes.io/projected/1c18cf73-84b8-4a06-b942-4ec437124f79-kube-api-access-m65z7\") pod \"1c18cf73-84b8-4a06-b942-4ec437124f79\" (UID: \"1c18cf73-84b8-4a06-b942-4ec437124f79\") " Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.865236 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c18cf73-84b8-4a06-b942-4ec437124f79-utilities\") pod \"1c18cf73-84b8-4a06-b942-4ec437124f79\" (UID: \"1c18cf73-84b8-4a06-b942-4ec437124f79\") " Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.865290 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c18cf73-84b8-4a06-b942-4ec437124f79-catalog-content\") pod \"1c18cf73-84b8-4a06-b942-4ec437124f79\" (UID: \"1c18cf73-84b8-4a06-b942-4ec437124f79\") " Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.866670 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c18cf73-84b8-4a06-b942-4ec437124f79-utilities" (OuterVolumeSpecName: "utilities") pod "1c18cf73-84b8-4a06-b942-4ec437124f79" (UID: 
"1c18cf73-84b8-4a06-b942-4ec437124f79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.875611 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c18cf73-84b8-4a06-b942-4ec437124f79-kube-api-access-m65z7" (OuterVolumeSpecName: "kube-api-access-m65z7") pod "1c18cf73-84b8-4a06-b942-4ec437124f79" (UID: "1c18cf73-84b8-4a06-b942-4ec437124f79"). InnerVolumeSpecName "kube-api-access-m65z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.877895 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-74445bf4b8-spbpr"] Apr 06 12:11:05 crc kubenswrapper[4790]: E0406 12:11:05.878211 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c18cf73-84b8-4a06-b942-4ec437124f79" containerName="registry-server" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.878223 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c18cf73-84b8-4a06-b942-4ec437124f79" containerName="registry-server" Apr 06 12:11:05 crc kubenswrapper[4790]: E0406 12:11:05.878233 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c18cf73-84b8-4a06-b942-4ec437124f79" containerName="extract-utilities" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.878240 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c18cf73-84b8-4a06-b942-4ec437124f79" containerName="extract-utilities" Apr 06 12:11:05 crc kubenswrapper[4790]: E0406 12:11:05.878258 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c18cf73-84b8-4a06-b942-4ec437124f79" containerName="extract-content" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.878264 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c18cf73-84b8-4a06-b942-4ec437124f79" containerName="extract-content" Apr 06 12:11:05 crc 
kubenswrapper[4790]: I0406 12:11:05.878389 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c18cf73-84b8-4a06-b942-4ec437124f79" containerName="registry-server" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.878805 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-74445bf4b8-spbpr" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.885778 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-74445bf4b8-spbpr"] Apr 06 12:11:05 crc kubenswrapper[4790]: E0406 12:11:05.962217 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7b94f46ce52178d547da3a4758ec37c0ed786bcf1b9c9adbe0866bf7bd51fed1 is running failed: container process not found" containerID="7b94f46ce52178d547da3a4758ec37c0ed786bcf1b9c9adbe0866bf7bd51fed1" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.966787 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpl9f\" (UniqueName: \"kubernetes.io/projected/0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686-kube-api-access-qpl9f\") pod \"perses-operator-74445bf4b8-spbpr\" (UID: \"0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686\") " pod="openshift-operators/perses-operator-74445bf4b8-spbpr" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.967281 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686-openshift-service-ca\") pod \"perses-operator-74445bf4b8-spbpr\" (UID: \"0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686\") " pod="openshift-operators/perses-operator-74445bf4b8-spbpr" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.967333 4790 reconciler_common.go:293] "Volume detached 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c18cf73-84b8-4a06-b942-4ec437124f79-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.967346 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m65z7\" (UniqueName: \"kubernetes.io/projected/1c18cf73-84b8-4a06-b942-4ec437124f79-kube-api-access-m65z7\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:05 crc kubenswrapper[4790]: E0406 12:11:05.978337 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7b94f46ce52178d547da3a4758ec37c0ed786bcf1b9c9adbe0866bf7bd51fed1 is running failed: container process not found" containerID="7b94f46ce52178d547da3a4758ec37c0ed786bcf1b9c9adbe0866bf7bd51fed1" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:11:05 crc kubenswrapper[4790]: E0406 12:11:05.981439 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7b94f46ce52178d547da3a4758ec37c0ed786bcf1b9c9adbe0866bf7bd51fed1 is running failed: container process not found" containerID="7b94f46ce52178d547da3a4758ec37c0ed786bcf1b9c9adbe0866bf7bd51fed1" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:11:05 crc kubenswrapper[4790]: E0406 12:11:05.981495 4790 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7b94f46ce52178d547da3a4758ec37c0ed786bcf1b9c9adbe0866bf7bd51fed1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-6rz5v" podUID="7781c53a-5217-44ce-9a47-671e951b9c7e" containerName="registry-server" Apr 06 12:11:05 crc kubenswrapper[4790]: I0406 12:11:05.991171 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-dd944d769-7fscl" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.069516 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686-openshift-service-ca\") pod \"perses-operator-74445bf4b8-spbpr\" (UID: \"0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686\") " pod="openshift-operators/perses-operator-74445bf4b8-spbpr" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.069601 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpl9f\" (UniqueName: \"kubernetes.io/projected/0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686-kube-api-access-qpl9f\") pod \"perses-operator-74445bf4b8-spbpr\" (UID: \"0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686\") " pod="openshift-operators/perses-operator-74445bf4b8-spbpr" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.071387 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686-openshift-service-ca\") pod \"perses-operator-74445bf4b8-spbpr\" (UID: \"0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686\") " pod="openshift-operators/perses-operator-74445bf4b8-spbpr" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.093341 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpl9f\" (UniqueName: \"kubernetes.io/projected/0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686-kube-api-access-qpl9f\") pod \"perses-operator-74445bf4b8-spbpr\" (UID: \"0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686\") " pod="openshift-operators/perses-operator-74445bf4b8-spbpr" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.140547 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c18cf73-84b8-4a06-b942-4ec437124f79-catalog-content" (OuterVolumeSpecName: 
"catalog-content") pod "1c18cf73-84b8-4a06-b942-4ec437124f79" (UID: "1c18cf73-84b8-4a06-b942-4ec437124f79"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.143659 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-86dff4bf76-bm9zd"] Apr 06 12:11:06 crc kubenswrapper[4790]: W0406 12:11:06.149766 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae41d528_6f4b_45e5_84f2_5d9eae998759.slice/crio-0972409b2486e6d3972e75a437314200753b931a0904e1fa8f09dffb7433294a WatchSource:0}: Error finding container 0972409b2486e6d3972e75a437314200753b931a0904e1fa8f09dffb7433294a: Status 404 returned error can't find the container with id 0972409b2486e6d3972e75a437314200753b931a0904e1fa8f09dffb7433294a Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.171493 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c18cf73-84b8-4a06-b942-4ec437124f79-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.239497 4790 generic.go:334] "Generic (PLEG): container finished" podID="7781c53a-5217-44ce-9a47-671e951b9c7e" containerID="7b94f46ce52178d547da3a4758ec37c0ed786bcf1b9c9adbe0866bf7bd51fed1" exitCode=0 Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.239563 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rz5v" event={"ID":"7781c53a-5217-44ce-9a47-671e951b9c7e","Type":"ContainerDied","Data":"7b94f46ce52178d547da3a4758ec37c0ed786bcf1b9c9adbe0866bf7bd51fed1"} Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.241919 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kbct" 
event={"ID":"1aae3cb7-5ff6-4400-a80e-42f87496da8b","Type":"ContainerStarted","Data":"2c5aad2da3353e51f42798337c3b0090be7f6a81b4593e1ae98befc8e49a3829"} Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.251666 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-86dff4bf76-bm9zd" event={"ID":"ae41d528-6f4b-45e5-84f2-5d9eae998759","Type":"ContainerStarted","Data":"0972409b2486e6d3972e75a437314200753b931a0904e1fa8f09dffb7433294a"} Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.254301 4790 generic.go:334] "Generic (PLEG): container finished" podID="1c18cf73-84b8-4a06-b942-4ec437124f79" containerID="529c90d0704862c17b4779e503083b35c69e1271229c641b3ec3995adbcaaf14" exitCode=0 Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.255377 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwlmr" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.259770 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwlmr" event={"ID":"1c18cf73-84b8-4a06-b942-4ec437124f79","Type":"ContainerDied","Data":"529c90d0704862c17b4779e503083b35c69e1271229c641b3ec3995adbcaaf14"} Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.259803 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwlmr" event={"ID":"1c18cf73-84b8-4a06-b942-4ec437124f79","Type":"ContainerDied","Data":"0ab86dba8b7627f743c582362efdb3de48564272b5bc588bcb3c966b33396005"} Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.259840 4790 scope.go:117] "RemoveContainer" containerID="529c90d0704862c17b4779e503083b35c69e1271229c641b3ec3995adbcaaf14" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.262772 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5kbct" podStartSLOduration=2.738511404 
podStartE2EDuration="5.262759068s" podCreationTimestamp="2026-04-06 12:11:01 +0000 UTC" firstStartedPulling="2026-04-06 12:11:03.15827772 +0000 UTC m=+842.146020586" lastFinishedPulling="2026-04-06 12:11:05.682525374 +0000 UTC m=+844.670268250" observedRunningTime="2026-04-06 12:11:06.261533625 +0000 UTC m=+845.249276501" watchObservedRunningTime="2026-04-06 12:11:06.262759068 +0000 UTC m=+845.250501934" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.268617 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-74445bf4b8-spbpr" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.304371 4790 scope.go:117] "RemoveContainer" containerID="931a7718cb4a70296d74716bd14bdd65afaca615ce9d32eb508419902611e765" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.308890 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xwlmr"] Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.321420 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xwlmr"] Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.384467 4790 scope.go:117] "RemoveContainer" containerID="dbd303faf707006341ed389b80db77bd03f54b81b7c523dd0452a929ea600d52" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.398960 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wgd2w"] Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.399215 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wgd2w" podUID="42617262-5255-4f88-8afa-9d93a640090e" containerName="registry-server" containerID="cri-o://f39af18abc79914046ebf247535ef1eca86e1f9c512f4a348b434ae9a5d6995c" gracePeriod=2 Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.436174 4790 scope.go:117] "RemoveContainer" 
containerID="529c90d0704862c17b4779e503083b35c69e1271229c641b3ec3995adbcaaf14" Apr 06 12:11:06 crc kubenswrapper[4790]: E0406 12:11:06.455046 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"529c90d0704862c17b4779e503083b35c69e1271229c641b3ec3995adbcaaf14\": container with ID starting with 529c90d0704862c17b4779e503083b35c69e1271229c641b3ec3995adbcaaf14 not found: ID does not exist" containerID="529c90d0704862c17b4779e503083b35c69e1271229c641b3ec3995adbcaaf14" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.455113 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"529c90d0704862c17b4779e503083b35c69e1271229c641b3ec3995adbcaaf14"} err="failed to get container status \"529c90d0704862c17b4779e503083b35c69e1271229c641b3ec3995adbcaaf14\": rpc error: code = NotFound desc = could not find container \"529c90d0704862c17b4779e503083b35c69e1271229c641b3ec3995adbcaaf14\": container with ID starting with 529c90d0704862c17b4779e503083b35c69e1271229c641b3ec3995adbcaaf14 not found: ID does not exist" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.455154 4790 scope.go:117] "RemoveContainer" containerID="931a7718cb4a70296d74716bd14bdd65afaca615ce9d32eb508419902611e765" Apr 06 12:11:06 crc kubenswrapper[4790]: E0406 12:11:06.463793 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"931a7718cb4a70296d74716bd14bdd65afaca615ce9d32eb508419902611e765\": container with ID starting with 931a7718cb4a70296d74716bd14bdd65afaca615ce9d32eb508419902611e765 not found: ID does not exist" containerID="931a7718cb4a70296d74716bd14bdd65afaca615ce9d32eb508419902611e765" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.463895 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"931a7718cb4a70296d74716bd14bdd65afaca615ce9d32eb508419902611e765"} err="failed to get container status \"931a7718cb4a70296d74716bd14bdd65afaca615ce9d32eb508419902611e765\": rpc error: code = NotFound desc = could not find container \"931a7718cb4a70296d74716bd14bdd65afaca615ce9d32eb508419902611e765\": container with ID starting with 931a7718cb4a70296d74716bd14bdd65afaca615ce9d32eb508419902611e765 not found: ID does not exist" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.463949 4790 scope.go:117] "RemoveContainer" containerID="dbd303faf707006341ed389b80db77bd03f54b81b7c523dd0452a929ea600d52" Apr 06 12:11:06 crc kubenswrapper[4790]: E0406 12:11:06.482665 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbd303faf707006341ed389b80db77bd03f54b81b7c523dd0452a929ea600d52\": container with ID starting with dbd303faf707006341ed389b80db77bd03f54b81b7c523dd0452a929ea600d52 not found: ID does not exist" containerID="dbd303faf707006341ed389b80db77bd03f54b81b7c523dd0452a929ea600d52" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.482738 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd303faf707006341ed389b80db77bd03f54b81b7c523dd0452a929ea600d52"} err="failed to get container status \"dbd303faf707006341ed389b80db77bd03f54b81b7c523dd0452a929ea600d52\": rpc error: code = NotFound desc = could not find container \"dbd303faf707006341ed389b80db77bd03f54b81b7c523dd0452a929ea600d52\": container with ID starting with dbd303faf707006341ed389b80db77bd03f54b81b7c523dd0452a929ea600d52 not found: ID does not exist" Apr 06 12:11:06 crc kubenswrapper[4790]: W0406 12:11:06.490739 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea4918c9_2a05_4c75_9d68_662e0a0fc175.slice/crio-63bcc9ff90aca03223f4a9c48ce765c0f3d92717465c8c6487cdb74ebfb43be6 WatchSource:0}: Error finding container 63bcc9ff90aca03223f4a9c48ce765c0f3d92717465c8c6487cdb74ebfb43be6: Status 404 returned error can't find the container with id 63bcc9ff90aca03223f4a9c48ce765c0f3d92717465c8c6487cdb74ebfb43be6 Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.491414 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw"] Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.533357 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-pw54j"] Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.656077 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6rz5v" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.700218 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-dd944d769-7fscl"] Apr 06 12:11:06 crc kubenswrapper[4790]: W0406 12:11:06.706640 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd990eb66_396e_4b05_acab_eaa30a6fbd34.slice/crio-41d00d19f89516534654fd93184c6196e99b264286ddeb79dc6ea5127c5f3341 WatchSource:0}: Error finding container 41d00d19f89516534654fd93184c6196e99b264286ddeb79dc6ea5127c5f3341: Status 404 returned error can't find the container with id 41d00d19f89516534654fd93184c6196e99b264286ddeb79dc6ea5127c5f3341 Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.795361 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7781c53a-5217-44ce-9a47-671e951b9c7e-utilities\") pod 
\"7781c53a-5217-44ce-9a47-671e951b9c7e\" (UID: \"7781c53a-5217-44ce-9a47-671e951b9c7e\") " Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.795412 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7781c53a-5217-44ce-9a47-671e951b9c7e-catalog-content\") pod \"7781c53a-5217-44ce-9a47-671e951b9c7e\" (UID: \"7781c53a-5217-44ce-9a47-671e951b9c7e\") " Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.795463 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54d8l\" (UniqueName: \"kubernetes.io/projected/7781c53a-5217-44ce-9a47-671e951b9c7e-kube-api-access-54d8l\") pod \"7781c53a-5217-44ce-9a47-671e951b9c7e\" (UID: \"7781c53a-5217-44ce-9a47-671e951b9c7e\") " Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.797185 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7781c53a-5217-44ce-9a47-671e951b9c7e-utilities" (OuterVolumeSpecName: "utilities") pod "7781c53a-5217-44ce-9a47-671e951b9c7e" (UID: "7781c53a-5217-44ce-9a47-671e951b9c7e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.804298 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7781c53a-5217-44ce-9a47-671e951b9c7e-kube-api-access-54d8l" (OuterVolumeSpecName: "kube-api-access-54d8l") pod "7781c53a-5217-44ce-9a47-671e951b9c7e" (UID: "7781c53a-5217-44ce-9a47-671e951b9c7e"). InnerVolumeSpecName "kube-api-access-54d8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.902610 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7781c53a-5217-44ce-9a47-671e951b9c7e-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.902657 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54d8l\" (UniqueName: \"kubernetes.io/projected/7781c53a-5217-44ce-9a47-671e951b9c7e-kube-api-access-54d8l\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.960145 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7781c53a-5217-44ce-9a47-671e951b9c7e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7781c53a-5217-44ce-9a47-671e951b9c7e" (UID: "7781c53a-5217-44ce-9a47-671e951b9c7e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.963192 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-74445bf4b8-spbpr"] Apr 06 12:11:06 crc kubenswrapper[4790]: W0406 12:11:06.966787 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e6e8ad0_0549_4bcd_b2a7_4cbc5fc67686.slice/crio-eca65505edc3e968293cd8751b4d285fb3a25673e5bf157d41918988804eb2c6 WatchSource:0}: Error finding container eca65505edc3e968293cd8751b4d285fb3a25673e5bf157d41918988804eb2c6: Status 404 returned error can't find the container with id eca65505edc3e968293cd8751b4d285fb3a25673e5bf157d41918988804eb2c6 Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.993616 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xmsvt"] Apr 06 12:11:06 crc kubenswrapper[4790]: I0406 12:11:06.993860 4790 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xmsvt" podUID="4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" containerName="registry-server" containerID="cri-o://aa862a4532cc1995501c17164ef8be74ee8e467212c363b9201b74ad77f2c941" gracePeriod=2 Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.003943 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7781c53a-5217-44ce-9a47-671e951b9c7e-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.274866 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-74445bf4b8-spbpr" event={"ID":"0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686","Type":"ContainerStarted","Data":"eca65505edc3e968293cd8751b4d285fb3a25673e5bf157d41918988804eb2c6"} Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.322165 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6rz5v" event={"ID":"7781c53a-5217-44ce-9a47-671e951b9c7e","Type":"ContainerDied","Data":"6e834f6e7fe6fa4bd024b823ef687544495503871fd60b82b378452777d33172"} Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.322220 4790 scope.go:117] "RemoveContainer" containerID="7b94f46ce52178d547da3a4758ec37c0ed786bcf1b9c9adbe0866bf7bd51fed1" Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.322322 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6rz5v" Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.365174 4790 generic.go:334] "Generic (PLEG): container finished" podID="42617262-5255-4f88-8afa-9d93a640090e" containerID="f39af18abc79914046ebf247535ef1eca86e1f9c512f4a348b434ae9a5d6995c" exitCode=0 Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.365281 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgd2w" event={"ID":"42617262-5255-4f88-8afa-9d93a640090e","Type":"ContainerDied","Data":"f39af18abc79914046ebf247535ef1eca86e1f9c512f4a348b434ae9a5d6995c"} Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.370468 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6rz5v"] Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.376406 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6rz5v"] Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.377967 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-pw54j" event={"ID":"06f8ee69-3814-40ed-8ed2-5913509658de","Type":"ContainerStarted","Data":"50a47f38e20d2117281a52efeff6d4f2e83bb4edaa472d4af3f33ecce1f91461"} Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.388958 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-dd944d769-7fscl" event={"ID":"d990eb66-396e-4b05-acab-eaa30a6fbd34","Type":"ContainerStarted","Data":"41d00d19f89516534654fd93184c6196e99b264286ddeb79dc6ea5127c5f3341"} Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.401471 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw" 
event={"ID":"ea4918c9-2a05-4c75-9d68-662e0a0fc175","Type":"ContainerStarted","Data":"63bcc9ff90aca03223f4a9c48ce765c0f3d92717465c8c6487cdb74ebfb43be6"} Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.414315 4790 generic.go:334] "Generic (PLEG): container finished" podID="4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" containerID="aa862a4532cc1995501c17164ef8be74ee8e467212c363b9201b74ad77f2c941" exitCode=0 Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.414381 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmsvt" event={"ID":"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af","Type":"ContainerDied","Data":"aa862a4532cc1995501c17164ef8be74ee8e467212c363b9201b74ad77f2c941"} Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.596743 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mqhc6"] Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.597018 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mqhc6" podUID="1c342bf5-bb1e-43e0-a549-e73112fa4e4a" containerName="registry-server" containerID="cri-o://70be9a2aae72bd7f22a9bf4cdb5dca5bbd95375e28476ebf06842c0f226f06f8" gracePeriod=2 Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.633819 4790 scope.go:117] "RemoveContainer" containerID="303701940f565d768185a2992e94f1917d698b3d496f6d3cdf493e0bdea36c0b" Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.697631 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c18cf73-84b8-4a06-b942-4ec437124f79" path="/var/lib/kubelet/pods/1c18cf73-84b8-4a06-b942-4ec437124f79/volumes" Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.698852 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7781c53a-5217-44ce-9a47-671e951b9c7e" path="/var/lib/kubelet/pods/7781c53a-5217-44ce-9a47-671e951b9c7e/volumes" Apr 06 12:11:07 crc kubenswrapper[4790]: E0406 
12:11:07.730634 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of aa862a4532cc1995501c17164ef8be74ee8e467212c363b9201b74ad77f2c941 is running failed: container process not found" containerID="aa862a4532cc1995501c17164ef8be74ee8e467212c363b9201b74ad77f2c941" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:11:07 crc kubenswrapper[4790]: E0406 12:11:07.731121 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of aa862a4532cc1995501c17164ef8be74ee8e467212c363b9201b74ad77f2c941 is running failed: container process not found" containerID="aa862a4532cc1995501c17164ef8be74ee8e467212c363b9201b74ad77f2c941" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:11:07 crc kubenswrapper[4790]: E0406 12:11:07.731962 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of aa862a4532cc1995501c17164ef8be74ee8e467212c363b9201b74ad77f2c941 is running failed: container process not found" containerID="aa862a4532cc1995501c17164ef8be74ee8e467212c363b9201b74ad77f2c941" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:11:07 crc kubenswrapper[4790]: E0406 12:11:07.732050 4790 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of aa862a4532cc1995501c17164ef8be74ee8e467212c363b9201b74ad77f2c941 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-xmsvt" podUID="4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" containerName="registry-server" Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.764994 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.797518 4790 scope.go:117] "RemoveContainer" containerID="9242efd38b17b077246c444c8d03b383c69900f340afad74fa8b003c643e3ba6" Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.830482 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42617262-5255-4f88-8afa-9d93a640090e-catalog-content\") pod \"42617262-5255-4f88-8afa-9d93a640090e\" (UID: \"42617262-5255-4f88-8afa-9d93a640090e\") " Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.830604 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vl28\" (UniqueName: \"kubernetes.io/projected/42617262-5255-4f88-8afa-9d93a640090e-kube-api-access-5vl28\") pod \"42617262-5255-4f88-8afa-9d93a640090e\" (UID: \"42617262-5255-4f88-8afa-9d93a640090e\") " Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.830697 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42617262-5255-4f88-8afa-9d93a640090e-utilities\") pod \"42617262-5255-4f88-8afa-9d93a640090e\" (UID: \"42617262-5255-4f88-8afa-9d93a640090e\") " Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.831800 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42617262-5255-4f88-8afa-9d93a640090e-utilities" (OuterVolumeSpecName: "utilities") pod "42617262-5255-4f88-8afa-9d93a640090e" (UID: "42617262-5255-4f88-8afa-9d93a640090e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.837275 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42617262-5255-4f88-8afa-9d93a640090e-kube-api-access-5vl28" (OuterVolumeSpecName: "kube-api-access-5vl28") pod "42617262-5255-4f88-8afa-9d93a640090e" (UID: "42617262-5255-4f88-8afa-9d93a640090e"). InnerVolumeSpecName "kube-api-access-5vl28". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.933290 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vl28\" (UniqueName: \"kubernetes.io/projected/42617262-5255-4f88-8afa-9d93a640090e-kube-api-access-5vl28\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.933402 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42617262-5255-4f88-8afa-9d93a640090e-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:07 crc kubenswrapper[4790]: I0406 12:11:07.981403 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.035200 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42617262-5255-4f88-8afa-9d93a640090e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42617262-5255-4f88-8afa-9d93a640090e" (UID: "42617262-5255-4f88-8afa-9d93a640090e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.035336 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-catalog-content\") pod \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\" (UID: \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\") " Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.035395 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6664\" (UniqueName: \"kubernetes.io/projected/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-kube-api-access-l6664\") pod \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\" (UID: \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\") " Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.035471 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42617262-5255-4f88-8afa-9d93a640090e-catalog-content\") pod \"42617262-5255-4f88-8afa-9d93a640090e\" (UID: \"42617262-5255-4f88-8afa-9d93a640090e\") " Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.035519 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-utilities\") pod \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\" (UID: \"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af\") " Apr 06 12:11:08 crc kubenswrapper[4790]: W0406 12:11:08.036258 4790 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/42617262-5255-4f88-8afa-9d93a640090e/volumes/kubernetes.io~empty-dir/catalog-content Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.036277 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42617262-5255-4f88-8afa-9d93a640090e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod 
"42617262-5255-4f88-8afa-9d93a640090e" (UID: "42617262-5255-4f88-8afa-9d93a640090e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.036813 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-utilities" (OuterVolumeSpecName: "utilities") pod "4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" (UID: "4ff0634b-ac6b-4c16-a72f-9bf81b54b1af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.039324 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-kube-api-access-l6664" (OuterVolumeSpecName: "kube-api-access-l6664") pod "4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" (UID: "4ff0634b-ac6b-4c16-a72f-9bf81b54b1af"). InnerVolumeSpecName "kube-api-access-l6664". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.111658 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mqhc6" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.137127 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6664\" (UniqueName: \"kubernetes.io/projected/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-kube-api-access-l6664\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.137184 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42617262-5255-4f88-8afa-9d93a640090e-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.137198 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.184980 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" (UID: "4ff0634b-ac6b-4c16-a72f-9bf81b54b1af"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.191952 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f4jjg"] Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.192235 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f4jjg" podUID="a1fd968f-f454-4241-afa5-e47623a01c84" containerName="registry-server" containerID="cri-o://ed76fa8292a4bfd378bd64b7ab5e25a405d3a61ea7279296bc6831f0cd8d8f85" gracePeriod=2 Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.238382 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-catalog-content\") pod \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\" (UID: \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\") " Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.238434 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g9vc\" (UniqueName: \"kubernetes.io/projected/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-kube-api-access-5g9vc\") pod \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\" (UID: \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\") " Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.238506 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-utilities\") pod \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\" (UID: \"1c342bf5-bb1e-43e0-a549-e73112fa4e4a\") " Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.238748 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.239477 4790 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-utilities" (OuterVolumeSpecName: "utilities") pod "1c342bf5-bb1e-43e0-a549-e73112fa4e4a" (UID: "1c342bf5-bb1e-43e0-a549-e73112fa4e4a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.243415 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-kube-api-access-5g9vc" (OuterVolumeSpecName: "kube-api-access-5g9vc") pod "1c342bf5-bb1e-43e0-a549-e73112fa4e4a" (UID: "1c342bf5-bb1e-43e0-a549-e73112fa4e4a"). InnerVolumeSpecName "kube-api-access-5g9vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.341777 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g9vc\" (UniqueName: \"kubernetes.io/projected/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-kube-api-access-5g9vc\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.341819 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.417266 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c342bf5-bb1e-43e0-a549-e73112fa4e4a" (UID: "1c342bf5-bb1e-43e0-a549-e73112fa4e4a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.446188 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c342bf5-bb1e-43e0-a549-e73112fa4e4a-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.450652 4790 generic.go:334] "Generic (PLEG): container finished" podID="1c342bf5-bb1e-43e0-a549-e73112fa4e4a" containerID="70be9a2aae72bd7f22a9bf4cdb5dca5bbd95375e28476ebf06842c0f226f06f8" exitCode=0 Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.450737 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqhc6" event={"ID":"1c342bf5-bb1e-43e0-a549-e73112fa4e4a","Type":"ContainerDied","Data":"70be9a2aae72bd7f22a9bf4cdb5dca5bbd95375e28476ebf06842c0f226f06f8"} Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.450777 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqhc6" event={"ID":"1c342bf5-bb1e-43e0-a549-e73112fa4e4a","Type":"ContainerDied","Data":"e9f2cf9e348cdd9f16e9e5a7839a90595e829a1fb1d8f3834c69b02fecd06e48"} Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.450801 4790 scope.go:117] "RemoveContainer" containerID="70be9a2aae72bd7f22a9bf4cdb5dca5bbd95375e28476ebf06842c0f226f06f8" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.450959 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mqhc6" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.465434 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmsvt" event={"ID":"4ff0634b-ac6b-4c16-a72f-9bf81b54b1af","Type":"ContainerDied","Data":"0f33157daa5b1e3311a4a2c0d397d9dc90a1d6e5e690a44245fef040eb219d63"} Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.465793 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xmsvt" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.482552 4790 generic.go:334] "Generic (PLEG): container finished" podID="a1fd968f-f454-4241-afa5-e47623a01c84" containerID="ed76fa8292a4bfd378bd64b7ab5e25a405d3a61ea7279296bc6831f0cd8d8f85" exitCode=0 Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.482605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4jjg" event={"ID":"a1fd968f-f454-4241-afa5-e47623a01c84","Type":"ContainerDied","Data":"ed76fa8292a4bfd378bd64b7ab5e25a405d3a61ea7279296bc6831f0cd8d8f85"} Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.485209 4790 scope.go:117] "RemoveContainer" containerID="806baa36d1c8e0b51c672da8230b56b688cae4e6007e2a02c0115150ded49133" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.495036 4790 generic.go:334] "Generic (PLEG): container finished" podID="4b74456e-c45b-4efb-9a0f-952b5663e994" containerID="3a8c47a32ab14cb6999a85ca163e0891030ac7d95bad9809fb174e0ea5756b27" exitCode=0 Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.495117 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" event={"ID":"4b74456e-c45b-4efb-9a0f-952b5663e994","Type":"ContainerDied","Data":"3a8c47a32ab14cb6999a85ca163e0891030ac7d95bad9809fb174e0ea5756b27"} Apr 06 12:11:08 crc kubenswrapper[4790]: 
I0406 12:11:08.499158 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wgd2w" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.504540 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgd2w" event={"ID":"42617262-5255-4f88-8afa-9d93a640090e","Type":"ContainerDied","Data":"52455297e83a5f3775ab0b0e267496bcee17a7de69fb9a36004bb4abbb4108f2"} Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.504612 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mqhc6"] Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.505628 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mqhc6"] Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.520762 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xmsvt"] Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.532296 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xmsvt"] Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.576761 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wgd2w"] Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.589734 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wgd2w"] Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.668058 4790 scope.go:117] "RemoveContainer" containerID="d14c4701541cc7180a487771a8dd572b80dec406bc9596f6c74318e89cb22d71" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.714512 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.731556 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.745205 4790 scope.go:117] "RemoveContainer" containerID="70be9a2aae72bd7f22a9bf4cdb5dca5bbd95375e28476ebf06842c0f226f06f8" Apr 06 12:11:08 crc kubenswrapper[4790]: E0406 12:11:08.755980 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70be9a2aae72bd7f22a9bf4cdb5dca5bbd95375e28476ebf06842c0f226f06f8\": container with ID starting with 70be9a2aae72bd7f22a9bf4cdb5dca5bbd95375e28476ebf06842c0f226f06f8 not found: ID does not exist" containerID="70be9a2aae72bd7f22a9bf4cdb5dca5bbd95375e28476ebf06842c0f226f06f8" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.756029 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70be9a2aae72bd7f22a9bf4cdb5dca5bbd95375e28476ebf06842c0f226f06f8"} err="failed to get container status \"70be9a2aae72bd7f22a9bf4cdb5dca5bbd95375e28476ebf06842c0f226f06f8\": rpc error: code = NotFound desc = could not find container \"70be9a2aae72bd7f22a9bf4cdb5dca5bbd95375e28476ebf06842c0f226f06f8\": container with ID starting with 70be9a2aae72bd7f22a9bf4cdb5dca5bbd95375e28476ebf06842c0f226f06f8 not found: ID does not exist" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.756060 4790 scope.go:117] "RemoveContainer" containerID="806baa36d1c8e0b51c672da8230b56b688cae4e6007e2a02c0115150ded49133" Apr 06 12:11:08 crc kubenswrapper[4790]: E0406 12:11:08.756947 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"806baa36d1c8e0b51c672da8230b56b688cae4e6007e2a02c0115150ded49133\": container with ID starting with 806baa36d1c8e0b51c672da8230b56b688cae4e6007e2a02c0115150ded49133 not found: ID does not exist" containerID="806baa36d1c8e0b51c672da8230b56b688cae4e6007e2a02c0115150ded49133" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.756975 
4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806baa36d1c8e0b51c672da8230b56b688cae4e6007e2a02c0115150ded49133"} err="failed to get container status \"806baa36d1c8e0b51c672da8230b56b688cae4e6007e2a02c0115150ded49133\": rpc error: code = NotFound desc = could not find container \"806baa36d1c8e0b51c672da8230b56b688cae4e6007e2a02c0115150ded49133\": container with ID starting with 806baa36d1c8e0b51c672da8230b56b688cae4e6007e2a02c0115150ded49133 not found: ID does not exist" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.756991 4790 scope.go:117] "RemoveContainer" containerID="d14c4701541cc7180a487771a8dd572b80dec406bc9596f6c74318e89cb22d71" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.757022 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:11:08 crc kubenswrapper[4790]: E0406 12:11:08.758247 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d14c4701541cc7180a487771a8dd572b80dec406bc9596f6c74318e89cb22d71\": container with ID starting with d14c4701541cc7180a487771a8dd572b80dec406bc9596f6c74318e89cb22d71 not found: ID does not exist" containerID="d14c4701541cc7180a487771a8dd572b80dec406bc9596f6c74318e89cb22d71" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.758303 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14c4701541cc7180a487771a8dd572b80dec406bc9596f6c74318e89cb22d71"} err="failed to get container status \"d14c4701541cc7180a487771a8dd572b80dec406bc9596f6c74318e89cb22d71\": rpc error: code = NotFound desc = could not find container \"d14c4701541cc7180a487771a8dd572b80dec406bc9596f6c74318e89cb22d71\": container with ID starting with d14c4701541cc7180a487771a8dd572b80dec406bc9596f6c74318e89cb22d71 not found: ID does not exist" Apr 06 12:11:08 crc 
kubenswrapper[4790]: I0406 12:11:08.758343 4790 scope.go:117] "RemoveContainer" containerID="aa862a4532cc1995501c17164ef8be74ee8e467212c363b9201b74ad77f2c941" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.815319 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxrr2"] Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.815732 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fxrr2" podUID="8de0f178-000e-4a30-84bc-c1f33e965d0a" containerName="registry-server" containerID="cri-o://a749a76de5f00f48cc87bfa94565b7aa2a10389e87d1bebe3396abf5645f07c2" gracePeriod=2 Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.841439 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.852305 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fd968f-f454-4241-afa5-e47623a01c84-utilities\") pod \"a1fd968f-f454-4241-afa5-e47623a01c84\" (UID: \"a1fd968f-f454-4241-afa5-e47623a01c84\") " Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.852399 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpwrt\" (UniqueName: \"kubernetes.io/projected/a1fd968f-f454-4241-afa5-e47623a01c84-kube-api-access-lpwrt\") pod \"a1fd968f-f454-4241-afa5-e47623a01c84\" (UID: \"a1fd968f-f454-4241-afa5-e47623a01c84\") " Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.852441 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fd968f-f454-4241-afa5-e47623a01c84-catalog-content\") pod \"a1fd968f-f454-4241-afa5-e47623a01c84\" (UID: \"a1fd968f-f454-4241-afa5-e47623a01c84\") " Apr 06 12:11:08 crc kubenswrapper[4790]: 
I0406 12:11:08.854055 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1fd968f-f454-4241-afa5-e47623a01c84-utilities" (OuterVolumeSpecName: "utilities") pod "a1fd968f-f454-4241-afa5-e47623a01c84" (UID: "a1fd968f-f454-4241-afa5-e47623a01c84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.862974 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1fd968f-f454-4241-afa5-e47623a01c84-kube-api-access-lpwrt" (OuterVolumeSpecName: "kube-api-access-lpwrt") pod "a1fd968f-f454-4241-afa5-e47623a01c84" (UID: "a1fd968f-f454-4241-afa5-e47623a01c84"). InnerVolumeSpecName "kube-api-access-lpwrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.870149 4790 scope.go:117] "RemoveContainer" containerID="7f7bd39f243316c503a0c53221edb8b168b6e99f503d937be1400c0cdacac95e" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.954199 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fd968f-f454-4241-afa5-e47623a01c84-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:08 crc kubenswrapper[4790]: I0406 12:11:08.954498 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpwrt\" (UniqueName: \"kubernetes.io/projected/a1fd968f-f454-4241-afa5-e47623a01c84-kube-api-access-lpwrt\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.002987 4790 scope.go:117] "RemoveContainer" containerID="d0bd7e699ebc34948bccd2003a638d96e1f8fffee6b4dc8a8637914b2a0aa227" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.009641 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1fd968f-f454-4241-afa5-e47623a01c84-catalog-content" (OuterVolumeSpecName: 
"catalog-content") pod "a1fd968f-f454-4241-afa5-e47623a01c84" (UID: "a1fd968f-f454-4241-afa5-e47623a01c84"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.056108 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fd968f-f454-4241-afa5-e47623a01c84-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.083736 4790 scope.go:117] "RemoveContainer" containerID="f39af18abc79914046ebf247535ef1eca86e1f9c512f4a348b434ae9a5d6995c" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.119510 4790 scope.go:117] "RemoveContainer" containerID="dd6d9624d08cf9e4b71e379501eb980d6b5eae20b3ab1465244989e466c7387f" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.208480 4790 scope.go:117] "RemoveContainer" containerID="f720b7549895e95d1e70544146d1972bcf0008de33562c766ca0656c1a1ce270" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.270467 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.270519 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.336350 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.463007 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de0f178-000e-4a30-84bc-c1f33e965d0a-utilities\") pod \"8de0f178-000e-4a30-84bc-c1f33e965d0a\" (UID: \"8de0f178-000e-4a30-84bc-c1f33e965d0a\") " Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.463062 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx6xv\" (UniqueName: \"kubernetes.io/projected/8de0f178-000e-4a30-84bc-c1f33e965d0a-kube-api-access-rx6xv\") pod \"8de0f178-000e-4a30-84bc-c1f33e965d0a\" (UID: \"8de0f178-000e-4a30-84bc-c1f33e965d0a\") " Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.463112 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de0f178-000e-4a30-84bc-c1f33e965d0a-catalog-content\") pod \"8de0f178-000e-4a30-84bc-c1f33e965d0a\" (UID: \"8de0f178-000e-4a30-84bc-c1f33e965d0a\") " Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.464953 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8de0f178-000e-4a30-84bc-c1f33e965d0a-utilities" (OuterVolumeSpecName: "utilities") pod "8de0f178-000e-4a30-84bc-c1f33e965d0a" (UID: "8de0f178-000e-4a30-84bc-c1f33e965d0a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.471288 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de0f178-000e-4a30-84bc-c1f33e965d0a-kube-api-access-rx6xv" (OuterVolumeSpecName: "kube-api-access-rx6xv") pod "8de0f178-000e-4a30-84bc-c1f33e965d0a" (UID: "8de0f178-000e-4a30-84bc-c1f33e965d0a"). InnerVolumeSpecName "kube-api-access-rx6xv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.521819 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4jjg" event={"ID":"a1fd968f-f454-4241-afa5-e47623a01c84","Type":"ContainerDied","Data":"9e00577f57f18d7b2285859341cb8b8383696f77832db39714919a94c6e34e32"} Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.521900 4790 scope.go:117] "RemoveContainer" containerID="ed76fa8292a4bfd378bd64b7ab5e25a405d3a61ea7279296bc6831f0cd8d8f85" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.521972 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f4jjg" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.526198 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" event={"ID":"4b74456e-c45b-4efb-9a0f-952b5663e994","Type":"ContainerStarted","Data":"b01e0d5e3a92e575caa8457a3b271166fb33a802847c7c83575d2212c9866d09"} Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.555367 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" podStartSLOduration=4.103292073 podStartE2EDuration="6.555353884s" podCreationTimestamp="2026-04-06 12:11:03 +0000 UTC" firstStartedPulling="2026-04-06 12:11:05.220684484 +0000 UTC m=+844.208427360" lastFinishedPulling="2026-04-06 12:11:07.672746305 +0000 UTC m=+846.660489171" observedRunningTime="2026-04-06 12:11:09.551653433 +0000 UTC m=+848.539396299" watchObservedRunningTime="2026-04-06 12:11:09.555353884 +0000 UTC m=+848.543096750" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.558264 4790 generic.go:334] "Generic (PLEG): container finished" podID="8de0f178-000e-4a30-84bc-c1f33e965d0a" 
containerID="a749a76de5f00f48cc87bfa94565b7aa2a10389e87d1bebe3396abf5645f07c2" exitCode=0 Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.559103 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxrr2" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.559267 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxrr2" event={"ID":"8de0f178-000e-4a30-84bc-c1f33e965d0a","Type":"ContainerDied","Data":"a749a76de5f00f48cc87bfa94565b7aa2a10389e87d1bebe3396abf5645f07c2"} Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.559303 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxrr2" event={"ID":"8de0f178-000e-4a30-84bc-c1f33e965d0a","Type":"ContainerDied","Data":"66e39c64c1fbdecd39f940a17ea09357c6739fee18a0b6387fff9356ff3a2e56"} Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.572157 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f4jjg"] Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.573143 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de0f178-000e-4a30-84bc-c1f33e965d0a-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.573211 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx6xv\" (UniqueName: \"kubernetes.io/projected/8de0f178-000e-4a30-84bc-c1f33e965d0a-kube-api-access-rx6xv\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.580868 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f4jjg"] Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.592817 4790 scope.go:117] "RemoveContainer" containerID="8b422088fa7a70fbb8cfff5df3f5c0d80243ce4c3e0ce85d9048f0496a571d75" Apr 06 12:11:09 
crc kubenswrapper[4790]: I0406 12:11:09.646418 4790 scope.go:117] "RemoveContainer" containerID="cfdd6404e53527425721b0bab9e3aa4b2d876ea8b8d45fbf9775f85791c8660f" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.656573 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.659682 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8de0f178-000e-4a30-84bc-c1f33e965d0a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8de0f178-000e-4a30-84bc-c1f33e965d0a" (UID: "8de0f178-000e-4a30-84bc-c1f33e965d0a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.675028 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de0f178-000e-4a30-84bc-c1f33e965d0a-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.686433 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c342bf5-bb1e-43e0-a549-e73112fa4e4a" path="/var/lib/kubelet/pods/1c342bf5-bb1e-43e0-a549-e73112fa4e4a/volumes" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.687264 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42617262-5255-4f88-8afa-9d93a640090e" path="/var/lib/kubelet/pods/42617262-5255-4f88-8afa-9d93a640090e/volumes" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.687986 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" path="/var/lib/kubelet/pods/4ff0634b-ac6b-4c16-a72f-9bf81b54b1af/volumes" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.689568 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1fd968f-f454-4241-afa5-e47623a01c84" 
path="/var/lib/kubelet/pods/a1fd968f-f454-4241-afa5-e47623a01c84/volumes" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.692710 4790 scope.go:117] "RemoveContainer" containerID="a749a76de5f00f48cc87bfa94565b7aa2a10389e87d1bebe3396abf5645f07c2" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.729044 4790 scope.go:117] "RemoveContainer" containerID="e8bdf3daef6afbc5bef1131711696c9ba59c2e54d0f13d69da3886905c7ee9a1" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.756177 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.756238 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.777379 4790 scope.go:117] "RemoveContainer" containerID="c940a86829b63ebced7196fbe7153622f52318f5cf928aa187bf379e4f020799" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.823876 4790 scope.go:117] "RemoveContainer" containerID="a749a76de5f00f48cc87bfa94565b7aa2a10389e87d1bebe3396abf5645f07c2" Apr 06 12:11:09 crc kubenswrapper[4790]: E0406 12:11:09.824282 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a749a76de5f00f48cc87bfa94565b7aa2a10389e87d1bebe3396abf5645f07c2\": container with ID starting with a749a76de5f00f48cc87bfa94565b7aa2a10389e87d1bebe3396abf5645f07c2 not found: ID does not exist" 
containerID="a749a76de5f00f48cc87bfa94565b7aa2a10389e87d1bebe3396abf5645f07c2" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.824323 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a749a76de5f00f48cc87bfa94565b7aa2a10389e87d1bebe3396abf5645f07c2"} err="failed to get container status \"a749a76de5f00f48cc87bfa94565b7aa2a10389e87d1bebe3396abf5645f07c2\": rpc error: code = NotFound desc = could not find container \"a749a76de5f00f48cc87bfa94565b7aa2a10389e87d1bebe3396abf5645f07c2\": container with ID starting with a749a76de5f00f48cc87bfa94565b7aa2a10389e87d1bebe3396abf5645f07c2 not found: ID does not exist" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.824351 4790 scope.go:117] "RemoveContainer" containerID="e8bdf3daef6afbc5bef1131711696c9ba59c2e54d0f13d69da3886905c7ee9a1" Apr 06 12:11:09 crc kubenswrapper[4790]: E0406 12:11:09.824633 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8bdf3daef6afbc5bef1131711696c9ba59c2e54d0f13d69da3886905c7ee9a1\": container with ID starting with e8bdf3daef6afbc5bef1131711696c9ba59c2e54d0f13d69da3886905c7ee9a1 not found: ID does not exist" containerID="e8bdf3daef6afbc5bef1131711696c9ba59c2e54d0f13d69da3886905c7ee9a1" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.824664 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8bdf3daef6afbc5bef1131711696c9ba59c2e54d0f13d69da3886905c7ee9a1"} err="failed to get container status \"e8bdf3daef6afbc5bef1131711696c9ba59c2e54d0f13d69da3886905c7ee9a1\": rpc error: code = NotFound desc = could not find container \"e8bdf3daef6afbc5bef1131711696c9ba59c2e54d0f13d69da3886905c7ee9a1\": container with ID starting with e8bdf3daef6afbc5bef1131711696c9ba59c2e54d0f13d69da3886905c7ee9a1 not found: ID does not exist" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.824686 4790 scope.go:117] 
"RemoveContainer" containerID="c940a86829b63ebced7196fbe7153622f52318f5cf928aa187bf379e4f020799" Apr 06 12:11:09 crc kubenswrapper[4790]: E0406 12:11:09.825059 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c940a86829b63ebced7196fbe7153622f52318f5cf928aa187bf379e4f020799\": container with ID starting with c940a86829b63ebced7196fbe7153622f52318f5cf928aa187bf379e4f020799 not found: ID does not exist" containerID="c940a86829b63ebced7196fbe7153622f52318f5cf928aa187bf379e4f020799" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.825087 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c940a86829b63ebced7196fbe7153622f52318f5cf928aa187bf379e4f020799"} err="failed to get container status \"c940a86829b63ebced7196fbe7153622f52318f5cf928aa187bf379e4f020799\": rpc error: code = NotFound desc = could not find container \"c940a86829b63ebced7196fbe7153622f52318f5cf928aa187bf379e4f020799\": container with ID starting with c940a86829b63ebced7196fbe7153622f52318f5cf928aa187bf379e4f020799 not found: ID does not exist" Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.880094 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxrr2"] Apr 06 12:11:09 crc kubenswrapper[4790]: I0406 12:11:09.886165 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fxrr2"] Apr 06 12:11:10 crc kubenswrapper[4790]: I0406 12:11:10.576530 4790 generic.go:334] "Generic (PLEG): container finished" podID="4b74456e-c45b-4efb-9a0f-952b5663e994" containerID="b01e0d5e3a92e575caa8457a3b271166fb33a802847c7c83575d2212c9866d09" exitCode=0 Apr 06 12:11:10 crc kubenswrapper[4790]: I0406 12:11:10.576600 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" 
event={"ID":"4b74456e-c45b-4efb-9a0f-952b5663e994","Type":"ContainerDied","Data":"b01e0d5e3a92e575caa8457a3b271166fb33a802847c7c83575d2212c9866d09"} Apr 06 12:11:11 crc kubenswrapper[4790]: I0406 12:11:11.601186 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kg8bf"] Apr 06 12:11:11 crc kubenswrapper[4790]: I0406 12:11:11.601463 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kg8bf" podUID="1a9985d3-5261-42f1-9db8-30fd5b560d36" containerName="registry-server" containerID="cri-o://5d42f9dec71478bce4430d5443ff85c6d1f88e6579dd23369a4e1785e7048415" gracePeriod=2 Apr 06 12:11:11 crc kubenswrapper[4790]: I0406 12:11:11.700059 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de0f178-000e-4a30-84bc-c1f33e965d0a" path="/var/lib/kubelet/pods/8de0f178-000e-4a30-84bc-c1f33e965d0a/volumes" Apr 06 12:11:11 crc kubenswrapper[4790]: I0406 12:11:11.926766 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:11 crc kubenswrapper[4790]: I0406 12:11:11.926844 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:12 crc kubenswrapper[4790]: I0406 12:11:12.002801 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:12 crc kubenswrapper[4790]: I0406 12:11:12.189805 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rhjnx"] Apr 06 12:11:12 crc kubenswrapper[4790]: I0406 12:11:12.190284 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rhjnx" podUID="c3707401-54b1-47be-9b15-87b46677513b" containerName="registry-server" 
containerID="cri-o://4b111a2ac085caea8e862b861a7162a335216a7c94fa17dd5547e07bd681f9e0" gracePeriod=2 Apr 06 12:11:12 crc kubenswrapper[4790]: E0406 12:11:12.307264 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3707401_54b1_47be_9b15_87b46677513b.slice/crio-conmon-4b111a2ac085caea8e862b861a7162a335216a7c94fa17dd5547e07bd681f9e0.scope\": RecentStats: unable to find data in memory cache]" Apr 06 12:11:12 crc kubenswrapper[4790]: I0406 12:11:12.386958 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m8wfd"] Apr 06 12:11:12 crc kubenswrapper[4790]: I0406 12:11:12.610203 4790 generic.go:334] "Generic (PLEG): container finished" podID="1a9985d3-5261-42f1-9db8-30fd5b560d36" containerID="5d42f9dec71478bce4430d5443ff85c6d1f88e6579dd23369a4e1785e7048415" exitCode=0 Apr 06 12:11:12 crc kubenswrapper[4790]: I0406 12:11:12.610269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kg8bf" event={"ID":"1a9985d3-5261-42f1-9db8-30fd5b560d36","Type":"ContainerDied","Data":"5d42f9dec71478bce4430d5443ff85c6d1f88e6579dd23369a4e1785e7048415"} Apr 06 12:11:12 crc kubenswrapper[4790]: I0406 12:11:12.612545 4790 generic.go:334] "Generic (PLEG): container finished" podID="c3707401-54b1-47be-9b15-87b46677513b" containerID="4b111a2ac085caea8e862b861a7162a335216a7c94fa17dd5547e07bd681f9e0" exitCode=0 Apr 06 12:11:12 crc kubenswrapper[4790]: I0406 12:11:12.612582 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhjnx" event={"ID":"c3707401-54b1-47be-9b15-87b46677513b","Type":"ContainerDied","Data":"4b111a2ac085caea8e862b861a7162a335216a7c94fa17dd5547e07bd681f9e0"} Apr 06 12:11:12 crc kubenswrapper[4790]: I0406 12:11:12.612944 4790 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-m8wfd" podUID="58652e08-5d4e-4a76-b00d-79b1934afeb2" containerName="registry-server" containerID="cri-o://4423aa43e05909e7bb3fc4ac4401ea170db5692005b2f5bd91e286f0925e45a0" gracePeriod=2 Apr 06 12:11:12 crc kubenswrapper[4790]: I0406 12:11:12.664857 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:13 crc kubenswrapper[4790]: I0406 12:11:13.629380 4790 generic.go:334] "Generic (PLEG): container finished" podID="58652e08-5d4e-4a76-b00d-79b1934afeb2" containerID="4423aa43e05909e7bb3fc4ac4401ea170db5692005b2f5bd91e286f0925e45a0" exitCode=0 Apr 06 12:11:13 crc kubenswrapper[4790]: I0406 12:11:13.629477 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m8wfd" event={"ID":"58652e08-5d4e-4a76-b00d-79b1934afeb2","Type":"ContainerDied","Data":"4423aa43e05909e7bb3fc4ac4401ea170db5692005b2f5bd91e286f0925e45a0"} Apr 06 12:11:14 crc kubenswrapper[4790]: E0406 12:11:14.121183 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5d42f9dec71478bce4430d5443ff85c6d1f88e6579dd23369a4e1785e7048415 is running failed: container process not found" containerID="5d42f9dec71478bce4430d5443ff85c6d1f88e6579dd23369a4e1785e7048415" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:11:14 crc kubenswrapper[4790]: E0406 12:11:14.122999 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5d42f9dec71478bce4430d5443ff85c6d1f88e6579dd23369a4e1785e7048415 is running failed: container process not found" containerID="5d42f9dec71478bce4430d5443ff85c6d1f88e6579dd23369a4e1785e7048415" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:11:14 crc kubenswrapper[4790]: E0406 12:11:14.123279 4790 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5d42f9dec71478bce4430d5443ff85c6d1f88e6579dd23369a4e1785e7048415 is running failed: container process not found" containerID="5d42f9dec71478bce4430d5443ff85c6d1f88e6579dd23369a4e1785e7048415" cmd=["grpc_health_probe","-addr=:50051"] Apr 06 12:11:14 crc kubenswrapper[4790]: E0406 12:11:14.123314 4790 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5d42f9dec71478bce4430d5443ff85c6d1f88e6579dd23369a4e1785e7048415 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-kg8bf" podUID="1a9985d3-5261-42f1-9db8-30fd5b560d36" containerName="registry-server" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.166480 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.266797 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.266890 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.271106 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xq4pq\" (UniqueName: \"kubernetes.io/projected/4b74456e-c45b-4efb-9a0f-952b5663e994-kube-api-access-xq4pq\") pod \"4b74456e-c45b-4efb-9a0f-952b5663e994\" (UID: \"4b74456e-c45b-4efb-9a0f-952b5663e994\") " Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.271233 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b74456e-c45b-4efb-9a0f-952b5663e994-bundle\") pod \"4b74456e-c45b-4efb-9a0f-952b5663e994\" (UID: \"4b74456e-c45b-4efb-9a0f-952b5663e994\") " Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.272937 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b74456e-c45b-4efb-9a0f-952b5663e994-util\") pod \"4b74456e-c45b-4efb-9a0f-952b5663e994\" (UID: \"4b74456e-c45b-4efb-9a0f-952b5663e994\") " Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.274066 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b74456e-c45b-4efb-9a0f-952b5663e994-bundle" (OuterVolumeSpecName: "bundle") pod "4b74456e-c45b-4efb-9a0f-952b5663e994" (UID: "4b74456e-c45b-4efb-9a0f-952b5663e994"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.277803 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b74456e-c45b-4efb-9a0f-952b5663e994-kube-api-access-xq4pq" (OuterVolumeSpecName: "kube-api-access-xq4pq") pod "4b74456e-c45b-4efb-9a0f-952b5663e994" (UID: "4b74456e-c45b-4efb-9a0f-952b5663e994"). InnerVolumeSpecName "kube-api-access-xq4pq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.285252 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b74456e-c45b-4efb-9a0f-952b5663e994-util" (OuterVolumeSpecName: "util") pod "4b74456e-c45b-4efb-9a0f-952b5663e994" (UID: "4b74456e-c45b-4efb-9a0f-952b5663e994"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.374055 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b74456e-c45b-4efb-9a0f-952b5663e994-util\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.374094 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq4pq\" (UniqueName: \"kubernetes.io/projected/4b74456e-c45b-4efb-9a0f-952b5663e994-kube-api-access-xq4pq\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.374109 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b74456e-c45b-4efb-9a0f-952b5663e994-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.637537 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" event={"ID":"4b74456e-c45b-4efb-9a0f-952b5663e994","Type":"ContainerDied","Data":"46c4dcaf5fa2cf7967f04cd94bdb8fd5645f0b9e74303a873ebc06f16534ed55"} Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.637575 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46c4dcaf5fa2cf7967f04cd94bdb8fd5645f0b9e74303a873ebc06f16534ed55" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.637631 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.787199 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5kbct"] Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.787409 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5kbct" podUID="1aae3cb7-5ff6-4400-a80e-42f87496da8b" containerName="registry-server" containerID="cri-o://2c5aad2da3353e51f42798337c3b0090be7f6a81b4593e1ae98befc8e49a3829" gracePeriod=2 Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.908946 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.913880 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rhjnx" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.983556 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsv4r\" (UniqueName: \"kubernetes.io/projected/1a9985d3-5261-42f1-9db8-30fd5b560d36-kube-api-access-tsv4r\") pod \"1a9985d3-5261-42f1-9db8-30fd5b560d36\" (UID: \"1a9985d3-5261-42f1-9db8-30fd5b560d36\") " Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.983610 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-988gl\" (UniqueName: \"kubernetes.io/projected/c3707401-54b1-47be-9b15-87b46677513b-kube-api-access-988gl\") pod \"c3707401-54b1-47be-9b15-87b46677513b\" (UID: \"c3707401-54b1-47be-9b15-87b46677513b\") " Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.983675 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1a9985d3-5261-42f1-9db8-30fd5b560d36-catalog-content\") pod \"1a9985d3-5261-42f1-9db8-30fd5b560d36\" (UID: \"1a9985d3-5261-42f1-9db8-30fd5b560d36\") " Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.983716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9985d3-5261-42f1-9db8-30fd5b560d36-utilities\") pod \"1a9985d3-5261-42f1-9db8-30fd5b560d36\" (UID: \"1a9985d3-5261-42f1-9db8-30fd5b560d36\") " Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.983732 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3707401-54b1-47be-9b15-87b46677513b-catalog-content\") pod \"c3707401-54b1-47be-9b15-87b46677513b\" (UID: \"c3707401-54b1-47be-9b15-87b46677513b\") " Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.983758 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3707401-54b1-47be-9b15-87b46677513b-utilities\") pod \"c3707401-54b1-47be-9b15-87b46677513b\" (UID: \"c3707401-54b1-47be-9b15-87b46677513b\") " Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.984369 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a9985d3-5261-42f1-9db8-30fd5b560d36-utilities" (OuterVolumeSpecName: "utilities") pod "1a9985d3-5261-42f1-9db8-30fd5b560d36" (UID: "1a9985d3-5261-42f1-9db8-30fd5b560d36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.985888 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3707401-54b1-47be-9b15-87b46677513b-utilities" (OuterVolumeSpecName: "utilities") pod "c3707401-54b1-47be-9b15-87b46677513b" (UID: "c3707401-54b1-47be-9b15-87b46677513b"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:14 crc kubenswrapper[4790]: I0406 12:11:14.990144 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3707401-54b1-47be-9b15-87b46677513b-kube-api-access-988gl" (OuterVolumeSpecName: "kube-api-access-988gl") pod "c3707401-54b1-47be-9b15-87b46677513b" (UID: "c3707401-54b1-47be-9b15-87b46677513b"). InnerVolumeSpecName "kube-api-access-988gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.002210 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a9985d3-5261-42f1-9db8-30fd5b560d36-kube-api-access-tsv4r" (OuterVolumeSpecName: "kube-api-access-tsv4r") pod "1a9985d3-5261-42f1-9db8-30fd5b560d36" (UID: "1a9985d3-5261-42f1-9db8-30fd5b560d36"). InnerVolumeSpecName "kube-api-access-tsv4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.044699 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a9985d3-5261-42f1-9db8-30fd5b560d36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a9985d3-5261-42f1-9db8-30fd5b560d36" (UID: "1a9985d3-5261-42f1-9db8-30fd5b560d36"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.051651 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3707401-54b1-47be-9b15-87b46677513b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3707401-54b1-47be-9b15-87b46677513b" (UID: "c3707401-54b1-47be-9b15-87b46677513b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.085495 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsv4r\" (UniqueName: \"kubernetes.io/projected/1a9985d3-5261-42f1-9db8-30fd5b560d36-kube-api-access-tsv4r\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.085528 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-988gl\" (UniqueName: \"kubernetes.io/projected/c3707401-54b1-47be-9b15-87b46677513b-kube-api-access-988gl\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.085539 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9985d3-5261-42f1-9db8-30fd5b560d36-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.085549 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9985d3-5261-42f1-9db8-30fd5b560d36-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.085559 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3707401-54b1-47be-9b15-87b46677513b-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.085566 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3707401-54b1-47be-9b15-87b46677513b-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.652168 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhjnx" event={"ID":"c3707401-54b1-47be-9b15-87b46677513b","Type":"ContainerDied","Data":"f0daeab29f1e8eb97e3ac45c46968095795fefb7ce7b054273a150fd66af355b"} 
Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.652234 4790 scope.go:117] "RemoveContainer" containerID="4b111a2ac085caea8e862b861a7162a335216a7c94fa17dd5547e07bd681f9e0" Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.652432 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rhjnx" Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.660938 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kg8bf" event={"ID":"1a9985d3-5261-42f1-9db8-30fd5b560d36","Type":"ContainerDied","Data":"277b6635b4130ba8e73d546bc4ca84d0920a25205b6109d5e5a329d6123c2bf4"} Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.661119 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kg8bf" Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.666970 4790 generic.go:334] "Generic (PLEG): container finished" podID="1aae3cb7-5ff6-4400-a80e-42f87496da8b" containerID="2c5aad2da3353e51f42798337c3b0090be7f6a81b4593e1ae98befc8e49a3829" exitCode=0 Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.667027 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kbct" event={"ID":"1aae3cb7-5ff6-4400-a80e-42f87496da8b","Type":"ContainerDied","Data":"2c5aad2da3353e51f42798337c3b0090be7f6a81b4593e1ae98befc8e49a3829"} Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.693003 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rhjnx"] Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.700779 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rhjnx"] Apr 06 12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.715670 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kg8bf"] Apr 06 
12:11:15 crc kubenswrapper[4790]: I0406 12:11:15.717366 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kg8bf"] Apr 06 12:11:17 crc kubenswrapper[4790]: I0406 12:11:17.682276 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a9985d3-5261-42f1-9db8-30fd5b560d36" path="/var/lib/kubelet/pods/1a9985d3-5261-42f1-9db8-30fd5b560d36/volumes" Apr 06 12:11:17 crc kubenswrapper[4790]: I0406 12:11:17.683234 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3707401-54b1-47be-9b15-87b46677513b" path="/var/lib/kubelet/pods/c3707401-54b1-47be-9b15-87b46677513b/volumes" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.358604 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.371677 4790 scope.go:117] "RemoveContainer" containerID="aa8e309426fc38558f3e045326e3118615c10de3164040c1c9c0ee7ea1732f38" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.429429 4790 scope.go:117] "RemoveContainer" containerID="fab6cd1c88d0a2e5bba4078a9eb43dcbb85be65923975800b4c3b3b190ae5696" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.431958 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58652e08-5d4e-4a76-b00d-79b1934afeb2-catalog-content\") pod \"58652e08-5d4e-4a76-b00d-79b1934afeb2\" (UID: \"58652e08-5d4e-4a76-b00d-79b1934afeb2\") " Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.432013 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmsz4\" (UniqueName: \"kubernetes.io/projected/58652e08-5d4e-4a76-b00d-79b1934afeb2-kube-api-access-xmsz4\") pod \"58652e08-5d4e-4a76-b00d-79b1934afeb2\" (UID: \"58652e08-5d4e-4a76-b00d-79b1934afeb2\") " Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 
12:11:18.432085 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58652e08-5d4e-4a76-b00d-79b1934afeb2-utilities\") pod \"58652e08-5d4e-4a76-b00d-79b1934afeb2\" (UID: \"58652e08-5d4e-4a76-b00d-79b1934afeb2\") " Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.434211 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58652e08-5d4e-4a76-b00d-79b1934afeb2-utilities" (OuterVolumeSpecName: "utilities") pod "58652e08-5d4e-4a76-b00d-79b1934afeb2" (UID: "58652e08-5d4e-4a76-b00d-79b1934afeb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.437971 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58652e08-5d4e-4a76-b00d-79b1934afeb2-kube-api-access-xmsz4" (OuterVolumeSpecName: "kube-api-access-xmsz4") pod "58652e08-5d4e-4a76-b00d-79b1934afeb2" (UID: "58652e08-5d4e-4a76-b00d-79b1934afeb2"). InnerVolumeSpecName "kube-api-access-xmsz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.481277 4790 scope.go:117] "RemoveContainer" containerID="5d42f9dec71478bce4430d5443ff85c6d1f88e6579dd23369a4e1785e7048415" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.510378 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58652e08-5d4e-4a76-b00d-79b1934afeb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58652e08-5d4e-4a76-b00d-79b1934afeb2" (UID: "58652e08-5d4e-4a76-b00d-79b1934afeb2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.533969 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58652e08-5d4e-4a76-b00d-79b1934afeb2-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.533995 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58652e08-5d4e-4a76-b00d-79b1934afeb2-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.534005 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmsz4\" (UniqueName: \"kubernetes.io/projected/58652e08-5d4e-4a76-b00d-79b1934afeb2-kube-api-access-xmsz4\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.550641 4790 scope.go:117] "RemoveContainer" containerID="2351d42bea4af5ea2f26e91ad01e35e1213a498fb9755a1fb65daf6e7b2dd4cd" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.553003 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.634968 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aae3cb7-5ff6-4400-a80e-42f87496da8b-catalog-content\") pod \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\" (UID: \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\") " Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.635108 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aae3cb7-5ff6-4400-a80e-42f87496da8b-utilities\") pod \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\" (UID: \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\") " Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.635172 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56cxp\" (UniqueName: \"kubernetes.io/projected/1aae3cb7-5ff6-4400-a80e-42f87496da8b-kube-api-access-56cxp\") pod \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\" (UID: \"1aae3cb7-5ff6-4400-a80e-42f87496da8b\") " Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.635866 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aae3cb7-5ff6-4400-a80e-42f87496da8b-utilities" (OuterVolumeSpecName: "utilities") pod "1aae3cb7-5ff6-4400-a80e-42f87496da8b" (UID: "1aae3cb7-5ff6-4400-a80e-42f87496da8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.645001 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aae3cb7-5ff6-4400-a80e-42f87496da8b-kube-api-access-56cxp" (OuterVolumeSpecName: "kube-api-access-56cxp") pod "1aae3cb7-5ff6-4400-a80e-42f87496da8b" (UID: "1aae3cb7-5ff6-4400-a80e-42f87496da8b"). InnerVolumeSpecName "kube-api-access-56cxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.665364 4790 scope.go:117] "RemoveContainer" containerID="ce1be94afdbfbe8d1b03c4c141a5c9bde276174c78ecea964eda5ec37736f3a5" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.715031 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aae3cb7-5ff6-4400-a80e-42f87496da8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1aae3cb7-5ff6-4400-a80e-42f87496da8b" (UID: "1aae3cb7-5ff6-4400-a80e-42f87496da8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.731700 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw" event={"ID":"ea4918c9-2a05-4c75-9d68-662e0a0fc175","Type":"ContainerStarted","Data":"a17e059174f79b159e4af6c74112167e52ace4278fc2ce086545e7b6c403c4bd"} Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.750601 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aae3cb7-5ff6-4400-a80e-42f87496da8b-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.750932 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56cxp\" (UniqueName: \"kubernetes.io/projected/1aae3cb7-5ff6-4400-a80e-42f87496da8b-kube-api-access-56cxp\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.750944 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aae3cb7-5ff6-4400-a80e-42f87496da8b-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.770584 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw" podStartSLOduration=1.87008522 podStartE2EDuration="13.770561063s" podCreationTimestamp="2026-04-06 12:11:05 +0000 UTC" firstStartedPulling="2026-04-06 12:11:06.494752127 +0000 UTC m=+845.482494993" lastFinishedPulling="2026-04-06 12:11:18.39522796 +0000 UTC m=+857.382970836" observedRunningTime="2026-04-06 12:11:18.765517706 +0000 UTC m=+857.753260592" watchObservedRunningTime="2026-04-06 12:11:18.770561063 +0000 UTC m=+857.758303929" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.772413 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m8wfd" event={"ID":"58652e08-5d4e-4a76-b00d-79b1934afeb2","Type":"ContainerDied","Data":"9e6fb38775f2b2ea08133adb53d98eb34eaff20e38bb6ec7639c18813c1ce6ec"} Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.772466 4790 scope.go:117] "RemoveContainer" containerID="4423aa43e05909e7bb3fc4ac4401ea170db5692005b2f5bd91e286f0925e45a0" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.772601 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m8wfd" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.811949 4790 scope.go:117] "RemoveContainer" containerID="ef2c129f0eb4cbfb416bbc994b4d9b271f4c6a634c68f683766ecaa34b04938f" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.812014 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m8wfd"] Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.812788 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-74445bf4b8-spbpr" event={"ID":"0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686","Type":"ContainerStarted","Data":"97a6955e8e26f9225ce86661c63d4612cd9e4c59b007102fde3605a82c04c380"} Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.813845 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-74445bf4b8-spbpr" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.816313 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m8wfd"] Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.822845 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kbct" event={"ID":"1aae3cb7-5ff6-4400-a80e-42f87496da8b","Type":"ContainerDied","Data":"aa4e31fd2f89394cd37441d869b34e50d813bba8a7f73336339ad9d5c8cbabb6"} Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.822974 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5kbct" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.841530 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-74445bf4b8-spbpr" podStartSLOduration=2.425638404 podStartE2EDuration="13.841508886s" podCreationTimestamp="2026-04-06 12:11:05 +0000 UTC" firstStartedPulling="2026-04-06 12:11:06.97980121 +0000 UTC m=+845.967544066" lastFinishedPulling="2026-04-06 12:11:18.395671682 +0000 UTC m=+857.383414548" observedRunningTime="2026-04-06 12:11:18.831950096 +0000 UTC m=+857.819692962" watchObservedRunningTime="2026-04-06 12:11:18.841508886 +0000 UTC m=+857.829251752" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.847153 4790 scope.go:117] "RemoveContainer" containerID="2374825ead498d23cc2caf030f89c1dc980b99577763f94ec5e83c236d2c0e8f" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.858983 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-86dff4bf76-bm9zd" podStartSLOduration=1.617353286 podStartE2EDuration="13.858964051s" podCreationTimestamp="2026-04-06 12:11:05 +0000 UTC" firstStartedPulling="2026-04-06 12:11:06.153763969 +0000 UTC m=+845.141506835" lastFinishedPulling="2026-04-06 12:11:18.395374744 +0000 UTC m=+857.383117600" observedRunningTime="2026-04-06 12:11:18.855048815 +0000 UTC m=+857.842791681" watchObservedRunningTime="2026-04-06 12:11:18.858964051 +0000 UTC m=+857.846706917" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.874379 4790 scope.go:117] "RemoveContainer" containerID="2c5aad2da3353e51f42798337c3b0090be7f6a81b4593e1ae98befc8e49a3829" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.893954 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5kbct"] Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.900670 4790 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-5kbct"] Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.903126 4790 scope.go:117] "RemoveContainer" containerID="0c4b252c65a7fc09f5c2e6d5d207255192d49ca8cbf632de76aed097cfd98e67" Apr 06 12:11:18 crc kubenswrapper[4790]: I0406 12:11:18.916548 4790 scope.go:117] "RemoveContainer" containerID="bb1bbf5eb0dd117774b710488e1ef80abb88f2e79f5029a571a28f1fc9c49a79" Apr 06 12:11:19 crc kubenswrapper[4790]: I0406 12:11:19.267723 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Apr 06 12:11:19 crc kubenswrapper[4790]: I0406 12:11:19.267787 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" Apr 06 12:11:19 crc kubenswrapper[4790]: I0406 12:11:19.689375 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aae3cb7-5ff6-4400-a80e-42f87496da8b" path="/var/lib/kubelet/pods/1aae3cb7-5ff6-4400-a80e-42f87496da8b/volumes" Apr 06 12:11:19 crc kubenswrapper[4790]: I0406 12:11:19.690203 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58652e08-5d4e-4a76-b00d-79b1934afeb2" path="/var/lib/kubelet/pods/58652e08-5d4e-4a76-b00d-79b1934afeb2/volumes" Apr 06 12:11:19 crc kubenswrapper[4790]: I0406 12:11:19.834786 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-pw54j" 
event={"ID":"06f8ee69-3814-40ed-8ed2-5913509658de","Type":"ContainerStarted","Data":"b3fb342e0e7b15fd5c2620c011ce238f15dea3bfec1dff3d36d6a70bcae9090f"} Apr 06 12:11:19 crc kubenswrapper[4790]: I0406 12:11:19.836957 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-dd944d769-7fscl" event={"ID":"d990eb66-396e-4b05-acab-eaa30a6fbd34","Type":"ContainerStarted","Data":"67eef5390bbb70652db97cbd2b0c0541e65a529bbf22f5abb8f3a201679c68af"} Apr 06 12:11:19 crc kubenswrapper[4790]: I0406 12:11:19.837146 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-dd944d769-7fscl" Apr 06 12:11:19 crc kubenswrapper[4790]: I0406 12:11:19.839288 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-dd944d769-7fscl" Apr 06 12:11:19 crc kubenswrapper[4790]: I0406 12:11:19.840891 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-86dff4bf76-bm9zd" event={"ID":"ae41d528-6f4b-45e5-84f2-5d9eae998759","Type":"ContainerStarted","Data":"1db2b64fdcbdee27415c8c7e0cf9057b641d223979d56ac6603f10d779aeafe2"} Apr 06 12:11:19 crc kubenswrapper[4790]: I0406 12:11:19.855362 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-795cd6b797-pw54j" podStartSLOduration=2.989105302 podStartE2EDuration="14.855342902s" podCreationTimestamp="2026-04-06 12:11:05 +0000 UTC" firstStartedPulling="2026-04-06 12:11:06.585134629 +0000 UTC m=+845.572877495" lastFinishedPulling="2026-04-06 12:11:18.451372209 +0000 UTC m=+857.439115095" observedRunningTime="2026-04-06 12:11:19.853592544 +0000 UTC m=+858.841335410" watchObservedRunningTime="2026-04-06 12:11:19.855342902 +0000 UTC m=+858.843085768" Apr 06 12:11:19 crc kubenswrapper[4790]: I0406 12:11:19.873203 4790 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-operators/observability-operator-dd944d769-7fscl" podStartSLOduration=3.122144316 podStartE2EDuration="14.873186338s" podCreationTimestamp="2026-04-06 12:11:05 +0000 UTC" firstStartedPulling="2026-04-06 12:11:06.71145662 +0000 UTC m=+845.699199476" lastFinishedPulling="2026-04-06 12:11:18.462498632 +0000 UTC m=+857.450241498" observedRunningTime="2026-04-06 12:11:19.871149682 +0000 UTC m=+858.858892548" watchObservedRunningTime="2026-04-06 12:11:19.873186338 +0000 UTC m=+858.860929204" Apr 06 12:11:24 crc kubenswrapper[4790]: I0406 12:11:24.267351 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Apr 06 12:11:24 crc kubenswrapper[4790]: I0406 12:11:24.268198 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" Apr 06 12:11:26 crc kubenswrapper[4790]: I0406 12:11:26.272133 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-74445bf4b8-spbpr" Apr 06 12:11:29 crc kubenswrapper[4790]: I0406 12:11:29.267294 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Apr 06 12:11:29 crc kubenswrapper[4790]: I0406 12:11:29.267361 4790 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" Apr 06 12:11:34 crc kubenswrapper[4790]: I0406 12:11:34.267297 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Apr 06 12:11:34 crc kubenswrapper[4790]: I0406 12:11:34.268852 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" Apr 06 12:11:39 crc kubenswrapper[4790]: I0406 12:11:39.267437 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Apr 06 12:11:39 crc kubenswrapper[4790]: I0406 12:11:39.267922 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" Apr 06 12:11:39 crc kubenswrapper[4790]: I0406 12:11:39.753693 4790 patch_prober.go:28] 
interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:11:39 crc kubenswrapper[4790]: I0406 12:11:39.753769 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:11:39 crc kubenswrapper[4790]: I0406 12:11:39.753848 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 12:11:39 crc kubenswrapper[4790]: I0406 12:11:39.754679 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a600eeb7976d392fa3a056d87383315f86d322ed123278808846f0172ac67622"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 12:11:39 crc kubenswrapper[4790]: I0406 12:11:39.754754 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://a600eeb7976d392fa3a056d87383315f86d322ed123278808846f0172ac67622" gracePeriod=600 Apr 06 12:11:39 crc kubenswrapper[4790]: I0406 12:11:39.976678 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="a600eeb7976d392fa3a056d87383315f86d322ed123278808846f0172ac67622" exitCode=0 Apr 06 12:11:39 crc kubenswrapper[4790]: I0406 
12:11:39.976785 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"a600eeb7976d392fa3a056d87383315f86d322ed123278808846f0172ac67622"} Apr 06 12:11:39 crc kubenswrapper[4790]: I0406 12:11:39.977164 4790 scope.go:117] "RemoveContainer" containerID="4b2a44c53b178b9bcd5b5bef115e30badb21f6cf04ef10c42dd5c6e3679e16df" Apr 06 12:11:40 crc kubenswrapper[4790]: I0406 12:11:40.988720 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"06fc909b8c3bafca8e931d4d1184054009f97417e48b2959e6ed15949f791d43"} Apr 06 12:11:44 crc kubenswrapper[4790]: I0406 12:11:44.268418 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Apr 06 12:11:44 crc kubenswrapper[4790]: I0406 12:11:44.268949 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.268281 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sl7ql container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Apr 06 12:11:49 crc 
kubenswrapper[4790]: I0406 12:11:49.269059 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": dial tcp 10.217.0.7:8443: connect: connection refused" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.826146 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877269 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-6fff9df955-tlfh5"] Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877556 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9985d3-5261-42f1-9db8-30fd5b560d36" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877573 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9985d3-5261-42f1-9db8-30fd5b560d36" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877590 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1fd968f-f454-4241-afa5-e47623a01c84" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877599 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1fd968f-f454-4241-afa5-e47623a01c84" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877614 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c342bf5-bb1e-43e0-a549-e73112fa4e4a" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877624 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c342bf5-bb1e-43e0-a549-e73112fa4e4a" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: 
E0406 12:11:49.877636 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7781c53a-5217-44ce-9a47-671e951b9c7e" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877643 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7781c53a-5217-44ce-9a47-671e951b9c7e" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877652 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877660 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877670 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b74456e-c45b-4efb-9a0f-952b5663e994" containerName="pull" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877678 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b74456e-c45b-4efb-9a0f-952b5663e994" containerName="pull" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877686 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7781c53a-5217-44ce-9a47-671e951b9c7e" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877694 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7781c53a-5217-44ce-9a47-671e951b9c7e" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877702 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9985d3-5261-42f1-9db8-30fd5b560d36" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877710 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9985d3-5261-42f1-9db8-30fd5b560d36" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877723 4790 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aae3cb7-5ff6-4400-a80e-42f87496da8b" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877730 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aae3cb7-5ff6-4400-a80e-42f87496da8b" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877784 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877793 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877807 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver-check-endpoints" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877816 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver-check-endpoints" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877851 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c342bf5-bb1e-43e0-a549-e73112fa4e4a" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877865 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c342bf5-bb1e-43e0-a549-e73112fa4e4a" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877876 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de0f178-000e-4a30-84bc-c1f33e965d0a" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877885 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de0f178-000e-4a30-84bc-c1f33e965d0a" containerName="extract-utilities" Apr 06 12:11:49 crc 
kubenswrapper[4790]: E0406 12:11:49.877897 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58652e08-5d4e-4a76-b00d-79b1934afeb2" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877906 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="58652e08-5d4e-4a76-b00d-79b1934afeb2" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877918 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877926 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877971 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c342bf5-bb1e-43e0-a549-e73112fa4e4a" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877980 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c342bf5-bb1e-43e0-a549-e73112fa4e4a" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.877988 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.877996 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878009 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58652e08-5d4e-4a76-b00d-79b1934afeb2" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878017 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="58652e08-5d4e-4a76-b00d-79b1934afeb2" containerName="extract-utilities" Apr 06 12:11:49 crc 
kubenswrapper[4790]: E0406 12:11:49.878026 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3707401-54b1-47be-9b15-87b46677513b" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878033 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3707401-54b1-47be-9b15-87b46677513b" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878071 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42617262-5255-4f88-8afa-9d93a640090e" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878081 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="42617262-5255-4f88-8afa-9d93a640090e" containerName="extract-utilities" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878091 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42617262-5255-4f88-8afa-9d93a640090e" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878099 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="42617262-5255-4f88-8afa-9d93a640090e" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878110 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42617262-5255-4f88-8afa-9d93a640090e" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878118 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="42617262-5255-4f88-8afa-9d93a640090e" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878129 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9985d3-5261-42f1-9db8-30fd5b560d36" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878136 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9985d3-5261-42f1-9db8-30fd5b560d36" containerName="extract-content" Apr 06 12:11:49 crc 
kubenswrapper[4790]: E0406 12:11:49.878144 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58652e08-5d4e-4a76-b00d-79b1934afeb2" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878152 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="58652e08-5d4e-4a76-b00d-79b1934afeb2" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878165 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3707401-54b1-47be-9b15-87b46677513b" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878173 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3707401-54b1-47be-9b15-87b46677513b" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878185 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3707401-54b1-47be-9b15-87b46677513b" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878193 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3707401-54b1-47be-9b15-87b46677513b" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878203 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de0f178-000e-4a30-84bc-c1f33e965d0a" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878211 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de0f178-000e-4a30-84bc-c1f33e965d0a" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878222 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b74456e-c45b-4efb-9a0f-952b5663e994" containerName="extract" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878230 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b74456e-c45b-4efb-9a0f-952b5663e994" containerName="extract" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 
12:11:49.878239 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1fd968f-f454-4241-afa5-e47623a01c84" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878248 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1fd968f-f454-4241-afa5-e47623a01c84" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878258 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aae3cb7-5ff6-4400-a80e-42f87496da8b" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878267 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aae3cb7-5ff6-4400-a80e-42f87496da8b" containerName="extract-content" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878279 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aae3cb7-5ff6-4400-a80e-42f87496da8b" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878287 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aae3cb7-5ff6-4400-a80e-42f87496da8b" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878298 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="fix-audit-permissions" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878305 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="fix-audit-permissions" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878315 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de0f178-000e-4a30-84bc-c1f33e965d0a" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878323 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de0f178-000e-4a30-84bc-c1f33e965d0a" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 
12:11:49.878330 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b74456e-c45b-4efb-9a0f-952b5663e994" containerName="util" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878338 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b74456e-c45b-4efb-9a0f-952b5663e994" containerName="util" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878352 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1fd968f-f454-4241-afa5-e47623a01c84" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878361 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1fd968f-f454-4241-afa5-e47623a01c84" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: E0406 12:11:49.878375 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7781c53a-5217-44ce-9a47-671e951b9c7e" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878384 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7781c53a-5217-44ce-9a47-671e951b9c7e" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878506 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1fd968f-f454-4241-afa5-e47623a01c84" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878519 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7781c53a-5217-44ce-9a47-671e951b9c7e" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878530 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de0f178-000e-4a30-84bc-c1f33e965d0a" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878539 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ff0634b-ac6b-4c16-a72f-9bf81b54b1af" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878550 
4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver-check-endpoints" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878563 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerName="openshift-apiserver" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878577 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="42617262-5255-4f88-8afa-9d93a640090e" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878586 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="58652e08-5d4e-4a76-b00d-79b1934afeb2" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878596 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c342bf5-bb1e-43e0-a549-e73112fa4e4a" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878604 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aae3cb7-5ff6-4400-a80e-42f87496da8b" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878614 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b74456e-c45b-4efb-9a0f-952b5663e994" containerName="extract" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878625 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9985d3-5261-42f1-9db8-30fd5b560d36" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.878637 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3707401-54b1-47be-9b15-87b46677513b" containerName="registry-server" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.879990 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.883393 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-6fff9df955-tlfh5"] Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.930065 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r477f\" (UniqueName: \"kubernetes.io/projected/426ca9b3-5c58-4b68-a29e-207207bf6897-kube-api-access-r477f\") pod \"426ca9b3-5c58-4b68-a29e-207207bf6897\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.930110 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/426ca9b3-5c58-4b68-a29e-207207bf6897-audit-dir\") pod \"426ca9b3-5c58-4b68-a29e-207207bf6897\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.930142 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-etcd-serving-ca\") pod \"426ca9b3-5c58-4b68-a29e-207207bf6897\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.930204 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-encryption-config\") pod \"426ca9b3-5c58-4b68-a29e-207207bf6897\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.930222 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-audit\") pod \"426ca9b3-5c58-4b68-a29e-207207bf6897\" (UID: 
\"426ca9b3-5c58-4b68-a29e-207207bf6897\") " Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.930242 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/426ca9b3-5c58-4b68-a29e-207207bf6897-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "426ca9b3-5c58-4b68-a29e-207207bf6897" (UID: "426ca9b3-5c58-4b68-a29e-207207bf6897"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931158 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-audit" (OuterVolumeSpecName: "audit") pod "426ca9b3-5c58-4b68-a29e-207207bf6897" (UID: "426ca9b3-5c58-4b68-a29e-207207bf6897"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931152 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "426ca9b3-5c58-4b68-a29e-207207bf6897" (UID: "426ca9b3-5c58-4b68-a29e-207207bf6897"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931195 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-serving-cert\") pod \"426ca9b3-5c58-4b68-a29e-207207bf6897\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931226 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/426ca9b3-5c58-4b68-a29e-207207bf6897-node-pullsecrets\") pod \"426ca9b3-5c58-4b68-a29e-207207bf6897\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931245 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-trusted-ca-bundle\") pod \"426ca9b3-5c58-4b68-a29e-207207bf6897\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931271 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-config\") pod \"426ca9b3-5c58-4b68-a29e-207207bf6897\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931301 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/426ca9b3-5c58-4b68-a29e-207207bf6897-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "426ca9b3-5c58-4b68-a29e-207207bf6897" (UID: "426ca9b3-5c58-4b68-a29e-207207bf6897"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931316 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-etcd-client\") pod \"426ca9b3-5c58-4b68-a29e-207207bf6897\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931337 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-image-import-ca\") pod \"426ca9b3-5c58-4b68-a29e-207207bf6897\" (UID: \"426ca9b3-5c58-4b68-a29e-207207bf6897\") " Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931466 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-etcd-serving-ca\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931487 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-audit\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931509 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5322c6c4-c175-4ecf-a6bb-078b514585d2-audit-dir\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:49 crc 
kubenswrapper[4790]: I0406 12:11:49.931523 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l925v\" (UniqueName: \"kubernetes.io/projected/5322c6c4-c175-4ecf-a6bb-078b514585d2-kube-api-access-l925v\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931555 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-trusted-ca-bundle\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931576 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5322c6c4-c175-4ecf-a6bb-078b514585d2-serving-cert\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931592 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5322c6c4-c175-4ecf-a6bb-078b514585d2-node-pullsecrets\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931609 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5322c6c4-c175-4ecf-a6bb-078b514585d2-encryption-config\") pod \"apiserver-6fff9df955-tlfh5\" (UID: 
\"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931631 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5322c6c4-c175-4ecf-a6bb-078b514585d2-etcd-client\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931657 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-config\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931819 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-image-import-ca\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931878 4790 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-audit\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931890 4790 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/426ca9b3-5c58-4b68-a29e-207207bf6897-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931901 4790 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/426ca9b3-5c58-4b68-a29e-207207bf6897-audit-dir\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.931910 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.932252 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "426ca9b3-5c58-4b68-a29e-207207bf6897" (UID: "426ca9b3-5c58-4b68-a29e-207207bf6897"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.932265 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "426ca9b3-5c58-4b68-a29e-207207bf6897" (UID: "426ca9b3-5c58-4b68-a29e-207207bf6897"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.932308 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-config" (OuterVolumeSpecName: "config") pod "426ca9b3-5c58-4b68-a29e-207207bf6897" (UID: "426ca9b3-5c58-4b68-a29e-207207bf6897"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.936474 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "426ca9b3-5c58-4b68-a29e-207207bf6897" (UID: "426ca9b3-5c58-4b68-a29e-207207bf6897"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.936629 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "426ca9b3-5c58-4b68-a29e-207207bf6897" (UID: "426ca9b3-5c58-4b68-a29e-207207bf6897"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.942498 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "426ca9b3-5c58-4b68-a29e-207207bf6897" (UID: "426ca9b3-5c58-4b68-a29e-207207bf6897"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:11:49 crc kubenswrapper[4790]: I0406 12:11:49.942508 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/426ca9b3-5c58-4b68-a29e-207207bf6897-kube-api-access-r477f" (OuterVolumeSpecName: "kube-api-access-r477f") pod "426ca9b3-5c58-4b68-a29e-207207bf6897" (UID: "426ca9b3-5c58-4b68-a29e-207207bf6897"). InnerVolumeSpecName "kube-api-access-r477f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-image-import-ca\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032589 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-etcd-serving-ca\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032606 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-audit\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032626 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5322c6c4-c175-4ecf-a6bb-078b514585d2-audit-dir\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032643 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l925v\" (UniqueName: \"kubernetes.io/projected/5322c6c4-c175-4ecf-a6bb-078b514585d2-kube-api-access-l925v\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " 
pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032672 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-trusted-ca-bundle\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032692 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5322c6c4-c175-4ecf-a6bb-078b514585d2-serving-cert\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032709 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5322c6c4-c175-4ecf-a6bb-078b514585d2-node-pullsecrets\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032723 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5322c6c4-c175-4ecf-a6bb-078b514585d2-encryption-config\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032743 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5322c6c4-c175-4ecf-a6bb-078b514585d2-etcd-client\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " 
pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032767 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-config\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032808 4790 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-encryption-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032819 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-serving-cert\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032847 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032859 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032869 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/426ca9b3-5c58-4b68-a29e-207207bf6897-etcd-client\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032879 4790 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/426ca9b3-5c58-4b68-a29e-207207bf6897-image-import-ca\") on node 
\"crc\" DevicePath \"\"" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032890 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r477f\" (UniqueName: \"kubernetes.io/projected/426ca9b3-5c58-4b68-a29e-207207bf6897-kube-api-access-r477f\") on node \"crc\" DevicePath \"\"" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.032949 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5322c6c4-c175-4ecf-a6bb-078b514585d2-node-pullsecrets\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.033133 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5322c6c4-c175-4ecf-a6bb-078b514585d2-audit-dir\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.033659 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-config\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.033677 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-etcd-serving-ca\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.034147 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-audit\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.035058 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-image-import-ca\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.036149 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5322c6c4-c175-4ecf-a6bb-078b514585d2-trusted-ca-bundle\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.040434 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5322c6c4-c175-4ecf-a6bb-078b514585d2-encryption-config\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.040460 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5322c6c4-c175-4ecf-a6bb-078b514585d2-etcd-client\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.040610 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5322c6c4-c175-4ecf-a6bb-078b514585d2-serving-cert\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.052516 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l925v\" (UniqueName: \"kubernetes.io/projected/5322c6c4-c175-4ecf-a6bb-078b514585d2-kube-api-access-l925v\") pod \"apiserver-6fff9df955-tlfh5\" (UID: \"5322c6c4-c175-4ecf-a6bb-078b514585d2\") " pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.082084 4790 generic.go:334] "Generic (PLEG): container finished" podID="426ca9b3-5c58-4b68-a29e-207207bf6897" containerID="4409923626e612ffc7b380fc6de4bd255a919209617bed342959488e801a3b61" exitCode=0 Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.082123 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" event={"ID":"426ca9b3-5c58-4b68-a29e-207207bf6897","Type":"ContainerDied","Data":"4409923626e612ffc7b380fc6de4bd255a919209617bed342959488e801a3b61"} Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.082168 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.082190 4790 scope.go:117] "RemoveContainer" containerID="13ef529b123263722b72b6c938a9f72c16705c3288ab8cbafa95e6fac2a31a24" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.082176 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sl7ql" event={"ID":"426ca9b3-5c58-4b68-a29e-207207bf6897","Type":"ContainerDied","Data":"a75482bd7d041ff41ef3239e83873f0fa38bb9693146d23461f1092c8b29a530"} Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.100721 4790 scope.go:117] "RemoveContainer" containerID="4409923626e612ffc7b380fc6de4bd255a919209617bed342959488e801a3b61" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.120486 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sl7ql"] Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.127083 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sl7ql"] Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.133717 4790 scope.go:117] "RemoveContainer" containerID="8ea86e1d02f1d364821661babc9bb578e429bc2a7872eed0551e5e48ee01c6fa" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.153127 4790 scope.go:117] "RemoveContainer" containerID="13ef529b123263722b72b6c938a9f72c16705c3288ab8cbafa95e6fac2a31a24" Apr 06 12:11:50 crc kubenswrapper[4790]: E0406 12:11:50.153638 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13ef529b123263722b72b6c938a9f72c16705c3288ab8cbafa95e6fac2a31a24\": container with ID starting with 13ef529b123263722b72b6c938a9f72c16705c3288ab8cbafa95e6fac2a31a24 not found: ID does not exist" containerID="13ef529b123263722b72b6c938a9f72c16705c3288ab8cbafa95e6fac2a31a24" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.153694 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ef529b123263722b72b6c938a9f72c16705c3288ab8cbafa95e6fac2a31a24"} err="failed to get container status \"13ef529b123263722b72b6c938a9f72c16705c3288ab8cbafa95e6fac2a31a24\": rpc error: code = NotFound desc = could not find container \"13ef529b123263722b72b6c938a9f72c16705c3288ab8cbafa95e6fac2a31a24\": container with ID starting with 13ef529b123263722b72b6c938a9f72c16705c3288ab8cbafa95e6fac2a31a24 not found: ID does not exist" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.153729 4790 scope.go:117] "RemoveContainer" containerID="4409923626e612ffc7b380fc6de4bd255a919209617bed342959488e801a3b61" Apr 06 12:11:50 crc kubenswrapper[4790]: E0406 12:11:50.154161 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4409923626e612ffc7b380fc6de4bd255a919209617bed342959488e801a3b61\": container with ID starting with 4409923626e612ffc7b380fc6de4bd255a919209617bed342959488e801a3b61 not found: ID does not exist" containerID="4409923626e612ffc7b380fc6de4bd255a919209617bed342959488e801a3b61" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.154197 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4409923626e612ffc7b380fc6de4bd255a919209617bed342959488e801a3b61"} err="failed to get container status \"4409923626e612ffc7b380fc6de4bd255a919209617bed342959488e801a3b61\": rpc error: code = NotFound desc = could not find container \"4409923626e612ffc7b380fc6de4bd255a919209617bed342959488e801a3b61\": container with ID starting with 4409923626e612ffc7b380fc6de4bd255a919209617bed342959488e801a3b61 not found: ID does not exist" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.154223 4790 scope.go:117] "RemoveContainer" containerID="8ea86e1d02f1d364821661babc9bb578e429bc2a7872eed0551e5e48ee01c6fa" Apr 06 12:11:50 crc kubenswrapper[4790]: E0406 
12:11:50.154637 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ea86e1d02f1d364821661babc9bb578e429bc2a7872eed0551e5e48ee01c6fa\": container with ID starting with 8ea86e1d02f1d364821661babc9bb578e429bc2a7872eed0551e5e48ee01c6fa not found: ID does not exist" containerID="8ea86e1d02f1d364821661babc9bb578e429bc2a7872eed0551e5e48ee01c6fa" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.154680 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea86e1d02f1d364821661babc9bb578e429bc2a7872eed0551e5e48ee01c6fa"} err="failed to get container status \"8ea86e1d02f1d364821661babc9bb578e429bc2a7872eed0551e5e48ee01c6fa\": rpc error: code = NotFound desc = could not find container \"8ea86e1d02f1d364821661babc9bb578e429bc2a7872eed0551e5e48ee01c6fa\": container with ID starting with 8ea86e1d02f1d364821661babc9bb578e429bc2a7872eed0551e5e48ee01c6fa not found: ID does not exist" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.194437 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:50 crc kubenswrapper[4790]: I0406 12:11:50.414432 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-6fff9df955-tlfh5"] Apr 06 12:11:51 crc kubenswrapper[4790]: I0406 12:11:51.089954 4790 generic.go:334] "Generic (PLEG): container finished" podID="5322c6c4-c175-4ecf-a6bb-078b514585d2" containerID="19e15e10e05711a93faef0a320d2b2f95e0e96c55d94f2397d3409879e81b56e" exitCode=0 Apr 06 12:11:51 crc kubenswrapper[4790]: I0406 12:11:51.090042 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" event={"ID":"5322c6c4-c175-4ecf-a6bb-078b514585d2","Type":"ContainerDied","Data":"19e15e10e05711a93faef0a320d2b2f95e0e96c55d94f2397d3409879e81b56e"} Apr 06 12:11:51 crc kubenswrapper[4790]: I0406 12:11:51.090257 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" event={"ID":"5322c6c4-c175-4ecf-a6bb-078b514585d2","Type":"ContainerStarted","Data":"77515bcf060396b9e25c7bbf2680111451b8b33efc38a9779efdaa3fe8906b8f"} Apr 06 12:11:51 crc kubenswrapper[4790]: I0406 12:11:51.684091 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="426ca9b3-5c58-4b68-a29e-207207bf6897" path="/var/lib/kubelet/pods/426ca9b3-5c58-4b68-a29e-207207bf6897/volumes" Apr 06 12:11:52 crc kubenswrapper[4790]: I0406 12:11:52.105366 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" event={"ID":"5322c6c4-c175-4ecf-a6bb-078b514585d2","Type":"ContainerStarted","Data":"f6db90c252d57fd61c6d6fb67715a3923f3aa8c580bd4b417672a98b406ba6b4"} Apr 06 12:11:52 crc kubenswrapper[4790]: I0406 12:11:52.105415 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" 
event={"ID":"5322c6c4-c175-4ecf-a6bb-078b514585d2","Type":"ContainerStarted","Data":"f4224c5001eda39d9f300be849f89c192612784d8524a51c2aca4c5808696535"} Apr 06 12:11:52 crc kubenswrapper[4790]: I0406 12:11:52.139197 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" podStartSLOduration=113.139181587 podStartE2EDuration="1m53.139181587s" podCreationTimestamp="2026-04-06 12:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:11:52.132371571 +0000 UTC m=+891.120114437" watchObservedRunningTime="2026-04-06 12:11:52.139181587 +0000 UTC m=+891.126924453" Apr 06 12:11:55 crc kubenswrapper[4790]: I0406 12:11:55.195122 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:55 crc kubenswrapper[4790]: I0406 12:11:55.195643 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:55 crc kubenswrapper[4790]: I0406 12:11:55.205244 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:56 crc kubenswrapper[4790]: I0406 12:11:56.149771 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-6fff9df955-tlfh5" Apr 06 12:11:59 crc kubenswrapper[4790]: I0406 12:11:59.308594 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6"] Apr 06 12:11:59 crc kubenswrapper[4790]: I0406 12:11:59.310291 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" Apr 06 12:11:59 crc kubenswrapper[4790]: I0406 12:11:59.312379 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Apr 06 12:11:59 crc kubenswrapper[4790]: I0406 12:11:59.325134 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6"] Apr 06 12:11:59 crc kubenswrapper[4790]: I0406 12:11:59.393702 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-bundle\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6\" (UID: \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" Apr 06 12:11:59 crc kubenswrapper[4790]: I0406 12:11:59.394069 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-util\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6\" (UID: \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" Apr 06 12:11:59 crc kubenswrapper[4790]: I0406 12:11:59.394230 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svxjp\" (UniqueName: \"kubernetes.io/projected/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-kube-api-access-svxjp\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6\" (UID: \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" Apr 06 12:11:59 crc kubenswrapper[4790]: 
I0406 12:11:59.495353 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-bundle\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6\" (UID: \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" Apr 06 12:11:59 crc kubenswrapper[4790]: I0406 12:11:59.495744 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-util\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6\" (UID: \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" Apr 06 12:11:59 crc kubenswrapper[4790]: I0406 12:11:59.495901 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svxjp\" (UniqueName: \"kubernetes.io/projected/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-kube-api-access-svxjp\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6\" (UID: \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" Apr 06 12:11:59 crc kubenswrapper[4790]: I0406 12:11:59.496226 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-util\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6\" (UID: \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" Apr 06 12:11:59 crc kubenswrapper[4790]: I0406 12:11:59.496301 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-bundle\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6\" (UID: \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" Apr 06 12:11:59 crc kubenswrapper[4790]: I0406 12:11:59.520153 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svxjp\" (UniqueName: \"kubernetes.io/projected/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-kube-api-access-svxjp\") pod \"5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6\" (UID: \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\") " pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" Apr 06 12:11:59 crc kubenswrapper[4790]: I0406 12:11:59.625151 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" Apr 06 12:11:59 crc kubenswrapper[4790]: I0406 12:11:59.839319 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6"] Apr 06 12:12:00 crc kubenswrapper[4790]: I0406 12:12:00.127721 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591292-h54f6"] Apr 06 12:12:00 crc kubenswrapper[4790]: I0406 12:12:00.132026 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591292-h54f6" Apr 06 12:12:00 crc kubenswrapper[4790]: I0406 12:12:00.132327 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591292-h54f6"] Apr 06 12:12:00 crc kubenswrapper[4790]: I0406 12:12:00.136542 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:12:00 crc kubenswrapper[4790]: I0406 12:12:00.136666 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:12:00 crc kubenswrapper[4790]: I0406 12:12:00.136744 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:12:00 crc kubenswrapper[4790]: I0406 12:12:00.183235 4790 generic.go:334] "Generic (PLEG): container finished" podID="8a7a0dc8-d9f5-4844-bd9a-5377990f86c4" containerID="1de3d0ec126e7c5c5b9819868b91d5999e7628c0dd079cf2f9a0007e7c809a79" exitCode=0 Apr 06 12:12:00 crc kubenswrapper[4790]: I0406 12:12:00.183299 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" event={"ID":"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4","Type":"ContainerDied","Data":"1de3d0ec126e7c5c5b9819868b91d5999e7628c0dd079cf2f9a0007e7c809a79"} Apr 06 12:12:00 crc kubenswrapper[4790]: I0406 12:12:00.183372 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" event={"ID":"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4","Type":"ContainerStarted","Data":"10377375615e99dea1b92762764b6db29bf9255d2b3ea30c3a377dc6552e303b"} Apr 06 12:12:00 crc kubenswrapper[4790]: I0406 12:12:00.203700 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pqxv\" (UniqueName: 
\"kubernetes.io/projected/a167fd67-1708-4c3b-a2f2-429b58ce961a-kube-api-access-6pqxv\") pod \"auto-csr-approver-29591292-h54f6\" (UID: \"a167fd67-1708-4c3b-a2f2-429b58ce961a\") " pod="openshift-infra/auto-csr-approver-29591292-h54f6" Apr 06 12:12:00 crc kubenswrapper[4790]: I0406 12:12:00.305283 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pqxv\" (UniqueName: \"kubernetes.io/projected/a167fd67-1708-4c3b-a2f2-429b58ce961a-kube-api-access-6pqxv\") pod \"auto-csr-approver-29591292-h54f6\" (UID: \"a167fd67-1708-4c3b-a2f2-429b58ce961a\") " pod="openshift-infra/auto-csr-approver-29591292-h54f6" Apr 06 12:12:00 crc kubenswrapper[4790]: I0406 12:12:00.324561 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pqxv\" (UniqueName: \"kubernetes.io/projected/a167fd67-1708-4c3b-a2f2-429b58ce961a-kube-api-access-6pqxv\") pod \"auto-csr-approver-29591292-h54f6\" (UID: \"a167fd67-1708-4c3b-a2f2-429b58ce961a\") " pod="openshift-infra/auto-csr-approver-29591292-h54f6" Apr 06 12:12:00 crc kubenswrapper[4790]: I0406 12:12:00.494303 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591292-h54f6" Apr 06 12:12:00 crc kubenswrapper[4790]: I0406 12:12:00.672050 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591292-h54f6"] Apr 06 12:12:00 crc kubenswrapper[4790]: W0406 12:12:00.686677 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda167fd67_1708_4c3b_a2f2_429b58ce961a.slice/crio-57f2befadf0e91f823ef748c7f5658e6990b20a74aa0a1ee1352e9f700ff46c0 WatchSource:0}: Error finding container 57f2befadf0e91f823ef748c7f5658e6990b20a74aa0a1ee1352e9f700ff46c0: Status 404 returned error can't find the container with id 57f2befadf0e91f823ef748c7f5658e6990b20a74aa0a1ee1352e9f700ff46c0 Apr 06 12:12:01 crc kubenswrapper[4790]: I0406 12:12:01.193205 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591292-h54f6" event={"ID":"a167fd67-1708-4c3b-a2f2-429b58ce961a","Type":"ContainerStarted","Data":"57f2befadf0e91f823ef748c7f5658e6990b20a74aa0a1ee1352e9f700ff46c0"} Apr 06 12:12:02 crc kubenswrapper[4790]: I0406 12:12:02.201663 4790 generic.go:334] "Generic (PLEG): container finished" podID="a167fd67-1708-4c3b-a2f2-429b58ce961a" containerID="f4856bc9c4f4020355729a7616098568851b3bec940fb783dbbbd66baf4c462b" exitCode=0 Apr 06 12:12:02 crc kubenswrapper[4790]: I0406 12:12:02.201713 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591292-h54f6" event={"ID":"a167fd67-1708-4c3b-a2f2-429b58ce961a","Type":"ContainerDied","Data":"f4856bc9c4f4020355729a7616098568851b3bec940fb783dbbbd66baf4c462b"} Apr 06 12:12:02 crc kubenswrapper[4790]: I0406 12:12:02.204432 4790 generic.go:334] "Generic (PLEG): container finished" podID="8a7a0dc8-d9f5-4844-bd9a-5377990f86c4" containerID="20c1776ce15a5b00a3404843a640bfd33b46ec81e107babde2c6826a40036f58" exitCode=0 Apr 06 12:12:02 crc kubenswrapper[4790]: 
I0406 12:12:02.204464 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" event={"ID":"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4","Type":"ContainerDied","Data":"20c1776ce15a5b00a3404843a640bfd33b46ec81e107babde2c6826a40036f58"} Apr 06 12:12:03 crc kubenswrapper[4790]: I0406 12:12:03.214048 4790 generic.go:334] "Generic (PLEG): container finished" podID="8a7a0dc8-d9f5-4844-bd9a-5377990f86c4" containerID="62d99866abf98281a5166a693fcf46b9672f438c280394e7d0802c96539d260f" exitCode=0 Apr 06 12:12:03 crc kubenswrapper[4790]: I0406 12:12:03.214343 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" event={"ID":"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4","Type":"ContainerDied","Data":"62d99866abf98281a5166a693fcf46b9672f438c280394e7d0802c96539d260f"} Apr 06 12:12:03 crc kubenswrapper[4790]: I0406 12:12:03.436646 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591292-h54f6" Apr 06 12:12:03 crc kubenswrapper[4790]: I0406 12:12:03.551744 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pqxv\" (UniqueName: \"kubernetes.io/projected/a167fd67-1708-4c3b-a2f2-429b58ce961a-kube-api-access-6pqxv\") pod \"a167fd67-1708-4c3b-a2f2-429b58ce961a\" (UID: \"a167fd67-1708-4c3b-a2f2-429b58ce961a\") " Apr 06 12:12:03 crc kubenswrapper[4790]: I0406 12:12:03.557491 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a167fd67-1708-4c3b-a2f2-429b58ce961a-kube-api-access-6pqxv" (OuterVolumeSpecName: "kube-api-access-6pqxv") pod "a167fd67-1708-4c3b-a2f2-429b58ce961a" (UID: "a167fd67-1708-4c3b-a2f2-429b58ce961a"). InnerVolumeSpecName "kube-api-access-6pqxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:12:03 crc kubenswrapper[4790]: I0406 12:12:03.653359 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pqxv\" (UniqueName: \"kubernetes.io/projected/a167fd67-1708-4c3b-a2f2-429b58ce961a-kube-api-access-6pqxv\") on node \"crc\" DevicePath \"\"" Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.227580 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591292-h54f6" event={"ID":"a167fd67-1708-4c3b-a2f2-429b58ce961a","Type":"ContainerDied","Data":"57f2befadf0e91f823ef748c7f5658e6990b20a74aa0a1ee1352e9f700ff46c0"} Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.227823 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57f2befadf0e91f823ef748c7f5658e6990b20a74aa0a1ee1352e9f700ff46c0" Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.227637 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591292-h54f6" Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.497975 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591286-nlv9t"] Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.504805 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591286-nlv9t"] Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.549409 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.666678 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svxjp\" (UniqueName: \"kubernetes.io/projected/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-kube-api-access-svxjp\") pod \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\" (UID: \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\") " Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.666812 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-bundle\") pod \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\" (UID: \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\") " Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.666914 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-util\") pod \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\" (UID: \"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4\") " Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.667816 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-bundle" (OuterVolumeSpecName: "bundle") pod "8a7a0dc8-d9f5-4844-bd9a-5377990f86c4" (UID: "8a7a0dc8-d9f5-4844-bd9a-5377990f86c4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.670560 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-kube-api-access-svxjp" (OuterVolumeSpecName: "kube-api-access-svxjp") pod "8a7a0dc8-d9f5-4844-bd9a-5377990f86c4" (UID: "8a7a0dc8-d9f5-4844-bd9a-5377990f86c4"). InnerVolumeSpecName "kube-api-access-svxjp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.682814 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-util" (OuterVolumeSpecName: "util") pod "8a7a0dc8-d9f5-4844-bd9a-5377990f86c4" (UID: "8a7a0dc8-d9f5-4844-bd9a-5377990f86c4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.768022 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-util\") on node \"crc\" DevicePath \"\"" Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.768054 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svxjp\" (UniqueName: \"kubernetes.io/projected/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-kube-api-access-svxjp\") on node \"crc\" DevicePath \"\"" Apr 06 12:12:04 crc kubenswrapper[4790]: I0406 12:12:04.768064 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a7a0dc8-d9f5-4844-bd9a-5377990f86c4-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:12:05 crc kubenswrapper[4790]: I0406 12:12:05.239422 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" event={"ID":"8a7a0dc8-d9f5-4844-bd9a-5377990f86c4","Type":"ContainerDied","Data":"10377375615e99dea1b92762764b6db29bf9255d2b3ea30c3a377dc6552e303b"} Apr 06 12:12:05 crc kubenswrapper[4790]: I0406 12:12:05.239506 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10377375615e99dea1b92762764b6db29bf9255d2b3ea30c3a377dc6552e303b" Apr 06 12:12:05 crc kubenswrapper[4790]: I0406 12:12:05.239543 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6" Apr 06 12:12:05 crc kubenswrapper[4790]: I0406 12:12:05.684336 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="124e10ce-efae-4551-adc3-aadefc84b7a1" path="/var/lib/kubelet/pods/124e10ce-efae-4551-adc3-aadefc84b7a1/volumes" Apr 06 12:12:07 crc kubenswrapper[4790]: I0406 12:12:07.969845 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6b8c6447b-srf9k"] Apr 06 12:12:07 crc kubenswrapper[4790]: E0406 12:12:07.970071 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a7a0dc8-d9f5-4844-bd9a-5377990f86c4" containerName="pull" Apr 06 12:12:07 crc kubenswrapper[4790]: I0406 12:12:07.970084 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a7a0dc8-d9f5-4844-bd9a-5377990f86c4" containerName="pull" Apr 06 12:12:07 crc kubenswrapper[4790]: E0406 12:12:07.970093 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a7a0dc8-d9f5-4844-bd9a-5377990f86c4" containerName="extract" Apr 06 12:12:07 crc kubenswrapper[4790]: I0406 12:12:07.970099 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a7a0dc8-d9f5-4844-bd9a-5377990f86c4" containerName="extract" Apr 06 12:12:07 crc kubenswrapper[4790]: E0406 12:12:07.970117 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a167fd67-1708-4c3b-a2f2-429b58ce961a" containerName="oc" Apr 06 12:12:07 crc kubenswrapper[4790]: I0406 12:12:07.970124 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a167fd67-1708-4c3b-a2f2-429b58ce961a" containerName="oc" Apr 06 12:12:07 crc kubenswrapper[4790]: E0406 12:12:07.970135 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a7a0dc8-d9f5-4844-bd9a-5377990f86c4" containerName="util" Apr 06 12:12:07 crc kubenswrapper[4790]: I0406 12:12:07.970142 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8a7a0dc8-d9f5-4844-bd9a-5377990f86c4" containerName="util" Apr 06 12:12:07 crc kubenswrapper[4790]: I0406 12:12:07.970237 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a167fd67-1708-4c3b-a2f2-429b58ce961a" containerName="oc" Apr 06 12:12:07 crc kubenswrapper[4790]: I0406 12:12:07.970251 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a7a0dc8-d9f5-4844-bd9a-5377990f86c4" containerName="extract" Apr 06 12:12:07 crc kubenswrapper[4790]: I0406 12:12:07.970724 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6b8c6447b-srf9k" Apr 06 12:12:07 crc kubenswrapper[4790]: I0406 12:12:07.973373 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-zzf4r" Apr 06 12:12:07 crc kubenswrapper[4790]: I0406 12:12:07.973562 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Apr 06 12:12:07 crc kubenswrapper[4790]: I0406 12:12:07.976039 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Apr 06 12:12:07 crc kubenswrapper[4790]: I0406 12:12:07.981525 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6b8c6447b-srf9k"] Apr 06 12:12:07 crc kubenswrapper[4790]: I0406 12:12:07.992619 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9g4r\" (UniqueName: \"kubernetes.io/projected/8852468a-2987-493e-bdd3-1a0e0b3b0721-kube-api-access-q9g4r\") pod \"nmstate-operator-6b8c6447b-srf9k\" (UID: \"8852468a-2987-493e-bdd3-1a0e0b3b0721\") " pod="openshift-nmstate/nmstate-operator-6b8c6447b-srf9k" Apr 06 12:12:08 crc kubenswrapper[4790]: I0406 12:12:08.094149 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9g4r\" (UniqueName: 
\"kubernetes.io/projected/8852468a-2987-493e-bdd3-1a0e0b3b0721-kube-api-access-q9g4r\") pod \"nmstate-operator-6b8c6447b-srf9k\" (UID: \"8852468a-2987-493e-bdd3-1a0e0b3b0721\") " pod="openshift-nmstate/nmstate-operator-6b8c6447b-srf9k" Apr 06 12:12:08 crc kubenswrapper[4790]: I0406 12:12:08.116326 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9g4r\" (UniqueName: \"kubernetes.io/projected/8852468a-2987-493e-bdd3-1a0e0b3b0721-kube-api-access-q9g4r\") pod \"nmstate-operator-6b8c6447b-srf9k\" (UID: \"8852468a-2987-493e-bdd3-1a0e0b3b0721\") " pod="openshift-nmstate/nmstate-operator-6b8c6447b-srf9k" Apr 06 12:12:08 crc kubenswrapper[4790]: I0406 12:12:08.286437 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6b8c6447b-srf9k" Apr 06 12:12:08 crc kubenswrapper[4790]: I0406 12:12:08.488224 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6b8c6447b-srf9k"] Apr 06 12:12:09 crc kubenswrapper[4790]: I0406 12:12:09.287073 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6b8c6447b-srf9k" event={"ID":"8852468a-2987-493e-bdd3-1a0e0b3b0721","Type":"ContainerStarted","Data":"02487be5616e9f042923964c7e390fc17cd1736f591cc9a6c63c05764d45f680"} Apr 06 12:12:11 crc kubenswrapper[4790]: I0406 12:12:11.318502 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6b8c6447b-srf9k" event={"ID":"8852468a-2987-493e-bdd3-1a0e0b3b0721","Type":"ContainerStarted","Data":"a42f6b2daeec45a51d5f3e771c5aa50d5533608680d3f16bf84db8cd4d842b63"} Apr 06 12:12:11 crc kubenswrapper[4790]: I0406 12:12:11.343212 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-6b8c6447b-srf9k" podStartSLOduration=2.2428869479999998 podStartE2EDuration="4.343187729s" podCreationTimestamp="2026-04-06 12:12:07 +0000 UTC" 
firstStartedPulling="2026-04-06 12:12:08.498672906 +0000 UTC m=+907.486415772" lastFinishedPulling="2026-04-06 12:12:10.598973687 +0000 UTC m=+909.586716553" observedRunningTime="2026-04-06 12:12:11.335631582 +0000 UTC m=+910.323374498" watchObservedRunningTime="2026-04-06 12:12:11.343187729 +0000 UTC m=+910.330930635" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.786128 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-khm89"] Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.788054 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-khm89" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.790006 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-tldvz" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.796314 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v"] Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.797680 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.800541 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.809731 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v"] Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.815155 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-khm89"] Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.846011 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsm6k\" (UniqueName: \"kubernetes.io/projected/8691bf62-154a-4c5b-8d00-066f07c030fa-kube-api-access-jsm6k\") pod \"nmstate-webhook-5f558f5558-5vn4v\" (UID: \"8691bf62-154a-4c5b-8d00-066f07c030fa\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.846064 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8691bf62-154a-4c5b-8d00-066f07c030fa-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-5vn4v\" (UID: \"8691bf62-154a-4c5b-8d00-066f07c030fa\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.846157 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c4nn\" (UniqueName: \"kubernetes.io/projected/baf52e27-af4b-4863-9e09-a3f11f497db9-kube-api-access-9c4nn\") pod \"nmstate-metrics-9b8c8685d-khm89\" (UID: \"baf52e27-af4b-4863-9e09-a3f11f497db9\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-khm89" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.859811 4790 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-nmstate/nmstate-handler-rdg86"] Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.860543 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.914200 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5"] Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.915219 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.917426 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rc272" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.917650 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.917902 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.934125 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5"] Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.946970 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/34686f03-565c-4d7b-a0d5-f3b2d93e77dd-nginx-conf\") pod \"nmstate-console-plugin-7b5ddc4dc7-hx2t5\" (UID: \"34686f03-565c-4d7b-a0d5-f3b2d93e77dd\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.947015 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c4nn\" (UniqueName: 
\"kubernetes.io/projected/baf52e27-af4b-4863-9e09-a3f11f497db9-kube-api-access-9c4nn\") pod \"nmstate-metrics-9b8c8685d-khm89\" (UID: \"baf52e27-af4b-4863-9e09-a3f11f497db9\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-khm89" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.947034 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2l6v\" (UniqueName: \"kubernetes.io/projected/34686f03-565c-4d7b-a0d5-f3b2d93e77dd-kube-api-access-n2l6v\") pod \"nmstate-console-plugin-7b5ddc4dc7-hx2t5\" (UID: \"34686f03-565c-4d7b-a0d5-f3b2d93e77dd\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.947067 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsm6k\" (UniqueName: \"kubernetes.io/projected/8691bf62-154a-4c5b-8d00-066f07c030fa-kube-api-access-jsm6k\") pod \"nmstate-webhook-5f558f5558-5vn4v\" (UID: \"8691bf62-154a-4c5b-8d00-066f07c030fa\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.947092 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fb4a8135-7355-406b-a851-dce0109face5-dbus-socket\") pod \"nmstate-handler-rdg86\" (UID: \"fb4a8135-7355-406b-a851-dce0109face5\") " pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.947110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8691bf62-154a-4c5b-8d00-066f07c030fa-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-5vn4v\" (UID: \"8691bf62-154a-4c5b-8d00-066f07c030fa\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.947128 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/34686f03-565c-4d7b-a0d5-f3b2d93e77dd-plugin-serving-cert\") pod \"nmstate-console-plugin-7b5ddc4dc7-hx2t5\" (UID: \"34686f03-565c-4d7b-a0d5-f3b2d93e77dd\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.947154 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fb4a8135-7355-406b-a851-dce0109face5-ovs-socket\") pod \"nmstate-handler-rdg86\" (UID: \"fb4a8135-7355-406b-a851-dce0109face5\") " pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.947170 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8nf8\" (UniqueName: \"kubernetes.io/projected/fb4a8135-7355-406b-a851-dce0109face5-kube-api-access-l8nf8\") pod \"nmstate-handler-rdg86\" (UID: \"fb4a8135-7355-406b-a851-dce0109face5\") " pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.947205 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fb4a8135-7355-406b-a851-dce0109face5-nmstate-lock\") pod \"nmstate-handler-rdg86\" (UID: \"fb4a8135-7355-406b-a851-dce0109face5\") " pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:33 crc kubenswrapper[4790]: E0406 12:12:33.947254 4790 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Apr 06 12:12:33 crc kubenswrapper[4790]: E0406 12:12:33.947324 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8691bf62-154a-4c5b-8d00-066f07c030fa-tls-key-pair podName:8691bf62-154a-4c5b-8d00-066f07c030fa 
nodeName:}" failed. No retries permitted until 2026-04-06 12:12:34.447304624 +0000 UTC m=+933.435047490 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/8691bf62-154a-4c5b-8d00-066f07c030fa-tls-key-pair") pod "nmstate-webhook-5f558f5558-5vn4v" (UID: "8691bf62-154a-4c5b-8d00-066f07c030fa") : secret "openshift-nmstate-webhook" not found Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.968111 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsm6k\" (UniqueName: \"kubernetes.io/projected/8691bf62-154a-4c5b-8d00-066f07c030fa-kube-api-access-jsm6k\") pod \"nmstate-webhook-5f558f5558-5vn4v\" (UID: \"8691bf62-154a-4c5b-8d00-066f07c030fa\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v" Apr 06 12:12:33 crc kubenswrapper[4790]: I0406 12:12:33.969552 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c4nn\" (UniqueName: \"kubernetes.io/projected/baf52e27-af4b-4863-9e09-a3f11f497db9-kube-api-access-9c4nn\") pod \"nmstate-metrics-9b8c8685d-khm89\" (UID: \"baf52e27-af4b-4863-9e09-a3f11f497db9\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-khm89" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.048486 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2l6v\" (UniqueName: \"kubernetes.io/projected/34686f03-565c-4d7b-a0d5-f3b2d93e77dd-kube-api-access-n2l6v\") pod \"nmstate-console-plugin-7b5ddc4dc7-hx2t5\" (UID: \"34686f03-565c-4d7b-a0d5-f3b2d93e77dd\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.048595 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fb4a8135-7355-406b-a851-dce0109face5-dbus-socket\") pod \"nmstate-handler-rdg86\" (UID: \"fb4a8135-7355-406b-a851-dce0109face5\") " 
pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.048661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/34686f03-565c-4d7b-a0d5-f3b2d93e77dd-plugin-serving-cert\") pod \"nmstate-console-plugin-7b5ddc4dc7-hx2t5\" (UID: \"34686f03-565c-4d7b-a0d5-f3b2d93e77dd\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.048722 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fb4a8135-7355-406b-a851-dce0109face5-ovs-socket\") pod \"nmstate-handler-rdg86\" (UID: \"fb4a8135-7355-406b-a851-dce0109face5\") " pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.048745 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8nf8\" (UniqueName: \"kubernetes.io/projected/fb4a8135-7355-406b-a851-dce0109face5-kube-api-access-l8nf8\") pod \"nmstate-handler-rdg86\" (UID: \"fb4a8135-7355-406b-a851-dce0109face5\") " pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.048813 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fb4a8135-7355-406b-a851-dce0109face5-nmstate-lock\") pod \"nmstate-handler-rdg86\" (UID: \"fb4a8135-7355-406b-a851-dce0109face5\") " pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.048874 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/34686f03-565c-4d7b-a0d5-f3b2d93e77dd-nginx-conf\") pod \"nmstate-console-plugin-7b5ddc4dc7-hx2t5\" (UID: \"34686f03-565c-4d7b-a0d5-f3b2d93e77dd\") " 
pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.049075 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fb4a8135-7355-406b-a851-dce0109face5-ovs-socket\") pod \"nmstate-handler-rdg86\" (UID: \"fb4a8135-7355-406b-a851-dce0109face5\") " pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.049099 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fb4a8135-7355-406b-a851-dce0109face5-nmstate-lock\") pod \"nmstate-handler-rdg86\" (UID: \"fb4a8135-7355-406b-a851-dce0109face5\") " pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.049376 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fb4a8135-7355-406b-a851-dce0109face5-dbus-socket\") pod \"nmstate-handler-rdg86\" (UID: \"fb4a8135-7355-406b-a851-dce0109face5\") " pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:34 crc kubenswrapper[4790]: E0406 12:12:34.049558 4790 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 06 12:12:34 crc kubenswrapper[4790]: E0406 12:12:34.049676 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34686f03-565c-4d7b-a0d5-f3b2d93e77dd-plugin-serving-cert podName:34686f03-565c-4d7b-a0d5-f3b2d93e77dd nodeName:}" failed. No retries permitted until 2026-04-06 12:12:34.549657623 +0000 UTC m=+933.537400489 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/34686f03-565c-4d7b-a0d5-f3b2d93e77dd-plugin-serving-cert") pod "nmstate-console-plugin-7b5ddc4dc7-hx2t5" (UID: "34686f03-565c-4d7b-a0d5-f3b2d93e77dd") : secret "plugin-serving-cert" not found Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.050311 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/34686f03-565c-4d7b-a0d5-f3b2d93e77dd-nginx-conf\") pod \"nmstate-console-plugin-7b5ddc4dc7-hx2t5\" (UID: \"34686f03-565c-4d7b-a0d5-f3b2d93e77dd\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.070383 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2l6v\" (UniqueName: \"kubernetes.io/projected/34686f03-565c-4d7b-a0d5-f3b2d93e77dd-kube-api-access-n2l6v\") pod \"nmstate-console-plugin-7b5ddc4dc7-hx2t5\" (UID: \"34686f03-565c-4d7b-a0d5-f3b2d93e77dd\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.070919 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8nf8\" (UniqueName: \"kubernetes.io/projected/fb4a8135-7355-406b-a851-dce0109face5-kube-api-access-l8nf8\") pod \"nmstate-handler-rdg86\" (UID: \"fb4a8135-7355-406b-a851-dce0109face5\") " pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.109379 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-khm89" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.126958 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-86455fd4-p5ssg"] Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.127729 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.150139 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1967f497-8bcd-4475-aa8d-2f1114947ca4-trusted-ca-bundle\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.150220 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1967f497-8bcd-4475-aa8d-2f1114947ca4-console-oauth-config\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.150286 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1967f497-8bcd-4475-aa8d-2f1114947ca4-console-serving-cert\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.150323 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1967f497-8bcd-4475-aa8d-2f1114947ca4-console-config\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.150347 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1967f497-8bcd-4475-aa8d-2f1114947ca4-service-ca\") pod 
\"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.150382 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1967f497-8bcd-4475-aa8d-2f1114947ca4-oauth-serving-cert\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.150419 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f22v\" (UniqueName: \"kubernetes.io/projected/1967f497-8bcd-4475-aa8d-2f1114947ca4-kube-api-access-4f22v\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.152925 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86455fd4-p5ssg"] Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.173406 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.252880 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1967f497-8bcd-4475-aa8d-2f1114947ca4-console-oauth-config\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.254113 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1967f497-8bcd-4475-aa8d-2f1114947ca4-console-serving-cert\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.254197 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1967f497-8bcd-4475-aa8d-2f1114947ca4-console-config\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.254225 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1967f497-8bcd-4475-aa8d-2f1114947ca4-service-ca\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.254312 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1967f497-8bcd-4475-aa8d-2f1114947ca4-oauth-serving-cert\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " 
pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.254374 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f22v\" (UniqueName: \"kubernetes.io/projected/1967f497-8bcd-4475-aa8d-2f1114947ca4-kube-api-access-4f22v\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.254446 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1967f497-8bcd-4475-aa8d-2f1114947ca4-trusted-ca-bundle\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.255181 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1967f497-8bcd-4475-aa8d-2f1114947ca4-console-config\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.255216 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1967f497-8bcd-4475-aa8d-2f1114947ca4-oauth-serving-cert\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.255985 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1967f497-8bcd-4475-aa8d-2f1114947ca4-service-ca\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc 
kubenswrapper[4790]: I0406 12:12:34.256951 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1967f497-8bcd-4475-aa8d-2f1114947ca4-trusted-ca-bundle\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.263688 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1967f497-8bcd-4475-aa8d-2f1114947ca4-console-oauth-config\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.263725 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1967f497-8bcd-4475-aa8d-2f1114947ca4-console-serving-cert\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.269870 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f22v\" (UniqueName: \"kubernetes.io/projected/1967f497-8bcd-4475-aa8d-2f1114947ca4-kube-api-access-4f22v\") pod \"console-86455fd4-p5ssg\" (UID: \"1967f497-8bcd-4475-aa8d-2f1114947ca4\") " pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.330133 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-khm89"] Apr 06 12:12:34 crc kubenswrapper[4790]: W0406 12:12:34.337572 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaf52e27_af4b_4863_9e09_a3f11f497db9.slice/crio-34692c663921d0589b95c933627ca2a27beb55c29e9ab8e6c04559963ca5ed93 WatchSource:0}: Error finding container 34692c663921d0589b95c933627ca2a27beb55c29e9ab8e6c04559963ca5ed93: Status 404 returned error can't find the container with id 34692c663921d0589b95c933627ca2a27beb55c29e9ab8e6c04559963ca5ed93 Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.455964 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8691bf62-154a-4c5b-8d00-066f07c030fa-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-5vn4v\" (UID: \"8691bf62-154a-4c5b-8d00-066f07c030fa\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.459499 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8691bf62-154a-4c5b-8d00-066f07c030fa-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-5vn4v\" (UID: \"8691bf62-154a-4c5b-8d00-066f07c030fa\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.471385 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.501202 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-rdg86" event={"ID":"fb4a8135-7355-406b-a851-dce0109face5","Type":"ContainerStarted","Data":"fa8bf7d8619cdec85c0c2819a8203af0bf02fefdcb645ec413f02fbeb09d7c0e"} Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.502303 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-khm89" event={"ID":"baf52e27-af4b-4863-9e09-a3f11f497db9","Type":"ContainerStarted","Data":"34692c663921d0589b95c933627ca2a27beb55c29e9ab8e6c04559963ca5ed93"} Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.557066 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/34686f03-565c-4d7b-a0d5-f3b2d93e77dd-plugin-serving-cert\") pod \"nmstate-console-plugin-7b5ddc4dc7-hx2t5\" (UID: \"34686f03-565c-4d7b-a0d5-f3b2d93e77dd\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.561055 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/34686f03-565c-4d7b-a0d5-f3b2d93e77dd-plugin-serving-cert\") pod \"nmstate-console-plugin-7b5ddc4dc7-hx2t5\" (UID: \"34686f03-565c-4d7b-a0d5-f3b2d93e77dd\") " pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.716528 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.831561 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.844993 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86455fd4-p5ssg"] Apr 06 12:12:34 crc kubenswrapper[4790]: W0406 12:12:34.862436 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1967f497_8bcd_4475_aa8d_2f1114947ca4.slice/crio-af644aa97aa9a93c78d8907cbd5434100a62a6716b3ba61824d09598ffd77b6e WatchSource:0}: Error finding container af644aa97aa9a93c78d8907cbd5434100a62a6716b3ba61824d09598ffd77b6e: Status 404 returned error can't find the container with id af644aa97aa9a93c78d8907cbd5434100a62a6716b3ba61824d09598ffd77b6e Apr 06 12:12:34 crc kubenswrapper[4790]: I0406 12:12:34.884910 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v"] Apr 06 12:12:34 crc kubenswrapper[4790]: W0406 12:12:34.897005 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8691bf62_154a_4c5b_8d00_066f07c030fa.slice/crio-4ef6c7edb66d3b990d01d85c800da54e0a36282b9c54c1eeff9c8857c41b3c60 WatchSource:0}: Error finding container 4ef6c7edb66d3b990d01d85c800da54e0a36282b9c54c1eeff9c8857c41b3c60: Status 404 returned error can't find the container with id 4ef6c7edb66d3b990d01d85c800da54e0a36282b9c54c1eeff9c8857c41b3c60 Apr 06 12:12:35 crc kubenswrapper[4790]: I0406 12:12:35.053585 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5"] Apr 06 12:12:35 crc kubenswrapper[4790]: I0406 12:12:35.511069 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v" 
event={"ID":"8691bf62-154a-4c5b-8d00-066f07c030fa","Type":"ContainerStarted","Data":"4ef6c7edb66d3b990d01d85c800da54e0a36282b9c54c1eeff9c8857c41b3c60"} Apr 06 12:12:35 crc kubenswrapper[4790]: I0406 12:12:35.513212 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86455fd4-p5ssg" event={"ID":"1967f497-8bcd-4475-aa8d-2f1114947ca4","Type":"ContainerStarted","Data":"42a89d1e22976096210e750bb70f5163218dea332a02902f92dce84bc14f4547"} Apr 06 12:12:35 crc kubenswrapper[4790]: I0406 12:12:35.513260 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86455fd4-p5ssg" event={"ID":"1967f497-8bcd-4475-aa8d-2f1114947ca4","Type":"ContainerStarted","Data":"af644aa97aa9a93c78d8907cbd5434100a62a6716b3ba61824d09598ffd77b6e"} Apr 06 12:12:35 crc kubenswrapper[4790]: I0406 12:12:35.516331 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" event={"ID":"34686f03-565c-4d7b-a0d5-f3b2d93e77dd","Type":"ContainerStarted","Data":"794a43bcd2eec7cabb83b78d75063ae7fcca718576a5257685b7f5d279c1a9d4"} Apr 06 12:12:35 crc kubenswrapper[4790]: I0406 12:12:35.540665 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86455fd4-p5ssg" podStartSLOduration=1.540644775 podStartE2EDuration="1.540644775s" podCreationTimestamp="2026-04-06 12:12:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:12:35.539898245 +0000 UTC m=+934.527641121" watchObservedRunningTime="2026-04-06 12:12:35.540644775 +0000 UTC m=+934.528387651" Apr 06 12:12:37 crc kubenswrapper[4790]: I0406 12:12:37.531792 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-khm89" 
event={"ID":"baf52e27-af4b-4863-9e09-a3f11f497db9","Type":"ContainerStarted","Data":"54fbbb7eb99f14a25507e1a9dcc2dac30d910bd82c8a7a362feb94254e973232"} Apr 06 12:12:37 crc kubenswrapper[4790]: I0406 12:12:37.533786 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-rdg86" event={"ID":"fb4a8135-7355-406b-a851-dce0109face5","Type":"ContainerStarted","Data":"e73bb2013f831315dfd10fb56ae1705bdea4e8b327627f788c15fed9fc9984c2"} Apr 06 12:12:37 crc kubenswrapper[4790]: I0406 12:12:37.533934 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:37 crc kubenswrapper[4790]: I0406 12:12:37.535897 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v" event={"ID":"8691bf62-154a-4c5b-8d00-066f07c030fa","Type":"ContainerStarted","Data":"3daed8b0b0a2588cd9580e297989478e5a7ef0690196d6654e647294db52d165"} Apr 06 12:12:37 crc kubenswrapper[4790]: I0406 12:12:37.536061 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v" Apr 06 12:12:37 crc kubenswrapper[4790]: I0406 12:12:37.554974 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-rdg86" podStartSLOduration=2.116613891 podStartE2EDuration="4.55495834s" podCreationTimestamp="2026-04-06 12:12:33 +0000 UTC" firstStartedPulling="2026-04-06 12:12:34.20275066 +0000 UTC m=+933.190493526" lastFinishedPulling="2026-04-06 12:12:36.641095089 +0000 UTC m=+935.628837975" observedRunningTime="2026-04-06 12:12:37.548052521 +0000 UTC m=+936.535795387" watchObservedRunningTime="2026-04-06 12:12:37.55495834 +0000 UTC m=+936.542701206" Apr 06 12:12:37 crc kubenswrapper[4790]: I0406 12:12:37.563853 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v" 
podStartSLOduration=2.819949675 podStartE2EDuration="4.563820062s" podCreationTimestamp="2026-04-06 12:12:33 +0000 UTC" firstStartedPulling="2026-04-06 12:12:34.899142344 +0000 UTC m=+933.886885250" lastFinishedPulling="2026-04-06 12:12:36.643012771 +0000 UTC m=+935.630755637" observedRunningTime="2026-04-06 12:12:37.562367702 +0000 UTC m=+936.550110568" watchObservedRunningTime="2026-04-06 12:12:37.563820062 +0000 UTC m=+936.551562928" Apr 06 12:12:38 crc kubenswrapper[4790]: I0406 12:12:38.563753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" event={"ID":"34686f03-565c-4d7b-a0d5-f3b2d93e77dd","Type":"ContainerStarted","Data":"d4d7981a433fab5045af188f9a519ffc9e1075cb9bb81224e8743bdfa12b12ef"} Apr 06 12:12:38 crc kubenswrapper[4790]: I0406 12:12:38.580538 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7b5ddc4dc7-hx2t5" podStartSLOduration=3.032356313 podStartE2EDuration="5.580516135s" podCreationTimestamp="2026-04-06 12:12:33 +0000 UTC" firstStartedPulling="2026-04-06 12:12:35.067441576 +0000 UTC m=+934.055184442" lastFinishedPulling="2026-04-06 12:12:37.615601388 +0000 UTC m=+936.603344264" observedRunningTime="2026-04-06 12:12:38.576148995 +0000 UTC m=+937.563891881" watchObservedRunningTime="2026-04-06 12:12:38.580516135 +0000 UTC m=+937.568259001" Apr 06 12:12:39 crc kubenswrapper[4790]: I0406 12:12:39.570701 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-khm89" event={"ID":"baf52e27-af4b-4863-9e09-a3f11f497db9","Type":"ContainerStarted","Data":"0ef432a70d6e071388fe8d66b63155ba9b5b3e771d9565febd39bdbdc953f468"} Apr 06 12:12:39 crc kubenswrapper[4790]: I0406 12:12:39.593697 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-khm89" podStartSLOduration=1.563551966 podStartE2EDuration="6.59367237s" 
podCreationTimestamp="2026-04-06 12:12:33 +0000 UTC" firstStartedPulling="2026-04-06 12:12:34.339243362 +0000 UTC m=+933.326986228" lastFinishedPulling="2026-04-06 12:12:39.369363756 +0000 UTC m=+938.357106632" observedRunningTime="2026-04-06 12:12:39.585505377 +0000 UTC m=+938.573248253" watchObservedRunningTime="2026-04-06 12:12:39.59367237 +0000 UTC m=+938.581415236" Apr 06 12:12:44 crc kubenswrapper[4790]: I0406 12:12:44.202049 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-rdg86" Apr 06 12:12:44 crc kubenswrapper[4790]: I0406 12:12:44.472119 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:44 crc kubenswrapper[4790]: I0406 12:12:44.472452 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:44 crc kubenswrapper[4790]: I0406 12:12:44.476786 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:44 crc kubenswrapper[4790]: I0406 12:12:44.616101 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86455fd4-p5ssg" Apr 06 12:12:44 crc kubenswrapper[4790]: I0406 12:12:44.662561 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6db6cf4595-5zmct"] Apr 06 12:12:54 crc kubenswrapper[4790]: I0406 12:12:54.722735 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-5vn4v" Apr 06 12:13:02 crc kubenswrapper[4790]: I0406 12:13:02.899710 4790 scope.go:117] "RemoveContainer" containerID="d8e505868de294deb67b02738f8db679087fa521711cb29c9fc253f51753de24" Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.022808 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj"] Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.024724 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.026657 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.039944 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj"] Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.133891 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzvpd\" (UniqueName: \"kubernetes.io/projected/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-kube-api-access-rzvpd\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj\" (UID: \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.134026 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-util\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj\" (UID: \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.134064 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-bundle\") pod 
\"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj\" (UID: \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.234907 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-util\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj\" (UID: \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.234974 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-bundle\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj\" (UID: \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.235011 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzvpd\" (UniqueName: \"kubernetes.io/projected/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-kube-api-access-rzvpd\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj\" (UID: \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.235496 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-util\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj\" (UID: \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\") " 
pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.235621 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-bundle\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj\" (UID: \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.267599 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzvpd\" (UniqueName: \"kubernetes.io/projected/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-kube-api-access-rzvpd\") pod \"4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj\" (UID: \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\") " pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.350614 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.765483 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj"] Apr 06 12:13:08 crc kubenswrapper[4790]: I0406 12:13:08.814980 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" event={"ID":"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c","Type":"ContainerStarted","Data":"b884525679864e0f28d1f102df5b15fbf146c2c77bdd907a5ed142d5db86afe8"} Apr 06 12:13:09 crc kubenswrapper[4790]: I0406 12:13:09.700288 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6db6cf4595-5zmct" podUID="42775b02-6f50-4862-ae70-7cdb1800baa7" containerName="console" containerID="cri-o://b13df9e73e6137a2a5fb643678883a1dd14a11e203af679e2e49c2e630a571ab" gracePeriod=15 Apr 06 12:13:09 crc kubenswrapper[4790]: I0406 12:13:09.824775 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6db6cf4595-5zmct_42775b02-6f50-4862-ae70-7cdb1800baa7/console/0.log" Apr 06 12:13:09 crc kubenswrapper[4790]: I0406 12:13:09.824863 4790 generic.go:334] "Generic (PLEG): container finished" podID="42775b02-6f50-4862-ae70-7cdb1800baa7" containerID="b13df9e73e6137a2a5fb643678883a1dd14a11e203af679e2e49c2e630a571ab" exitCode=2 Apr 06 12:13:09 crc kubenswrapper[4790]: I0406 12:13:09.824976 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db6cf4595-5zmct" event={"ID":"42775b02-6f50-4862-ae70-7cdb1800baa7","Type":"ContainerDied","Data":"b13df9e73e6137a2a5fb643678883a1dd14a11e203af679e2e49c2e630a571ab"} Apr 06 12:13:09 crc kubenswrapper[4790]: I0406 12:13:09.826800 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="89f6afcc-5eee-48c2-88fe-2bf924bd9a0c" containerID="2b963ecf9e426bfef70027cb8ad543304003c5a2f35b52b540ac49dc524ec505" exitCode=0
Apr 06 12:13:09 crc kubenswrapper[4790]: I0406 12:13:09.826850 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" event={"ID":"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c","Type":"ContainerDied","Data":"2b963ecf9e426bfef70027cb8ad543304003c5a2f35b52b540ac49dc524ec505"}
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.051754 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6db6cf4595-5zmct_42775b02-6f50-4862-ae70-7cdb1800baa7/console/0.log"
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.051818 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.158551 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-oauth-serving-cert\") pod \"42775b02-6f50-4862-ae70-7cdb1800baa7\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") "
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.158790 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42775b02-6f50-4862-ae70-7cdb1800baa7-console-oauth-config\") pod \"42775b02-6f50-4862-ae70-7cdb1800baa7\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") "
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.158814 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsmcb\" (UniqueName: \"kubernetes.io/projected/42775b02-6f50-4862-ae70-7cdb1800baa7-kube-api-access-qsmcb\") pod \"42775b02-6f50-4862-ae70-7cdb1800baa7\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") "
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.158848 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-trusted-ca-bundle\") pod \"42775b02-6f50-4862-ae70-7cdb1800baa7\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") "
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.158885 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-service-ca\") pod \"42775b02-6f50-4862-ae70-7cdb1800baa7\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") "
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.158907 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-console-config\") pod \"42775b02-6f50-4862-ae70-7cdb1800baa7\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") "
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.158948 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42775b02-6f50-4862-ae70-7cdb1800baa7-console-serving-cert\") pod \"42775b02-6f50-4862-ae70-7cdb1800baa7\" (UID: \"42775b02-6f50-4862-ae70-7cdb1800baa7\") "
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.159675 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-service-ca" (OuterVolumeSpecName: "service-ca") pod "42775b02-6f50-4862-ae70-7cdb1800baa7" (UID: "42775b02-6f50-4862-ae70-7cdb1800baa7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.159667 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "42775b02-6f50-4862-ae70-7cdb1800baa7" (UID: "42775b02-6f50-4862-ae70-7cdb1800baa7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.159736 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-console-config" (OuterVolumeSpecName: "console-config") pod "42775b02-6f50-4862-ae70-7cdb1800baa7" (UID: "42775b02-6f50-4862-ae70-7cdb1800baa7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.160033 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "42775b02-6f50-4862-ae70-7cdb1800baa7" (UID: "42775b02-6f50-4862-ae70-7cdb1800baa7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.164174 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42775b02-6f50-4862-ae70-7cdb1800baa7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "42775b02-6f50-4862-ae70-7cdb1800baa7" (UID: "42775b02-6f50-4862-ae70-7cdb1800baa7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.164305 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42775b02-6f50-4862-ae70-7cdb1800baa7-kube-api-access-qsmcb" (OuterVolumeSpecName: "kube-api-access-qsmcb") pod "42775b02-6f50-4862-ae70-7cdb1800baa7" (UID: "42775b02-6f50-4862-ae70-7cdb1800baa7"). InnerVolumeSpecName "kube-api-access-qsmcb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.164484 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42775b02-6f50-4862-ae70-7cdb1800baa7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "42775b02-6f50-4862-ae70-7cdb1800baa7" (UID: "42775b02-6f50-4862-ae70-7cdb1800baa7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.260423 4790 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.260459 4790 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42775b02-6f50-4862-ae70-7cdb1800baa7-console-oauth-config\") on node \"crc\" DevicePath \"\""
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.260470 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsmcb\" (UniqueName: \"kubernetes.io/projected/42775b02-6f50-4862-ae70-7cdb1800baa7-kube-api-access-qsmcb\") on node \"crc\" DevicePath \"\""
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.260479 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.260488 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-service-ca\") on node \"crc\" DevicePath \"\""
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.260497 4790 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42775b02-6f50-4862-ae70-7cdb1800baa7-console-config\") on node \"crc\" DevicePath \"\""
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.260506 4790 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42775b02-6f50-4862-ae70-7cdb1800baa7-console-serving-cert\") on node \"crc\" DevicePath \"\""
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.839432 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6db6cf4595-5zmct_42775b02-6f50-4862-ae70-7cdb1800baa7/console/0.log"
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.839927 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db6cf4595-5zmct" event={"ID":"42775b02-6f50-4862-ae70-7cdb1800baa7","Type":"ContainerDied","Data":"77c17226fb6e866918ae83a6fe99966b7b948bb7bb891e2749b73bb09f2a63be"}
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.839984 4790 scope.go:117] "RemoveContainer" containerID="b13df9e73e6137a2a5fb643678883a1dd14a11e203af679e2e49c2e630a571ab"
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.840048 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6db6cf4595-5zmct"
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.888476 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6db6cf4595-5zmct"]
Apr 06 12:13:10 crc kubenswrapper[4790]: I0406 12:13:10.891979 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6db6cf4595-5zmct"]
Apr 06 12:13:11 crc kubenswrapper[4790]: I0406 12:13:11.688400 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42775b02-6f50-4862-ae70-7cdb1800baa7" path="/var/lib/kubelet/pods/42775b02-6f50-4862-ae70-7cdb1800baa7/volumes"
Apr 06 12:13:11 crc kubenswrapper[4790]: I0406 12:13:11.861580 4790 generic.go:334] "Generic (PLEG): container finished" podID="89f6afcc-5eee-48c2-88fe-2bf924bd9a0c" containerID="d20c8e3d25bf170195ee25347572a0a594c7da9e6793d21c73af752c863df465" exitCode=0
Apr 06 12:13:11 crc kubenswrapper[4790]: I0406 12:13:11.861645 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" event={"ID":"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c","Type":"ContainerDied","Data":"d20c8e3d25bf170195ee25347572a0a594c7da9e6793d21c73af752c863df465"}
Apr 06 12:13:12 crc kubenswrapper[4790]: I0406 12:13:12.874478 4790 generic.go:334] "Generic (PLEG): container finished" podID="89f6afcc-5eee-48c2-88fe-2bf924bd9a0c" containerID="b1a1fa73d5e67e897e8f26ec68d99f26f22efb6e39a93ea14365841a33364965" exitCode=0
Apr 06 12:13:12 crc kubenswrapper[4790]: I0406 12:13:12.874610 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" event={"ID":"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c","Type":"ContainerDied","Data":"b1a1fa73d5e67e897e8f26ec68d99f26f22efb6e39a93ea14365841a33364965"}
Apr 06 12:13:14 crc kubenswrapper[4790]: I0406 12:13:14.178383 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj"
Apr 06 12:13:14 crc kubenswrapper[4790]: I0406 12:13:14.327067 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzvpd\" (UniqueName: \"kubernetes.io/projected/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-kube-api-access-rzvpd\") pod \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\" (UID: \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\") "
Apr 06 12:13:14 crc kubenswrapper[4790]: I0406 12:13:14.327306 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-bundle\") pod \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\" (UID: \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\") "
Apr 06 12:13:14 crc kubenswrapper[4790]: I0406 12:13:14.327348 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-util\") pod \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\" (UID: \"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c\") "
Apr 06 12:13:14 crc kubenswrapper[4790]: I0406 12:13:14.328560 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-bundle" (OuterVolumeSpecName: "bundle") pod "89f6afcc-5eee-48c2-88fe-2bf924bd9a0c" (UID: "89f6afcc-5eee-48c2-88fe-2bf924bd9a0c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:13:14 crc kubenswrapper[4790]: I0406 12:13:14.332983 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-kube-api-access-rzvpd" (OuterVolumeSpecName: "kube-api-access-rzvpd") pod "89f6afcc-5eee-48c2-88fe-2bf924bd9a0c" (UID: "89f6afcc-5eee-48c2-88fe-2bf924bd9a0c"). InnerVolumeSpecName "kube-api-access-rzvpd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:13:14 crc kubenswrapper[4790]: I0406 12:13:14.344142 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-util" (OuterVolumeSpecName: "util") pod "89f6afcc-5eee-48c2-88fe-2bf924bd9a0c" (UID: "89f6afcc-5eee-48c2-88fe-2bf924bd9a0c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:13:14 crc kubenswrapper[4790]: I0406 12:13:14.428787 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-bundle\") on node \"crc\" DevicePath \"\""
Apr 06 12:13:14 crc kubenswrapper[4790]: I0406 12:13:14.428853 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-util\") on node \"crc\" DevicePath \"\""
Apr 06 12:13:14 crc kubenswrapper[4790]: I0406 12:13:14.428867 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzvpd\" (UniqueName: \"kubernetes.io/projected/89f6afcc-5eee-48c2-88fe-2bf924bd9a0c-kube-api-access-rzvpd\") on node \"crc\" DevicePath \"\""
Apr 06 12:13:14 crc kubenswrapper[4790]: I0406 12:13:14.896697 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj"
Apr 06 12:13:14 crc kubenswrapper[4790]: I0406 12:13:14.896588 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj" event={"ID":"89f6afcc-5eee-48c2-88fe-2bf924bd9a0c","Type":"ContainerDied","Data":"b884525679864e0f28d1f102df5b15fbf146c2c77bdd907a5ed142d5db86afe8"}
Apr 06 12:13:14 crc kubenswrapper[4790]: I0406 12:13:14.900092 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b884525679864e0f28d1f102df5b15fbf146c2c77bdd907a5ed142d5db86afe8"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.070357 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"]
Apr 06 12:13:22 crc kubenswrapper[4790]: E0406 12:13:22.071197 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f6afcc-5eee-48c2-88fe-2bf924bd9a0c" containerName="extract"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.071213 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f6afcc-5eee-48c2-88fe-2bf924bd9a0c" containerName="extract"
Apr 06 12:13:22 crc kubenswrapper[4790]: E0406 12:13:22.071228 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42775b02-6f50-4862-ae70-7cdb1800baa7" containerName="console"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.071235 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="42775b02-6f50-4862-ae70-7cdb1800baa7" containerName="console"
Apr 06 12:13:22 crc kubenswrapper[4790]: E0406 12:13:22.071249 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f6afcc-5eee-48c2-88fe-2bf924bd9a0c" containerName="pull"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.071256 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f6afcc-5eee-48c2-88fe-2bf924bd9a0c" containerName="pull"
Apr 06 12:13:22 crc kubenswrapper[4790]: E0406 12:13:22.071273 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f6afcc-5eee-48c2-88fe-2bf924bd9a0c" containerName="util"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.071280 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f6afcc-5eee-48c2-88fe-2bf924bd9a0c" containerName="util"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.071411 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="42775b02-6f50-4862-ae70-7cdb1800baa7" containerName="console"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.071428 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="89f6afcc-5eee-48c2-88fe-2bf924bd9a0c" containerName="extract"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.071916 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.075064 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.075095 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.075170 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.075307 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.075469 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-xwxwg"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.082029 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"]
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.230095 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnzx4\" (UniqueName: \"kubernetes.io/projected/575f4b30-b909-40a3-aeee-d31e6b9238d5-kube-api-access-jnzx4\") pod \"metallb-operator-controller-manager-cc74d7bc4-98drt\" (UID: \"575f4b30-b909-40a3-aeee-d31e6b9238d5\") " pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.230176 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/575f4b30-b909-40a3-aeee-d31e6b9238d5-apiservice-cert\") pod \"metallb-operator-controller-manager-cc74d7bc4-98drt\" (UID: \"575f4b30-b909-40a3-aeee-d31e6b9238d5\") " pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.230250 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/575f4b30-b909-40a3-aeee-d31e6b9238d5-webhook-cert\") pod \"metallb-operator-controller-manager-cc74d7bc4-98drt\" (UID: \"575f4b30-b909-40a3-aeee-d31e6b9238d5\") " pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.303502 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"]
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.304372 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.307694 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.308924 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-64xbr"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.309140 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.331176 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnzx4\" (UniqueName: \"kubernetes.io/projected/575f4b30-b909-40a3-aeee-d31e6b9238d5-kube-api-access-jnzx4\") pod \"metallb-operator-controller-manager-cc74d7bc4-98drt\" (UID: \"575f4b30-b909-40a3-aeee-d31e6b9238d5\") " pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.331237 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/575f4b30-b909-40a3-aeee-d31e6b9238d5-apiservice-cert\") pod \"metallb-operator-controller-manager-cc74d7bc4-98drt\" (UID: \"575f4b30-b909-40a3-aeee-d31e6b9238d5\") " pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.331273 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/575f4b30-b909-40a3-aeee-d31e6b9238d5-webhook-cert\") pod \"metallb-operator-controller-manager-cc74d7bc4-98drt\" (UID: \"575f4b30-b909-40a3-aeee-d31e6b9238d5\") " pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.333044 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"]
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.338751 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/575f4b30-b909-40a3-aeee-d31e6b9238d5-webhook-cert\") pod \"metallb-operator-controller-manager-cc74d7bc4-98drt\" (UID: \"575f4b30-b909-40a3-aeee-d31e6b9238d5\") " pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.340191 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/575f4b30-b909-40a3-aeee-d31e6b9238d5-apiservice-cert\") pod \"metallb-operator-controller-manager-cc74d7bc4-98drt\" (UID: \"575f4b30-b909-40a3-aeee-d31e6b9238d5\") " pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.358682 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnzx4\" (UniqueName: \"kubernetes.io/projected/575f4b30-b909-40a3-aeee-d31e6b9238d5-kube-api-access-jnzx4\") pod \"metallb-operator-controller-manager-cc74d7bc4-98drt\" (UID: \"575f4b30-b909-40a3-aeee-d31e6b9238d5\") " pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.421200 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.432804 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d56b74e9-8981-420d-81d5-4b1a20286d52-apiservice-cert\") pod \"metallb-operator-webhook-server-54f9f8cc7-p9gpk\" (UID: \"d56b74e9-8981-420d-81d5-4b1a20286d52\") " pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.432904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swvxt\" (UniqueName: \"kubernetes.io/projected/d56b74e9-8981-420d-81d5-4b1a20286d52-kube-api-access-swvxt\") pod \"metallb-operator-webhook-server-54f9f8cc7-p9gpk\" (UID: \"d56b74e9-8981-420d-81d5-4b1a20286d52\") " pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"
Apr 06 12:13:22 crc kubenswrapper[4790]: I0406 12:13:22.432924 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d56b74e9-8981-420d-81d5-4b1a20286d52-webhook-cert\") pod \"metallb-operator-webhook-server-54f9f8cc7-p9gpk\" (UID: \"d56b74e9-8981-420d-81d5-4b1a20286d52\") " pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"
Apr 06 12:13:23 crc kubenswrapper[4790]: I0406 12:13:22.534415 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d56b74e9-8981-420d-81d5-4b1a20286d52-apiservice-cert\") pod \"metallb-operator-webhook-server-54f9f8cc7-p9gpk\" (UID: \"d56b74e9-8981-420d-81d5-4b1a20286d52\") " pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"
Apr 06 12:13:23 crc kubenswrapper[4790]: I0406 12:13:22.534717 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d56b74e9-8981-420d-81d5-4b1a20286d52-webhook-cert\") pod \"metallb-operator-webhook-server-54f9f8cc7-p9gpk\" (UID: \"d56b74e9-8981-420d-81d5-4b1a20286d52\") " pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"
Apr 06 12:13:23 crc kubenswrapper[4790]: I0406 12:13:22.534736 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swvxt\" (UniqueName: \"kubernetes.io/projected/d56b74e9-8981-420d-81d5-4b1a20286d52-kube-api-access-swvxt\") pod \"metallb-operator-webhook-server-54f9f8cc7-p9gpk\" (UID: \"d56b74e9-8981-420d-81d5-4b1a20286d52\") " pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"
Apr 06 12:13:23 crc kubenswrapper[4790]: I0406 12:13:22.547764 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d56b74e9-8981-420d-81d5-4b1a20286d52-webhook-cert\") pod \"metallb-operator-webhook-server-54f9f8cc7-p9gpk\" (UID: \"d56b74e9-8981-420d-81d5-4b1a20286d52\") " pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"
Apr 06 12:13:23 crc kubenswrapper[4790]: I0406 12:13:22.548734 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d56b74e9-8981-420d-81d5-4b1a20286d52-apiservice-cert\") pod \"metallb-operator-webhook-server-54f9f8cc7-p9gpk\" (UID: \"d56b74e9-8981-420d-81d5-4b1a20286d52\") " pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"
Apr 06 12:13:23 crc kubenswrapper[4790]: I0406 12:13:22.555059 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swvxt\" (UniqueName: \"kubernetes.io/projected/d56b74e9-8981-420d-81d5-4b1a20286d52-kube-api-access-swvxt\") pod \"metallb-operator-webhook-server-54f9f8cc7-p9gpk\" (UID: \"d56b74e9-8981-420d-81d5-4b1a20286d52\") " pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"
Apr 06 12:13:23 crc kubenswrapper[4790]: I0406 12:13:22.619329 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"
Apr 06 12:13:23 crc kubenswrapper[4790]: I0406 12:13:23.398932 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"]
Apr 06 12:13:23 crc kubenswrapper[4790]: W0406 12:13:23.435783 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod575f4b30_b909_40a3_aeee_d31e6b9238d5.slice/crio-eeabbf21588e01c1307051a326a9effce301a30f6032aa747a2d39ec93ed6c90 WatchSource:0}: Error finding container eeabbf21588e01c1307051a326a9effce301a30f6032aa747a2d39ec93ed6c90: Status 404 returned error can't find the container with id eeabbf21588e01c1307051a326a9effce301a30f6032aa747a2d39ec93ed6c90
Apr 06 12:13:23 crc kubenswrapper[4790]: I0406 12:13:23.645030 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"]
Apr 06 12:13:23 crc kubenswrapper[4790]: W0406 12:13:23.652651 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd56b74e9_8981_420d_81d5_4b1a20286d52.slice/crio-dba64366fdad9561e8c26b4268d0e6f019fc82c2c9efb66697a8be8f4adc0c0d WatchSource:0}: Error finding container dba64366fdad9561e8c26b4268d0e6f019fc82c2c9efb66697a8be8f4adc0c0d: Status 404 returned error can't find the container with id dba64366fdad9561e8c26b4268d0e6f019fc82c2c9efb66697a8be8f4adc0c0d
Apr 06 12:13:23 crc kubenswrapper[4790]: I0406 12:13:23.962773 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk" event={"ID":"d56b74e9-8981-420d-81d5-4b1a20286d52","Type":"ContainerStarted","Data":"dba64366fdad9561e8c26b4268d0e6f019fc82c2c9efb66697a8be8f4adc0c0d"}
Apr 06 12:13:23 crc kubenswrapper[4790]: I0406 12:13:23.976365 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt" event={"ID":"575f4b30-b909-40a3-aeee-d31e6b9238d5","Type":"ContainerStarted","Data":"eeabbf21588e01c1307051a326a9effce301a30f6032aa747a2d39ec93ed6c90"}
Apr 06 12:13:27 crc kubenswrapper[4790]: I0406 12:13:27.003321 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt" event={"ID":"575f4b30-b909-40a3-aeee-d31e6b9238d5","Type":"ContainerStarted","Data":"0d97fd623d8df5bd04d796cd0326f88285efb6b42948565a44793a0499a964b4"}
Apr 06 12:13:27 crc kubenswrapper[4790]: I0406 12:13:27.003915 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"
Apr 06 12:13:27 crc kubenswrapper[4790]: I0406 12:13:27.038647 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt" podStartSLOduration=1.7255502950000001 podStartE2EDuration="5.036819021s" podCreationTimestamp="2026-04-06 12:13:22 +0000 UTC" firstStartedPulling="2026-04-06 12:13:23.442348961 +0000 UTC m=+982.430091827" lastFinishedPulling="2026-04-06 12:13:26.753617687 +0000 UTC m=+985.741360553" observedRunningTime="2026-04-06 12:13:27.031919288 +0000 UTC m=+986.019662154" watchObservedRunningTime="2026-04-06 12:13:27.036819021 +0000 UTC m=+986.024561887"
Apr 06 12:13:29 crc kubenswrapper[4790]: I0406 12:13:29.016358 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk" event={"ID":"d56b74e9-8981-420d-81d5-4b1a20286d52","Type":"ContainerStarted","Data":"24c46d2a009a1e2090a59c6b626345dbd90df69e0b5a2ffaf9540ae139be656c"}
Apr 06 12:13:29 crc kubenswrapper[4790]: I0406 12:13:29.017562 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"
Apr 06 12:13:29 crc kubenswrapper[4790]: I0406 12:13:29.037698 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk" podStartSLOduration=2.180739933 podStartE2EDuration="7.037681665s" podCreationTimestamp="2026-04-06 12:13:22 +0000 UTC" firstStartedPulling="2026-04-06 12:13:23.656644625 +0000 UTC m=+982.644387501" lastFinishedPulling="2026-04-06 12:13:28.513586367 +0000 UTC m=+987.501329233" observedRunningTime="2026-04-06 12:13:29.035876276 +0000 UTC m=+988.023619152" watchObservedRunningTime="2026-04-06 12:13:29.037681665 +0000 UTC m=+988.025424531"
Apr 06 12:13:42 crc kubenswrapper[4790]: I0406 12:13:42.623744 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-54f9f8cc7-p9gpk"
Apr 06 12:14:00 crc kubenswrapper[4790]: I0406 12:14:00.142855 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591294-25dml"]
Apr 06 12:14:00 crc kubenswrapper[4790]: I0406 12:14:00.144251 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591294-25dml"
Apr 06 12:14:00 crc kubenswrapper[4790]: I0406 12:14:00.146722 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6"
Apr 06 12:14:00 crc kubenswrapper[4790]: I0406 12:14:00.146921 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 06 12:14:00 crc kubenswrapper[4790]: I0406 12:14:00.149571 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 06 12:14:00 crc kubenswrapper[4790]: I0406 12:14:00.154202 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591294-25dml"]
Apr 06 12:14:00 crc kubenswrapper[4790]: I0406 12:14:00.235058 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz4m7\" (UniqueName: \"kubernetes.io/projected/bb6d2231-f726-41d3-b2ff-1bae2b437000-kube-api-access-hz4m7\") pod \"auto-csr-approver-29591294-25dml\" (UID: \"bb6d2231-f726-41d3-b2ff-1bae2b437000\") " pod="openshift-infra/auto-csr-approver-29591294-25dml"
Apr 06 12:14:00 crc kubenswrapper[4790]: I0406 12:14:00.336624 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz4m7\" (UniqueName: \"kubernetes.io/projected/bb6d2231-f726-41d3-b2ff-1bae2b437000-kube-api-access-hz4m7\") pod \"auto-csr-approver-29591294-25dml\" (UID: \"bb6d2231-f726-41d3-b2ff-1bae2b437000\") " pod="openshift-infra/auto-csr-approver-29591294-25dml"
Apr 06 12:14:00 crc kubenswrapper[4790]: I0406 12:14:00.360564 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz4m7\" (UniqueName: \"kubernetes.io/projected/bb6d2231-f726-41d3-b2ff-1bae2b437000-kube-api-access-hz4m7\") pod \"auto-csr-approver-29591294-25dml\" (UID: \"bb6d2231-f726-41d3-b2ff-1bae2b437000\") " pod="openshift-infra/auto-csr-approver-29591294-25dml"
Apr 06 12:14:00 crc kubenswrapper[4790]: I0406 12:14:00.464987 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591294-25dml"
Apr 06 12:14:00 crc kubenswrapper[4790]: I0406 12:14:00.684126 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591294-25dml"]
Apr 06 12:14:01 crc kubenswrapper[4790]: I0406 12:14:01.249050 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591294-25dml" event={"ID":"bb6d2231-f726-41d3-b2ff-1bae2b437000","Type":"ContainerStarted","Data":"347b49848eda1484fc9d1475b88b8e724314b1da4c811f1bd6c4181262d71c40"}
Apr 06 12:14:02 crc kubenswrapper[4790]: I0406 12:14:02.256161 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591294-25dml" event={"ID":"bb6d2231-f726-41d3-b2ff-1bae2b437000","Type":"ContainerStarted","Data":"8e977830766d57bcc1fe1f162f2b966a9e00e277c56ed8f6d3583bcc714a0077"}
Apr 06 12:14:02 crc kubenswrapper[4790]: I0406 12:14:02.273148 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591294-25dml" podStartSLOduration=1.140054072 podStartE2EDuration="2.273128242s" podCreationTimestamp="2026-04-06 12:14:00 +0000 UTC" firstStartedPulling="2026-04-06 12:14:00.688444529 +0000 UTC m=+1019.676187395" lastFinishedPulling="2026-04-06 12:14:01.821518659 +0000 UTC m=+1020.809261565" observedRunningTime="2026-04-06 12:14:02.267814338 +0000 UTC m=+1021.255557214" watchObservedRunningTime="2026-04-06 12:14:02.273128242 +0000 UTC m=+1021.260871118"
Apr 06 12:14:02 crc kubenswrapper[4790]: I0406 12:14:02.424745 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-cc74d7bc4-98drt"
Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.135724 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-kv7hm"]
Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.139754 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kv7hm"
Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.141328 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-dc9xx"
Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.141790 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.141953 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.142162 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c"]
Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.143157 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.145230 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.157193 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c"] Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.174260 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dd0b643-0c2a-467b-9267-caaba887289b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-j6z5c\" (UID: \"7dd0b643-0c2a-467b-9267-caaba887289b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.174336 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/64824d1f-c308-4727-8209-26463699ba84-frr-conf\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.174352 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/64824d1f-c308-4727-8209-26463699ba84-reloader\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.174382 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64824d1f-c308-4727-8209-26463699ba84-metrics-certs\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc 
kubenswrapper[4790]: I0406 12:14:03.174417 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/64824d1f-c308-4727-8209-26463699ba84-frr-sockets\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.174436 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/64824d1f-c308-4727-8209-26463699ba84-frr-startup\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.174547 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/64824d1f-c308-4727-8209-26463699ba84-metrics\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.174686 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn7pm\" (UniqueName: \"kubernetes.io/projected/64824d1f-c308-4727-8209-26463699ba84-kube-api-access-jn7pm\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.174775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbrv9\" (UniqueName: \"kubernetes.io/projected/7dd0b643-0c2a-467b-9267-caaba887289b-kube-api-access-tbrv9\") pod \"frr-k8s-webhook-server-bcc4b6f68-j6z5c\" (UID: \"7dd0b643-0c2a-467b-9267-caaba887289b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 
12:14:03.217217 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zlz5n"] Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.218290 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zlz5n" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.220151 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-lhntv" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.220720 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.221099 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.224982 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.236788 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5bb64cd5d7-lfmpj"] Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.238034 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bb64cd5d7-lfmpj" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.241782 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.253915 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bb64cd5d7-lfmpj"] Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.265104 4790 generic.go:334] "Generic (PLEG): container finished" podID="bb6d2231-f726-41d3-b2ff-1bae2b437000" containerID="8e977830766d57bcc1fe1f162f2b966a9e00e277c56ed8f6d3583bcc714a0077" exitCode=0 Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.265150 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591294-25dml" event={"ID":"bb6d2231-f726-41d3-b2ff-1bae2b437000","Type":"ContainerDied","Data":"8e977830766d57bcc1fe1f162f2b966a9e00e277c56ed8f6d3583bcc714a0077"} Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.277821 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/64824d1f-c308-4727-8209-26463699ba84-frr-sockets\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.277948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/64824d1f-c308-4727-8209-26463699ba84-frr-startup\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.277977 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/64824d1f-c308-4727-8209-26463699ba84-metrics\") pod \"frr-k8s-kv7hm\" (UID: 
\"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.278014 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn7pm\" (UniqueName: \"kubernetes.io/projected/64824d1f-c308-4727-8209-26463699ba84-kube-api-access-jn7pm\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.278036 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb497468-6169-47da-879b-96e49435e345-cert\") pod \"controller-5bb64cd5d7-lfmpj\" (UID: \"fb497468-6169-47da-879b-96e49435e345\") " pod="metallb-system/controller-5bb64cd5d7-lfmpj" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.278054 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wsrg\" (UniqueName: \"kubernetes.io/projected/fb497468-6169-47da-879b-96e49435e345-kube-api-access-9wsrg\") pod \"controller-5bb64cd5d7-lfmpj\" (UID: \"fb497468-6169-47da-879b-96e49435e345\") " pod="metallb-system/controller-5bb64cd5d7-lfmpj" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.278075 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j97m\" (UniqueName: \"kubernetes.io/projected/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-kube-api-access-4j97m\") pod \"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.278094 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb497468-6169-47da-879b-96e49435e345-metrics-certs\") pod \"controller-5bb64cd5d7-lfmpj\" (UID: 
\"fb497468-6169-47da-879b-96e49435e345\") " pod="metallb-system/controller-5bb64cd5d7-lfmpj" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.278119 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbrv9\" (UniqueName: \"kubernetes.io/projected/7dd0b643-0c2a-467b-9267-caaba887289b-kube-api-access-tbrv9\") pod \"frr-k8s-webhook-server-bcc4b6f68-j6z5c\" (UID: \"7dd0b643-0c2a-467b-9267-caaba887289b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.278138 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dd0b643-0c2a-467b-9267-caaba887289b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-j6z5c\" (UID: \"7dd0b643-0c2a-467b-9267-caaba887289b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.278152 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-metrics-certs\") pod \"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.278167 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/64824d1f-c308-4727-8209-26463699ba84-frr-conf\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.278181 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/64824d1f-c308-4727-8209-26463699ba84-reloader\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 
06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.278196 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-memberlist\") pod \"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.278221 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64824d1f-c308-4727-8209-26463699ba84-metrics-certs\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.278235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-metallb-excludel2\") pod \"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.278681 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/64824d1f-c308-4727-8209-26463699ba84-frr-sockets\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.279179 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/64824d1f-c308-4727-8209-26463699ba84-frr-conf\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.279203 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/64824d1f-c308-4727-8209-26463699ba84-reloader\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: E0406 12:14:03.279282 4790 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Apr 06 12:14:03 crc kubenswrapper[4790]: E0406 12:14:03.279331 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64824d1f-c308-4727-8209-26463699ba84-metrics-certs podName:64824d1f-c308-4727-8209-26463699ba84 nodeName:}" failed. No retries permitted until 2026-04-06 12:14:03.779316201 +0000 UTC m=+1022.767059067 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64824d1f-c308-4727-8209-26463699ba84-metrics-certs") pod "frr-k8s-kv7hm" (UID: "64824d1f-c308-4727-8209-26463699ba84") : secret "frr-k8s-certs-secret" not found Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.279758 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/64824d1f-c308-4727-8209-26463699ba84-frr-startup\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.282562 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/64824d1f-c308-4727-8209-26463699ba84-metrics\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.292892 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dd0b643-0c2a-467b-9267-caaba887289b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-j6z5c\" (UID: 
\"7dd0b643-0c2a-467b-9267-caaba887289b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.303651 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbrv9\" (UniqueName: \"kubernetes.io/projected/7dd0b643-0c2a-467b-9267-caaba887289b-kube-api-access-tbrv9\") pod \"frr-k8s-webhook-server-bcc4b6f68-j6z5c\" (UID: \"7dd0b643-0c2a-467b-9267-caaba887289b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.304248 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn7pm\" (UniqueName: \"kubernetes.io/projected/64824d1f-c308-4727-8209-26463699ba84-kube-api-access-jn7pm\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.379331 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb497468-6169-47da-879b-96e49435e345-metrics-certs\") pod \"controller-5bb64cd5d7-lfmpj\" (UID: \"fb497468-6169-47da-879b-96e49435e345\") " pod="metallb-system/controller-5bb64cd5d7-lfmpj" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.379678 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-metrics-certs\") pod \"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:03 crc kubenswrapper[4790]: E0406 12:14:03.379520 4790 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.379774 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-memberlist\") pod \"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:03 crc kubenswrapper[4790]: E0406 12:14:03.379803 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb497468-6169-47da-879b-96e49435e345-metrics-certs podName:fb497468-6169-47da-879b-96e49435e345 nodeName:}" failed. No retries permitted until 2026-04-06 12:14:03.879781406 +0000 UTC m=+1022.867524272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb497468-6169-47da-879b-96e49435e345-metrics-certs") pod "controller-5bb64cd5d7-lfmpj" (UID: "fb497468-6169-47da-879b-96e49435e345") : secret "controller-certs-secret" not found Apr 06 12:14:03 crc kubenswrapper[4790]: E0406 12:14:03.379818 4790 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Apr 06 12:14:03 crc kubenswrapper[4790]: E0406 12:14:03.379878 4790 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Apr 06 12:14:03 crc kubenswrapper[4790]: E0406 12:14:03.379885 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-metrics-certs podName:5ca71ea5-d4c2-497d-945e-ac51b1fbf618 nodeName:}" failed. No retries permitted until 2026-04-06 12:14:03.879869169 +0000 UTC m=+1022.867612025 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-metrics-certs") pod "speaker-zlz5n" (UID: "5ca71ea5-d4c2-497d-945e-ac51b1fbf618") : secret "speaker-certs-secret" not found Apr 06 12:14:03 crc kubenswrapper[4790]: E0406 12:14:03.379910 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-memberlist podName:5ca71ea5-d4c2-497d-945e-ac51b1fbf618 nodeName:}" failed. No retries permitted until 2026-04-06 12:14:03.87989982 +0000 UTC m=+1022.867642686 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-memberlist") pod "speaker-zlz5n" (UID: "5ca71ea5-d4c2-497d-945e-ac51b1fbf618") : secret "metallb-memberlist" not found Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.379902 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-metallb-excludel2\") pod \"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.379998 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb497468-6169-47da-879b-96e49435e345-cert\") pod \"controller-5bb64cd5d7-lfmpj\" (UID: \"fb497468-6169-47da-879b-96e49435e345\") " pod="metallb-system/controller-5bb64cd5d7-lfmpj" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.380019 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wsrg\" (UniqueName: \"kubernetes.io/projected/fb497468-6169-47da-879b-96e49435e345-kube-api-access-9wsrg\") pod \"controller-5bb64cd5d7-lfmpj\" (UID: \"fb497468-6169-47da-879b-96e49435e345\") " 
pod="metallb-system/controller-5bb64cd5d7-lfmpj" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.380039 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j97m\" (UniqueName: \"kubernetes.io/projected/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-kube-api-access-4j97m\") pod \"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.380474 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-metallb-excludel2\") pod \"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.382130 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.396396 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fb497468-6169-47da-879b-96e49435e345-cert\") pod \"controller-5bb64cd5d7-lfmpj\" (UID: \"fb497468-6169-47da-879b-96e49435e345\") " pod="metallb-system/controller-5bb64cd5d7-lfmpj" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.399127 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wsrg\" (UniqueName: \"kubernetes.io/projected/fb497468-6169-47da-879b-96e49435e345-kube-api-access-9wsrg\") pod \"controller-5bb64cd5d7-lfmpj\" (UID: \"fb497468-6169-47da-879b-96e49435e345\") " pod="metallb-system/controller-5bb64cd5d7-lfmpj" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.399236 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j97m\" (UniqueName: \"kubernetes.io/projected/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-kube-api-access-4j97m\") pod 
\"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.466657 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.645752 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c"] Apr 06 12:14:03 crc kubenswrapper[4790]: W0406 12:14:03.646288 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dd0b643_0c2a_467b_9267_caaba887289b.slice/crio-4f55b1d041fa93267c08cff7d4b5ff95201ae3e051928cc989fef3e248b671d4 WatchSource:0}: Error finding container 4f55b1d041fa93267c08cff7d4b5ff95201ae3e051928cc989fef3e248b671d4: Status 404 returned error can't find the container with id 4f55b1d041fa93267c08cff7d4b5ff95201ae3e051928cc989fef3e248b671d4 Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.785232 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64824d1f-c308-4727-8209-26463699ba84-metrics-certs\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.789521 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64824d1f-c308-4727-8209-26463699ba84-metrics-certs\") pod \"frr-k8s-kv7hm\" (UID: \"64824d1f-c308-4727-8209-26463699ba84\") " pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.887002 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb497468-6169-47da-879b-96e49435e345-metrics-certs\") pod 
\"controller-5bb64cd5d7-lfmpj\" (UID: \"fb497468-6169-47da-879b-96e49435e345\") " pod="metallb-system/controller-5bb64cd5d7-lfmpj" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.887063 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-metrics-certs\") pod \"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.887094 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-memberlist\") pod \"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:03 crc kubenswrapper[4790]: E0406 12:14:03.887283 4790 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Apr 06 12:14:03 crc kubenswrapper[4790]: E0406 12:14:03.887339 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-memberlist podName:5ca71ea5-d4c2-497d-945e-ac51b1fbf618 nodeName:}" failed. No retries permitted until 2026-04-06 12:14:04.887320966 +0000 UTC m=+1023.875063842 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-memberlist") pod "speaker-zlz5n" (UID: "5ca71ea5-d4c2-497d-945e-ac51b1fbf618") : secret "metallb-memberlist" not found Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.890251 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb497468-6169-47da-879b-96e49435e345-metrics-certs\") pod \"controller-5bb64cd5d7-lfmpj\" (UID: \"fb497468-6169-47da-879b-96e49435e345\") " pod="metallb-system/controller-5bb64cd5d7-lfmpj" Apr 06 12:14:03 crc kubenswrapper[4790]: I0406 12:14:03.890739 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-metrics-certs\") pod \"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:04 crc kubenswrapper[4790]: I0406 12:14:04.057135 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:04 crc kubenswrapper[4790]: I0406 12:14:04.150752 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bb64cd5d7-lfmpj" Apr 06 12:14:04 crc kubenswrapper[4790]: I0406 12:14:04.308084 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kv7hm" event={"ID":"64824d1f-c308-4727-8209-26463699ba84","Type":"ContainerStarted","Data":"584d9a96341f0d35015f03f9c1edb270d12e336b66256b8fccf812c2d61cfc32"} Apr 06 12:14:04 crc kubenswrapper[4790]: I0406 12:14:04.311923 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c" event={"ID":"7dd0b643-0c2a-467b-9267-caaba887289b","Type":"ContainerStarted","Data":"4f55b1d041fa93267c08cff7d4b5ff95201ae3e051928cc989fef3e248b671d4"} Apr 06 12:14:04 crc kubenswrapper[4790]: I0406 12:14:04.415853 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bb64cd5d7-lfmpj"] Apr 06 12:14:04 crc kubenswrapper[4790]: W0406 12:14:04.420795 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb497468_6169_47da_879b_96e49435e345.slice/crio-3c9f108f5ab83a75bd34f24295196df12f21febb2bbe3f9122f2315b4572b2d2 WatchSource:0}: Error finding container 3c9f108f5ab83a75bd34f24295196df12f21febb2bbe3f9122f2315b4572b2d2: Status 404 returned error can't find the container with id 3c9f108f5ab83a75bd34f24295196df12f21febb2bbe3f9122f2315b4572b2d2 Apr 06 12:14:04 crc kubenswrapper[4790]: I0406 12:14:04.563451 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591294-25dml" Apr 06 12:14:04 crc kubenswrapper[4790]: I0406 12:14:04.596212 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz4m7\" (UniqueName: \"kubernetes.io/projected/bb6d2231-f726-41d3-b2ff-1bae2b437000-kube-api-access-hz4m7\") pod \"bb6d2231-f726-41d3-b2ff-1bae2b437000\" (UID: \"bb6d2231-f726-41d3-b2ff-1bae2b437000\") " Apr 06 12:14:04 crc kubenswrapper[4790]: I0406 12:14:04.601379 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6d2231-f726-41d3-b2ff-1bae2b437000-kube-api-access-hz4m7" (OuterVolumeSpecName: "kube-api-access-hz4m7") pod "bb6d2231-f726-41d3-b2ff-1bae2b437000" (UID: "bb6d2231-f726-41d3-b2ff-1bae2b437000"). InnerVolumeSpecName "kube-api-access-hz4m7". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:14:04 crc kubenswrapper[4790]: I0406 12:14:04.697914 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz4m7\" (UniqueName: \"kubernetes.io/projected/bb6d2231-f726-41d3-b2ff-1bae2b437000-kube-api-access-hz4m7\") on node \"crc\" DevicePath \"\"" Apr 06 12:14:04 crc kubenswrapper[4790]: I0406 12:14:04.753648 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591288-k92zx"] Apr 06 12:14:04 crc kubenswrapper[4790]: I0406 12:14:04.764063 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591288-k92zx"] Apr 06 12:14:04 crc kubenswrapper[4790]: I0406 12:14:04.900686 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-memberlist\") pod \"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:04 crc kubenswrapper[4790]: I0406 12:14:04.904978 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ca71ea5-d4c2-497d-945e-ac51b1fbf618-memberlist\") pod \"speaker-zlz5n\" (UID: \"5ca71ea5-d4c2-497d-945e-ac51b1fbf618\") " pod="metallb-system/speaker-zlz5n" Apr 06 12:14:05 crc kubenswrapper[4790]: I0406 12:14:05.031600 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zlz5n" Apr 06 12:14:05 crc kubenswrapper[4790]: W0406 12:14:05.057958 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ca71ea5_d4c2_497d_945e_ac51b1fbf618.slice/crio-f7ce6106896bd0c83ae789f443e0dbc78aa14709dbfae9acb740e05c64702431 WatchSource:0}: Error finding container f7ce6106896bd0c83ae789f443e0dbc78aa14709dbfae9acb740e05c64702431: Status 404 returned error can't find the container with id f7ce6106896bd0c83ae789f443e0dbc78aa14709dbfae9acb740e05c64702431 Apr 06 12:14:05 crc kubenswrapper[4790]: I0406 12:14:05.327429 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zlz5n" event={"ID":"5ca71ea5-d4c2-497d-945e-ac51b1fbf618","Type":"ContainerStarted","Data":"5993a611b488db76b4b01d5a41a684ac77ef6a17cc54802244af5c516037fd32"} Apr 06 12:14:05 crc kubenswrapper[4790]: I0406 12:14:05.327670 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zlz5n" event={"ID":"5ca71ea5-d4c2-497d-945e-ac51b1fbf618","Type":"ContainerStarted","Data":"f7ce6106896bd0c83ae789f443e0dbc78aa14709dbfae9acb740e05c64702431"} Apr 06 12:14:05 crc kubenswrapper[4790]: I0406 12:14:05.329806 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bb64cd5d7-lfmpj" event={"ID":"fb497468-6169-47da-879b-96e49435e345","Type":"ContainerStarted","Data":"35854e6c9830efb44f1a48f93e63037d223cfea47f9a3f91863a49d49897f891"} Apr 06 12:14:05 crc kubenswrapper[4790]: I0406 12:14:05.329855 4790 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="metallb-system/controller-5bb64cd5d7-lfmpj" event={"ID":"fb497468-6169-47da-879b-96e49435e345","Type":"ContainerStarted","Data":"25053b2c0f1efbc76da0cfc403b63a24e28ee2c53140998d4f51a28e215c9b68"} Apr 06 12:14:05 crc kubenswrapper[4790]: I0406 12:14:05.329874 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bb64cd5d7-lfmpj" event={"ID":"fb497468-6169-47da-879b-96e49435e345","Type":"ContainerStarted","Data":"3c9f108f5ab83a75bd34f24295196df12f21febb2bbe3f9122f2315b4572b2d2"} Apr 06 12:14:05 crc kubenswrapper[4790]: I0406 12:14:05.330893 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5bb64cd5d7-lfmpj" Apr 06 12:14:05 crc kubenswrapper[4790]: I0406 12:14:05.334179 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591294-25dml" event={"ID":"bb6d2231-f726-41d3-b2ff-1bae2b437000","Type":"ContainerDied","Data":"347b49848eda1484fc9d1475b88b8e724314b1da4c811f1bd6c4181262d71c40"} Apr 06 12:14:05 crc kubenswrapper[4790]: I0406 12:14:05.334212 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="347b49848eda1484fc9d1475b88b8e724314b1da4c811f1bd6c4181262d71c40" Apr 06 12:14:05 crc kubenswrapper[4790]: I0406 12:14:05.334299 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591294-25dml" Apr 06 12:14:05 crc kubenswrapper[4790]: I0406 12:14:05.351705 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5bb64cd5d7-lfmpj" podStartSLOduration=2.351689795 podStartE2EDuration="2.351689795s" podCreationTimestamp="2026-04-06 12:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:14:05.351256233 +0000 UTC m=+1024.338999099" watchObservedRunningTime="2026-04-06 12:14:05.351689795 +0000 UTC m=+1024.339432661" Apr 06 12:14:05 crc kubenswrapper[4790]: I0406 12:14:05.684426 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebfb6550-5136-437e-addd-e7f853434761" path="/var/lib/kubelet/pods/ebfb6550-5136-437e-addd-e7f853434761/volumes" Apr 06 12:14:06 crc kubenswrapper[4790]: I0406 12:14:06.352440 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zlz5n" event={"ID":"5ca71ea5-d4c2-497d-945e-ac51b1fbf618","Type":"ContainerStarted","Data":"51645da3b101cc4af507903291aab6c26ecdb5fbcc6d7b894dc77d3b23c57781"} Apr 06 12:14:06 crc kubenswrapper[4790]: I0406 12:14:06.352495 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zlz5n" Apr 06 12:14:06 crc kubenswrapper[4790]: I0406 12:14:06.373017 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zlz5n" podStartSLOduration=3.372999504 podStartE2EDuration="3.372999504s" podCreationTimestamp="2026-04-06 12:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:14:06.368187793 +0000 UTC m=+1025.355930679" watchObservedRunningTime="2026-04-06 12:14:06.372999504 +0000 UTC m=+1025.360742370" Apr 06 12:14:09 crc kubenswrapper[4790]: I0406 12:14:09.753962 
4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:14:09 crc kubenswrapper[4790]: I0406 12:14:09.754300 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:14:11 crc kubenswrapper[4790]: I0406 12:14:11.392083 4790 generic.go:334] "Generic (PLEG): container finished" podID="64824d1f-c308-4727-8209-26463699ba84" containerID="7485079ef1b9f1a8e3331dc13f650c35f4728300790742be4ba3457d9263ad61" exitCode=0 Apr 06 12:14:11 crc kubenswrapper[4790]: I0406 12:14:11.392124 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kv7hm" event={"ID":"64824d1f-c308-4727-8209-26463699ba84","Type":"ContainerDied","Data":"7485079ef1b9f1a8e3331dc13f650c35f4728300790742be4ba3457d9263ad61"} Apr 06 12:14:11 crc kubenswrapper[4790]: I0406 12:14:11.394038 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c" event={"ID":"7dd0b643-0c2a-467b-9267-caaba887289b","Type":"ContainerStarted","Data":"ffcd63291fda28b6ecdb3d485ca720c5723bcec586c29be3362c1c4e04ead5a9"} Apr 06 12:14:11 crc kubenswrapper[4790]: I0406 12:14:11.394164 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c" Apr 06 12:14:11 crc kubenswrapper[4790]: I0406 12:14:11.433210 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c" podStartSLOduration=1.120567784 
podStartE2EDuration="8.43317323s" podCreationTimestamp="2026-04-06 12:14:03 +0000 UTC" firstStartedPulling="2026-04-06 12:14:03.669083625 +0000 UTC m=+1022.656826481" lastFinishedPulling="2026-04-06 12:14:10.981689061 +0000 UTC m=+1029.969431927" observedRunningTime="2026-04-06 12:14:11.430819096 +0000 UTC m=+1030.418561992" watchObservedRunningTime="2026-04-06 12:14:11.43317323 +0000 UTC m=+1030.420916106" Apr 06 12:14:12 crc kubenswrapper[4790]: I0406 12:14:12.406394 4790 generic.go:334] "Generic (PLEG): container finished" podID="64824d1f-c308-4727-8209-26463699ba84" containerID="8bca100bd9931e31a33cd4ace0e1d248f7782cc991f53abdfaae49c7aafa8e8e" exitCode=0 Apr 06 12:14:12 crc kubenswrapper[4790]: I0406 12:14:12.406526 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kv7hm" event={"ID":"64824d1f-c308-4727-8209-26463699ba84","Type":"ContainerDied","Data":"8bca100bd9931e31a33cd4ace0e1d248f7782cc991f53abdfaae49c7aafa8e8e"} Apr 06 12:14:13 crc kubenswrapper[4790]: I0406 12:14:13.414408 4790 generic.go:334] "Generic (PLEG): container finished" podID="64824d1f-c308-4727-8209-26463699ba84" containerID="04686d3cdc020d709aab5b49ce2ca1ff4b01632572e934afb11033104933f48f" exitCode=0 Apr 06 12:14:13 crc kubenswrapper[4790]: I0406 12:14:13.414458 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kv7hm" event={"ID":"64824d1f-c308-4727-8209-26463699ba84","Type":"ContainerDied","Data":"04686d3cdc020d709aab5b49ce2ca1ff4b01632572e934afb11033104933f48f"} Apr 06 12:14:14 crc kubenswrapper[4790]: I0406 12:14:14.154782 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5bb64cd5d7-lfmpj" Apr 06 12:14:14 crc kubenswrapper[4790]: I0406 12:14:14.426300 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kv7hm" 
event={"ID":"64824d1f-c308-4727-8209-26463699ba84","Type":"ContainerStarted","Data":"6699f8c3a2e5e94c8fe6404b93e29dc1c6e383013babfec36991da3f99e12884"} Apr 06 12:14:14 crc kubenswrapper[4790]: I0406 12:14:14.426595 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:14 crc kubenswrapper[4790]: I0406 12:14:14.426605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kv7hm" event={"ID":"64824d1f-c308-4727-8209-26463699ba84","Type":"ContainerStarted","Data":"98e9f766f60f36228f0cffa12a4146f9840650a01ed1f0effa503dcfe11c10d4"} Apr 06 12:14:14 crc kubenswrapper[4790]: I0406 12:14:14.426614 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kv7hm" event={"ID":"64824d1f-c308-4727-8209-26463699ba84","Type":"ContainerStarted","Data":"495962e9d6cdecf57d81217357b4a2406791381167aeca4f55e702ca8fa06d4a"} Apr 06 12:14:14 crc kubenswrapper[4790]: I0406 12:14:14.426659 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kv7hm" event={"ID":"64824d1f-c308-4727-8209-26463699ba84","Type":"ContainerStarted","Data":"39490b369848106654e3f2629ccb53a8f6834a823d52aa414f5ed83db66e2d62"} Apr 06 12:14:14 crc kubenswrapper[4790]: I0406 12:14:14.426668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kv7hm" event={"ID":"64824d1f-c308-4727-8209-26463699ba84","Type":"ContainerStarted","Data":"228c1da5bdc69d95f0309065a34cbfa074699d91bb8489a26ae333fc8847515f"} Apr 06 12:14:14 crc kubenswrapper[4790]: I0406 12:14:14.426677 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kv7hm" event={"ID":"64824d1f-c308-4727-8209-26463699ba84","Type":"ContainerStarted","Data":"3091154d5c244e94b3c141e6b50e4d5e3c58f674e6cc29dfea102efd0b519efd"} Apr 06 12:14:14 crc kubenswrapper[4790]: I0406 12:14:14.450380 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-kv7hm" podStartSLOduration=4.634090248 podStartE2EDuration="11.450361707s" podCreationTimestamp="2026-04-06 12:14:03 +0000 UTC" firstStartedPulling="2026-04-06 12:14:04.180649005 +0000 UTC m=+1023.168391871" lastFinishedPulling="2026-04-06 12:14:10.996920464 +0000 UTC m=+1029.984663330" observedRunningTime="2026-04-06 12:14:14.44863445 +0000 UTC m=+1033.436377316" watchObservedRunningTime="2026-04-06 12:14:14.450361707 +0000 UTC m=+1033.438104573" Apr 06 12:14:15 crc kubenswrapper[4790]: I0406 12:14:15.034894 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zlz5n" Apr 06 12:14:17 crc kubenswrapper[4790]: I0406 12:14:17.725281 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-h8qw7"] Apr 06 12:14:17 crc kubenswrapper[4790]: E0406 12:14:17.725782 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6d2231-f726-41d3-b2ff-1bae2b437000" containerName="oc" Apr 06 12:14:17 crc kubenswrapper[4790]: I0406 12:14:17.725795 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6d2231-f726-41d3-b2ff-1bae2b437000" containerName="oc" Apr 06 12:14:17 crc kubenswrapper[4790]: I0406 12:14:17.725933 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6d2231-f726-41d3-b2ff-1bae2b437000" containerName="oc" Apr 06 12:14:17 crc kubenswrapper[4790]: I0406 12:14:17.726375 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-h8qw7" Apr 06 12:14:17 crc kubenswrapper[4790]: I0406 12:14:17.728245 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-28hkn" Apr 06 12:14:17 crc kubenswrapper[4790]: I0406 12:14:17.728401 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Apr 06 12:14:17 crc kubenswrapper[4790]: I0406 12:14:17.728944 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Apr 06 12:14:17 crc kubenswrapper[4790]: I0406 12:14:17.736582 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h8qw7"] Apr 06 12:14:17 crc kubenswrapper[4790]: I0406 12:14:17.766976 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65f74\" (UniqueName: \"kubernetes.io/projected/70976f68-3143-4464-8f2f-be1dc0257238-kube-api-access-65f74\") pod \"openstack-operator-index-h8qw7\" (UID: \"70976f68-3143-4464-8f2f-be1dc0257238\") " pod="openstack-operators/openstack-operator-index-h8qw7" Apr 06 12:14:17 crc kubenswrapper[4790]: I0406 12:14:17.868315 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65f74\" (UniqueName: \"kubernetes.io/projected/70976f68-3143-4464-8f2f-be1dc0257238-kube-api-access-65f74\") pod \"openstack-operator-index-h8qw7\" (UID: \"70976f68-3143-4464-8f2f-be1dc0257238\") " pod="openstack-operators/openstack-operator-index-h8qw7" Apr 06 12:14:17 crc kubenswrapper[4790]: I0406 12:14:17.888025 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65f74\" (UniqueName: \"kubernetes.io/projected/70976f68-3143-4464-8f2f-be1dc0257238-kube-api-access-65f74\") pod \"openstack-operator-index-h8qw7\" (UID: 
\"70976f68-3143-4464-8f2f-be1dc0257238\") " pod="openstack-operators/openstack-operator-index-h8qw7" Apr 06 12:14:18 crc kubenswrapper[4790]: I0406 12:14:18.048847 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h8qw7" Apr 06 12:14:18 crc kubenswrapper[4790]: I0406 12:14:18.440298 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h8qw7"] Apr 06 12:14:18 crc kubenswrapper[4790]: I0406 12:14:18.456351 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h8qw7" event={"ID":"70976f68-3143-4464-8f2f-be1dc0257238","Type":"ContainerStarted","Data":"c67d434fe47e87d02ac09211afdddb819764163df1d8ab548b643a6fae3829f0"} Apr 06 12:14:19 crc kubenswrapper[4790]: I0406 12:14:19.058140 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:19 crc kubenswrapper[4790]: I0406 12:14:19.102478 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:21 crc kubenswrapper[4790]: I0406 12:14:21.088326 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-h8qw7"] Apr 06 12:14:21 crc kubenswrapper[4790]: I0406 12:14:21.478253 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h8qw7" event={"ID":"70976f68-3143-4464-8f2f-be1dc0257238","Type":"ContainerStarted","Data":"a38930dfdb3fafb303f0762d88718306a487bce92c59b96f2ba899c0762dddc3"} Apr 06 12:14:21 crc kubenswrapper[4790]: I0406 12:14:21.493950 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-h8qw7" podStartSLOduration=2.203494326 podStartE2EDuration="4.493925798s" podCreationTimestamp="2026-04-06 12:14:17 +0000 UTC" firstStartedPulling="2026-04-06 
12:14:18.44599393 +0000 UTC m=+1037.433736796" lastFinishedPulling="2026-04-06 12:14:20.736425392 +0000 UTC m=+1039.724168268" observedRunningTime="2026-04-06 12:14:21.49141596 +0000 UTC m=+1040.479158846" watchObservedRunningTime="2026-04-06 12:14:21.493925798 +0000 UTC m=+1040.481668704" Apr 06 12:14:21 crc kubenswrapper[4790]: I0406 12:14:21.701102 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-s94ds"] Apr 06 12:14:21 crc kubenswrapper[4790]: I0406 12:14:21.705044 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s94ds" Apr 06 12:14:21 crc kubenswrapper[4790]: I0406 12:14:21.717684 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4xfr\" (UniqueName: \"kubernetes.io/projected/cd1445b3-61f1-4be0-aaf8-bee9a755cb7e-kube-api-access-f4xfr\") pod \"openstack-operator-index-s94ds\" (UID: \"cd1445b3-61f1-4be0-aaf8-bee9a755cb7e\") " pod="openstack-operators/openstack-operator-index-s94ds" Apr 06 12:14:21 crc kubenswrapper[4790]: I0406 12:14:21.725441 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s94ds"] Apr 06 12:14:21 crc kubenswrapper[4790]: I0406 12:14:21.818538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4xfr\" (UniqueName: \"kubernetes.io/projected/cd1445b3-61f1-4be0-aaf8-bee9a755cb7e-kube-api-access-f4xfr\") pod \"openstack-operator-index-s94ds\" (UID: \"cd1445b3-61f1-4be0-aaf8-bee9a755cb7e\") " pod="openstack-operators/openstack-operator-index-s94ds" Apr 06 12:14:21 crc kubenswrapper[4790]: I0406 12:14:21.837605 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4xfr\" (UniqueName: \"kubernetes.io/projected/cd1445b3-61f1-4be0-aaf8-bee9a755cb7e-kube-api-access-f4xfr\") pod \"openstack-operator-index-s94ds\" (UID: 
\"cd1445b3-61f1-4be0-aaf8-bee9a755cb7e\") " pod="openstack-operators/openstack-operator-index-s94ds" Apr 06 12:14:22 crc kubenswrapper[4790]: I0406 12:14:22.031261 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s94ds" Apr 06 12:14:22 crc kubenswrapper[4790]: I0406 12:14:22.485243 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-h8qw7" podUID="70976f68-3143-4464-8f2f-be1dc0257238" containerName="registry-server" containerID="cri-o://a38930dfdb3fafb303f0762d88718306a487bce92c59b96f2ba899c0762dddc3" gracePeriod=2 Apr 06 12:14:22 crc kubenswrapper[4790]: I0406 12:14:22.560397 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s94ds"] Apr 06 12:14:22 crc kubenswrapper[4790]: W0406 12:14:22.573031 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd1445b3_61f1_4be0_aaf8_bee9a755cb7e.slice/crio-497b0de28f1a51410bd42dffe87a7874c3719444de11309fabef7d37a47cd240 WatchSource:0}: Error finding container 497b0de28f1a51410bd42dffe87a7874c3719444de11309fabef7d37a47cd240: Status 404 returned error can't find the container with id 497b0de28f1a51410bd42dffe87a7874c3719444de11309fabef7d37a47cd240 Apr 06 12:14:22 crc kubenswrapper[4790]: I0406 12:14:22.813222 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-h8qw7" Apr 06 12:14:22 crc kubenswrapper[4790]: I0406 12:14:22.849070 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65f74\" (UniqueName: \"kubernetes.io/projected/70976f68-3143-4464-8f2f-be1dc0257238-kube-api-access-65f74\") pod \"70976f68-3143-4464-8f2f-be1dc0257238\" (UID: \"70976f68-3143-4464-8f2f-be1dc0257238\") " Apr 06 12:14:22 crc kubenswrapper[4790]: I0406 12:14:22.854273 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70976f68-3143-4464-8f2f-be1dc0257238-kube-api-access-65f74" (OuterVolumeSpecName: "kube-api-access-65f74") pod "70976f68-3143-4464-8f2f-be1dc0257238" (UID: "70976f68-3143-4464-8f2f-be1dc0257238"). InnerVolumeSpecName "kube-api-access-65f74". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:14:22 crc kubenswrapper[4790]: I0406 12:14:22.950400 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65f74\" (UniqueName: \"kubernetes.io/projected/70976f68-3143-4464-8f2f-be1dc0257238-kube-api-access-65f74\") on node \"crc\" DevicePath \"\"" Apr 06 12:14:23 crc kubenswrapper[4790]: I0406 12:14:23.473479 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j6z5c" Apr 06 12:14:23 crc kubenswrapper[4790]: I0406 12:14:23.495740 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s94ds" event={"ID":"cd1445b3-61f1-4be0-aaf8-bee9a755cb7e","Type":"ContainerStarted","Data":"3231c6bdd84de67e056303400aeea5c0826e9de610a132113b3beb2c55e46152"} Apr 06 12:14:23 crc kubenswrapper[4790]: I0406 12:14:23.495779 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s94ds" 
event={"ID":"cd1445b3-61f1-4be0-aaf8-bee9a755cb7e","Type":"ContainerStarted","Data":"497b0de28f1a51410bd42dffe87a7874c3719444de11309fabef7d37a47cd240"} Apr 06 12:14:23 crc kubenswrapper[4790]: I0406 12:14:23.497906 4790 generic.go:334] "Generic (PLEG): container finished" podID="70976f68-3143-4464-8f2f-be1dc0257238" containerID="a38930dfdb3fafb303f0762d88718306a487bce92c59b96f2ba899c0762dddc3" exitCode=0 Apr 06 12:14:23 crc kubenswrapper[4790]: I0406 12:14:23.497957 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-h8qw7" Apr 06 12:14:23 crc kubenswrapper[4790]: I0406 12:14:23.497962 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h8qw7" event={"ID":"70976f68-3143-4464-8f2f-be1dc0257238","Type":"ContainerDied","Data":"a38930dfdb3fafb303f0762d88718306a487bce92c59b96f2ba899c0762dddc3"} Apr 06 12:14:23 crc kubenswrapper[4790]: I0406 12:14:23.498024 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h8qw7" event={"ID":"70976f68-3143-4464-8f2f-be1dc0257238","Type":"ContainerDied","Data":"c67d434fe47e87d02ac09211afdddb819764163df1d8ab548b643a6fae3829f0"} Apr 06 12:14:23 crc kubenswrapper[4790]: I0406 12:14:23.498049 4790 scope.go:117] "RemoveContainer" containerID="a38930dfdb3fafb303f0762d88718306a487bce92c59b96f2ba899c0762dddc3" Apr 06 12:14:23 crc kubenswrapper[4790]: I0406 12:14:23.521449 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-s94ds" podStartSLOduration=2.469464193 podStartE2EDuration="2.521432685s" podCreationTimestamp="2026-04-06 12:14:21 +0000 UTC" firstStartedPulling="2026-04-06 12:14:22.576914668 +0000 UTC m=+1041.564657544" lastFinishedPulling="2026-04-06 12:14:22.62888317 +0000 UTC m=+1041.616626036" observedRunningTime="2026-04-06 12:14:23.514507677 +0000 UTC m=+1042.502250543" 
watchObservedRunningTime="2026-04-06 12:14:23.521432685 +0000 UTC m=+1042.509175551" Apr 06 12:14:23 crc kubenswrapper[4790]: I0406 12:14:23.522134 4790 scope.go:117] "RemoveContainer" containerID="a38930dfdb3fafb303f0762d88718306a487bce92c59b96f2ba899c0762dddc3" Apr 06 12:14:23 crc kubenswrapper[4790]: E0406 12:14:23.522589 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38930dfdb3fafb303f0762d88718306a487bce92c59b96f2ba899c0762dddc3\": container with ID starting with a38930dfdb3fafb303f0762d88718306a487bce92c59b96f2ba899c0762dddc3 not found: ID does not exist" containerID="a38930dfdb3fafb303f0762d88718306a487bce92c59b96f2ba899c0762dddc3" Apr 06 12:14:23 crc kubenswrapper[4790]: I0406 12:14:23.522631 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38930dfdb3fafb303f0762d88718306a487bce92c59b96f2ba899c0762dddc3"} err="failed to get container status \"a38930dfdb3fafb303f0762d88718306a487bce92c59b96f2ba899c0762dddc3\": rpc error: code = NotFound desc = could not find container \"a38930dfdb3fafb303f0762d88718306a487bce92c59b96f2ba899c0762dddc3\": container with ID starting with a38930dfdb3fafb303f0762d88718306a487bce92c59b96f2ba899c0762dddc3 not found: ID does not exist" Apr 06 12:14:23 crc kubenswrapper[4790]: I0406 12:14:23.533896 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-h8qw7"] Apr 06 12:14:23 crc kubenswrapper[4790]: I0406 12:14:23.541606 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-h8qw7"] Apr 06 12:14:23 crc kubenswrapper[4790]: I0406 12:14:23.683614 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70976f68-3143-4464-8f2f-be1dc0257238" path="/var/lib/kubelet/pods/70976f68-3143-4464-8f2f-be1dc0257238/volumes" Apr 06 12:14:24 crc kubenswrapper[4790]: I0406 12:14:24.069773 4790 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-kv7hm" Apr 06 12:14:32 crc kubenswrapper[4790]: I0406 12:14:32.031814 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-s94ds" Apr 06 12:14:32 crc kubenswrapper[4790]: I0406 12:14:32.032207 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-s94ds" Apr 06 12:14:32 crc kubenswrapper[4790]: I0406 12:14:32.077465 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-s94ds" Apr 06 12:14:32 crc kubenswrapper[4790]: I0406 12:14:32.607179 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-s94ds" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.145888 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm"] Apr 06 12:14:34 crc kubenswrapper[4790]: E0406 12:14:34.147044 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70976f68-3143-4464-8f2f-be1dc0257238" containerName="registry-server" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.148019 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="70976f68-3143-4464-8f2f-be1dc0257238" containerName="registry-server" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.148318 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="70976f68-3143-4464-8f2f-be1dc0257238" containerName="registry-server" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.149245 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.151187 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ff7wv" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.157621 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm"] Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.324742 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a436f45-6a0e-4102-8377-fa7299c0b3e8-util\") pod \"ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm\" (UID: \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\") " pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.325121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9hh9\" (UniqueName: \"kubernetes.io/projected/7a436f45-6a0e-4102-8377-fa7299c0b3e8-kube-api-access-f9hh9\") pod \"ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm\" (UID: \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\") " pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.325153 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a436f45-6a0e-4102-8377-fa7299c0b3e8-bundle\") pod \"ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm\" (UID: \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\") " pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 
12:14:34.426185 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a436f45-6a0e-4102-8377-fa7299c0b3e8-util\") pod \"ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm\" (UID: \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\") " pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.426475 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hh9\" (UniqueName: \"kubernetes.io/projected/7a436f45-6a0e-4102-8377-fa7299c0b3e8-kube-api-access-f9hh9\") pod \"ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm\" (UID: \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\") " pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.426583 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a436f45-6a0e-4102-8377-fa7299c0b3e8-bundle\") pod \"ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm\" (UID: \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\") " pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.426661 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a436f45-6a0e-4102-8377-fa7299c0b3e8-util\") pod \"ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm\" (UID: \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\") " pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.426911 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7a436f45-6a0e-4102-8377-fa7299c0b3e8-bundle\") pod \"ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm\" (UID: \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\") " pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.447542 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9hh9\" (UniqueName: \"kubernetes.io/projected/7a436f45-6a0e-4102-8377-fa7299c0b3e8-kube-api-access-f9hh9\") pod \"ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm\" (UID: \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\") " pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.469950 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" Apr 06 12:14:34 crc kubenswrapper[4790]: I0406 12:14:34.669247 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm"] Apr 06 12:14:34 crc kubenswrapper[4790]: W0406 12:14:34.673682 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a436f45_6a0e_4102_8377_fa7299c0b3e8.slice/crio-f3c943ee5adf70e2c9417e80d2d820d5d83e05dfcf556dd690bf80b0399ba30b WatchSource:0}: Error finding container f3c943ee5adf70e2c9417e80d2d820d5d83e05dfcf556dd690bf80b0399ba30b: Status 404 returned error can't find the container with id f3c943ee5adf70e2c9417e80d2d820d5d83e05dfcf556dd690bf80b0399ba30b Apr 06 12:14:35 crc kubenswrapper[4790]: I0406 12:14:35.600452 4790 generic.go:334] "Generic (PLEG): container finished" podID="7a436f45-6a0e-4102-8377-fa7299c0b3e8" containerID="1b4b23e6cabcefdf37dc35a3a70e75bdceffc1a34cf42f9a4396d9b81b36b9ea" exitCode=0 Apr 06 
12:14:35 crc kubenswrapper[4790]: I0406 12:14:35.600553 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" event={"ID":"7a436f45-6a0e-4102-8377-fa7299c0b3e8","Type":"ContainerDied","Data":"1b4b23e6cabcefdf37dc35a3a70e75bdceffc1a34cf42f9a4396d9b81b36b9ea"}
Apr 06 12:14:35 crc kubenswrapper[4790]: I0406 12:14:35.600712 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" event={"ID":"7a436f45-6a0e-4102-8377-fa7299c0b3e8","Type":"ContainerStarted","Data":"f3c943ee5adf70e2c9417e80d2d820d5d83e05dfcf556dd690bf80b0399ba30b"}
Apr 06 12:14:36 crc kubenswrapper[4790]: I0406 12:14:36.610534 4790 generic.go:334] "Generic (PLEG): container finished" podID="7a436f45-6a0e-4102-8377-fa7299c0b3e8" containerID="874a9e2db47bbb4b8681673bdbe1f1c687444f4a3243f2aa4c613ebc5b97305a" exitCode=0
Apr 06 12:14:36 crc kubenswrapper[4790]: I0406 12:14:36.610596 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" event={"ID":"7a436f45-6a0e-4102-8377-fa7299c0b3e8","Type":"ContainerDied","Data":"874a9e2db47bbb4b8681673bdbe1f1c687444f4a3243f2aa4c613ebc5b97305a"}
Apr 06 12:14:37 crc kubenswrapper[4790]: I0406 12:14:37.624964 4790 generic.go:334] "Generic (PLEG): container finished" podID="7a436f45-6a0e-4102-8377-fa7299c0b3e8" containerID="8d458367b3fbbf724f7f0a6b0066392c79db02cb021a5351aa1987dfad8e729f" exitCode=0
Apr 06 12:14:37 crc kubenswrapper[4790]: I0406 12:14:37.625033 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" event={"ID":"7a436f45-6a0e-4102-8377-fa7299c0b3e8","Type":"ContainerDied","Data":"8d458367b3fbbf724f7f0a6b0066392c79db02cb021a5351aa1987dfad8e729f"}
Apr 06 12:14:38 crc kubenswrapper[4790]: I0406 12:14:38.870332 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm"
Apr 06 12:14:38 crc kubenswrapper[4790]: I0406 12:14:38.984651 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9hh9\" (UniqueName: \"kubernetes.io/projected/7a436f45-6a0e-4102-8377-fa7299c0b3e8-kube-api-access-f9hh9\") pod \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\" (UID: \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\") "
Apr 06 12:14:38 crc kubenswrapper[4790]: I0406 12:14:38.984712 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a436f45-6a0e-4102-8377-fa7299c0b3e8-bundle\") pod \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\" (UID: \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\") "
Apr 06 12:14:38 crc kubenswrapper[4790]: I0406 12:14:38.984799 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a436f45-6a0e-4102-8377-fa7299c0b3e8-util\") pod \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\" (UID: \"7a436f45-6a0e-4102-8377-fa7299c0b3e8\") "
Apr 06 12:14:38 crc kubenswrapper[4790]: I0406 12:14:38.985502 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a436f45-6a0e-4102-8377-fa7299c0b3e8-bundle" (OuterVolumeSpecName: "bundle") pod "7a436f45-6a0e-4102-8377-fa7299c0b3e8" (UID: "7a436f45-6a0e-4102-8377-fa7299c0b3e8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:14:38 crc kubenswrapper[4790]: I0406 12:14:38.989952 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a436f45-6a0e-4102-8377-fa7299c0b3e8-kube-api-access-f9hh9" (OuterVolumeSpecName: "kube-api-access-f9hh9") pod "7a436f45-6a0e-4102-8377-fa7299c0b3e8" (UID: "7a436f45-6a0e-4102-8377-fa7299c0b3e8"). InnerVolumeSpecName "kube-api-access-f9hh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:14:38 crc kubenswrapper[4790]: I0406 12:14:38.999756 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a436f45-6a0e-4102-8377-fa7299c0b3e8-util" (OuterVolumeSpecName: "util") pod "7a436f45-6a0e-4102-8377-fa7299c0b3e8" (UID: "7a436f45-6a0e-4102-8377-fa7299c0b3e8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:14:39 crc kubenswrapper[4790]: I0406 12:14:39.086159 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a436f45-6a0e-4102-8377-fa7299c0b3e8-util\") on node \"crc\" DevicePath \"\""
Apr 06 12:14:39 crc kubenswrapper[4790]: I0406 12:14:39.086193 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9hh9\" (UniqueName: \"kubernetes.io/projected/7a436f45-6a0e-4102-8377-fa7299c0b3e8-kube-api-access-f9hh9\") on node \"crc\" DevicePath \"\""
Apr 06 12:14:39 crc kubenswrapper[4790]: I0406 12:14:39.086205 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a436f45-6a0e-4102-8377-fa7299c0b3e8-bundle\") on node \"crc\" DevicePath \"\""
Apr 06 12:14:39 crc kubenswrapper[4790]: I0406 12:14:39.636958 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm" event={"ID":"7a436f45-6a0e-4102-8377-fa7299c0b3e8","Type":"ContainerDied","Data":"f3c943ee5adf70e2c9417e80d2d820d5d83e05dfcf556dd690bf80b0399ba30b"}
Apr 06 12:14:39 crc kubenswrapper[4790]: I0406 12:14:39.636994 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3c943ee5adf70e2c9417e80d2d820d5d83e05dfcf556dd690bf80b0399ba30b"
Apr 06 12:14:39 crc kubenswrapper[4790]: I0406 12:14:39.637038 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm"
Apr 06 12:14:39 crc kubenswrapper[4790]: I0406 12:14:39.753794 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 06 12:14:39 crc kubenswrapper[4790]: I0406 12:14:39.753872 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 06 12:14:46 crc kubenswrapper[4790]: I0406 12:14:46.083948 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-95748b946-k6fbn"]
Apr 06 12:14:46 crc kubenswrapper[4790]: E0406 12:14:46.084788 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a436f45-6a0e-4102-8377-fa7299c0b3e8" containerName="extract"
Apr 06 12:14:46 crc kubenswrapper[4790]: I0406 12:14:46.084805 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a436f45-6a0e-4102-8377-fa7299c0b3e8" containerName="extract"
Apr 06 12:14:46 crc kubenswrapper[4790]: E0406 12:14:46.084818 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a436f45-6a0e-4102-8377-fa7299c0b3e8" containerName="pull"
Apr 06 12:14:46 crc kubenswrapper[4790]: I0406 12:14:46.084843 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a436f45-6a0e-4102-8377-fa7299c0b3e8" containerName="pull"
Apr 06 12:14:46 crc kubenswrapper[4790]: E0406 12:14:46.084861 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a436f45-6a0e-4102-8377-fa7299c0b3e8" containerName="util"
Apr 06 12:14:46 crc kubenswrapper[4790]: I0406 12:14:46.084868 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a436f45-6a0e-4102-8377-fa7299c0b3e8" containerName="util"
Apr 06 12:14:46 crc kubenswrapper[4790]: I0406 12:14:46.085000 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a436f45-6a0e-4102-8377-fa7299c0b3e8" containerName="extract"
Apr 06 12:14:46 crc kubenswrapper[4790]: I0406 12:14:46.085521 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-95748b946-k6fbn"
Apr 06 12:14:46 crc kubenswrapper[4790]: W0406 12:14:46.087208 4790 reflector.go:561] object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-q9lt4": failed to list *v1.Secret: secrets "openstack-operator-controller-init-dockercfg-q9lt4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object
Apr 06 12:14:46 crc kubenswrapper[4790]: E0406 12:14:46.087261 4790 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"openstack-operator-controller-init-dockercfg-q9lt4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openstack-operator-controller-init-dockercfg-q9lt4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Apr 06 12:14:46 crc kubenswrapper[4790]: I0406 12:14:46.115800 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-95748b946-k6fbn"]
Apr 06 12:14:46 crc kubenswrapper[4790]: I0406 12:14:46.190354 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l55r5\" (UniqueName: \"kubernetes.io/projected/a7adf6fa-28d9-4655-b720-606ab4b91117-kube-api-access-l55r5\") pod \"openstack-operator-controller-init-95748b946-k6fbn\" (UID: \"a7adf6fa-28d9-4655-b720-606ab4b91117\") " pod="openstack-operators/openstack-operator-controller-init-95748b946-k6fbn"
Apr 06 12:14:46 crc kubenswrapper[4790]: I0406 12:14:46.291457 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l55r5\" (UniqueName: \"kubernetes.io/projected/a7adf6fa-28d9-4655-b720-606ab4b91117-kube-api-access-l55r5\") pod \"openstack-operator-controller-init-95748b946-k6fbn\" (UID: \"a7adf6fa-28d9-4655-b720-606ab4b91117\") " pod="openstack-operators/openstack-operator-controller-init-95748b946-k6fbn"
Apr 06 12:14:46 crc kubenswrapper[4790]: I0406 12:14:46.312428 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l55r5\" (UniqueName: \"kubernetes.io/projected/a7adf6fa-28d9-4655-b720-606ab4b91117-kube-api-access-l55r5\") pod \"openstack-operator-controller-init-95748b946-k6fbn\" (UID: \"a7adf6fa-28d9-4655-b720-606ab4b91117\") " pod="openstack-operators/openstack-operator-controller-init-95748b946-k6fbn"
Apr 06 12:14:47 crc kubenswrapper[4790]: I0406 12:14:47.209047 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-q9lt4"
Apr 06 12:14:47 crc kubenswrapper[4790]: I0406 12:14:47.218456 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-95748b946-k6fbn"
Apr 06 12:14:47 crc kubenswrapper[4790]: I0406 12:14:47.457957 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-95748b946-k6fbn"]
Apr 06 12:14:47 crc kubenswrapper[4790]: W0406 12:14:47.467923 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7adf6fa_28d9_4655_b720_606ab4b91117.slice/crio-1ae89d6d0eeef3b4add3f8d9e35eed838f2d0df82f0d37637b308c4937728da0 WatchSource:0}: Error finding container 1ae89d6d0eeef3b4add3f8d9e35eed838f2d0df82f0d37637b308c4937728da0: Status 404 returned error can't find the container with id 1ae89d6d0eeef3b4add3f8d9e35eed838f2d0df82f0d37637b308c4937728da0
Apr 06 12:14:47 crc kubenswrapper[4790]: I0406 12:14:47.697172 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-95748b946-k6fbn" event={"ID":"a7adf6fa-28d9-4655-b720-606ab4b91117","Type":"ContainerStarted","Data":"1ae89d6d0eeef3b4add3f8d9e35eed838f2d0df82f0d37637b308c4937728da0"}
Apr 06 12:14:51 crc kubenswrapper[4790]: I0406 12:14:51.736113 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-95748b946-k6fbn" event={"ID":"a7adf6fa-28d9-4655-b720-606ab4b91117","Type":"ContainerStarted","Data":"c6dbfc4540c63bc3e790e185f4112c93186998d027162b1bbe396f92cafadef4"}
Apr 06 12:14:51 crc kubenswrapper[4790]: I0406 12:14:51.736918 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-95748b946-k6fbn"
Apr 06 12:14:51 crc kubenswrapper[4790]: I0406 12:14:51.770482 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-95748b946-k6fbn" podStartSLOduration=2.581603919 podStartE2EDuration="5.770464677s" podCreationTimestamp="2026-04-06 12:14:46 +0000 UTC" firstStartedPulling="2026-04-06 12:14:47.470434484 +0000 UTC m=+1066.458177350" lastFinishedPulling="2026-04-06 12:14:50.659295242 +0000 UTC m=+1069.647038108" observedRunningTime="2026-04-06 12:14:51.768019451 +0000 UTC m=+1070.755762317" watchObservedRunningTime="2026-04-06 12:14:51.770464677 +0000 UTC m=+1070.758207543"
Apr 06 12:14:57 crc kubenswrapper[4790]: I0406 12:14:57.225474 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-95748b946-k6fbn"
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.132346 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"]
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.133094 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.135168 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.135905 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.146564 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"]
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.201653 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8d86720-ff28-4773-966f-21968cb6d7f4-secret-volume\") pod \"collect-profiles-29591295-h9xmv\" (UID: \"f8d86720-ff28-4773-966f-21968cb6d7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.201698 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24xxm\" (UniqueName: \"kubernetes.io/projected/f8d86720-ff28-4773-966f-21968cb6d7f4-kube-api-access-24xxm\") pod \"collect-profiles-29591295-h9xmv\" (UID: \"f8d86720-ff28-4773-966f-21968cb6d7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.201719 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8d86720-ff28-4773-966f-21968cb6d7f4-config-volume\") pod \"collect-profiles-29591295-h9xmv\" (UID: \"f8d86720-ff28-4773-966f-21968cb6d7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.303213 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24xxm\" (UniqueName: \"kubernetes.io/projected/f8d86720-ff28-4773-966f-21968cb6d7f4-kube-api-access-24xxm\") pod \"collect-profiles-29591295-h9xmv\" (UID: \"f8d86720-ff28-4773-966f-21968cb6d7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.303261 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8d86720-ff28-4773-966f-21968cb6d7f4-config-volume\") pod \"collect-profiles-29591295-h9xmv\" (UID: \"f8d86720-ff28-4773-966f-21968cb6d7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.303336 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8d86720-ff28-4773-966f-21968cb6d7f4-secret-volume\") pod \"collect-profiles-29591295-h9xmv\" (UID: \"f8d86720-ff28-4773-966f-21968cb6d7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.304639 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8d86720-ff28-4773-966f-21968cb6d7f4-config-volume\") pod \"collect-profiles-29591295-h9xmv\" (UID: \"f8d86720-ff28-4773-966f-21968cb6d7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.312812 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8d86720-ff28-4773-966f-21968cb6d7f4-secret-volume\") pod \"collect-profiles-29591295-h9xmv\" (UID: \"f8d86720-ff28-4773-966f-21968cb6d7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.327770 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24xxm\" (UniqueName: \"kubernetes.io/projected/f8d86720-ff28-4773-966f-21968cb6d7f4-kube-api-access-24xxm\") pod \"collect-profiles-29591295-h9xmv\" (UID: \"f8d86720-ff28-4773-966f-21968cb6d7f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.450846 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.646209 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"]
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.801243 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv" event={"ID":"f8d86720-ff28-4773-966f-21968cb6d7f4","Type":"ContainerStarted","Data":"c7251a5006b8db817acb0c3310f4fdc9420ce0adab47122519f0bab267d1de83"}
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.801305 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv" event={"ID":"f8d86720-ff28-4773-966f-21968cb6d7f4","Type":"ContainerStarted","Data":"e9b9ea0245185881ad201a22aaa96056bc899fd9a2f926823c28d5e1b1372a13"}
Apr 06 12:15:00 crc kubenswrapper[4790]: I0406 12:15:00.815463 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv" podStartSLOduration=0.815438336 podStartE2EDuration="815.438336ms" podCreationTimestamp="2026-04-06 12:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:15:00.815137278 +0000 UTC m=+1079.802880164" watchObservedRunningTime="2026-04-06 12:15:00.815438336 +0000 UTC m=+1079.803181202"
Apr 06 12:15:01 crc kubenswrapper[4790]: I0406 12:15:01.810421 4790 generic.go:334] "Generic (PLEG): container finished" podID="f8d86720-ff28-4773-966f-21968cb6d7f4" containerID="c7251a5006b8db817acb0c3310f4fdc9420ce0adab47122519f0bab267d1de83" exitCode=0
Apr 06 12:15:01 crc kubenswrapper[4790]: I0406 12:15:01.810706 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv" event={"ID":"f8d86720-ff28-4773-966f-21968cb6d7f4","Type":"ContainerDied","Data":"c7251a5006b8db817acb0c3310f4fdc9420ce0adab47122519f0bab267d1de83"}
Apr 06 12:15:02 crc kubenswrapper[4790]: I0406 12:15:02.991708 4790 scope.go:117] "RemoveContainer" containerID="9456a15313b46c7fb1911b9e91f811cf90ae1425ab434e20d38deab25e3bd0e4"
Apr 06 12:15:03 crc kubenswrapper[4790]: I0406 12:15:03.190972 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"
Apr 06 12:15:03 crc kubenswrapper[4790]: I0406 12:15:03.343050 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8d86720-ff28-4773-966f-21968cb6d7f4-secret-volume\") pod \"f8d86720-ff28-4773-966f-21968cb6d7f4\" (UID: \"f8d86720-ff28-4773-966f-21968cb6d7f4\") "
Apr 06 12:15:03 crc kubenswrapper[4790]: I0406 12:15:03.343169 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24xxm\" (UniqueName: \"kubernetes.io/projected/f8d86720-ff28-4773-966f-21968cb6d7f4-kube-api-access-24xxm\") pod \"f8d86720-ff28-4773-966f-21968cb6d7f4\" (UID: \"f8d86720-ff28-4773-966f-21968cb6d7f4\") "
Apr 06 12:15:03 crc kubenswrapper[4790]: I0406 12:15:03.343311 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8d86720-ff28-4773-966f-21968cb6d7f4-config-volume\") pod \"f8d86720-ff28-4773-966f-21968cb6d7f4\" (UID: \"f8d86720-ff28-4773-966f-21968cb6d7f4\") "
Apr 06 12:15:03 crc kubenswrapper[4790]: I0406 12:15:03.344247 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d86720-ff28-4773-966f-21968cb6d7f4-config-volume" (OuterVolumeSpecName: "config-volume") pod "f8d86720-ff28-4773-966f-21968cb6d7f4" (UID: "f8d86720-ff28-4773-966f-21968cb6d7f4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:15:03 crc kubenswrapper[4790]: I0406 12:15:03.351965 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8d86720-ff28-4773-966f-21968cb6d7f4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f8d86720-ff28-4773-966f-21968cb6d7f4" (UID: "f8d86720-ff28-4773-966f-21968cb6d7f4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:15:03 crc kubenswrapper[4790]: I0406 12:15:03.353636 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8d86720-ff28-4773-966f-21968cb6d7f4-kube-api-access-24xxm" (OuterVolumeSpecName: "kube-api-access-24xxm") pod "f8d86720-ff28-4773-966f-21968cb6d7f4" (UID: "f8d86720-ff28-4773-966f-21968cb6d7f4"). InnerVolumeSpecName "kube-api-access-24xxm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:15:03 crc kubenswrapper[4790]: I0406 12:15:03.444672 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8d86720-ff28-4773-966f-21968cb6d7f4-secret-volume\") on node \"crc\" DevicePath \"\""
Apr 06 12:15:03 crc kubenswrapper[4790]: I0406 12:15:03.444732 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24xxm\" (UniqueName: \"kubernetes.io/projected/f8d86720-ff28-4773-966f-21968cb6d7f4-kube-api-access-24xxm\") on node \"crc\" DevicePath \"\""
Apr 06 12:15:03 crc kubenswrapper[4790]: I0406 12:15:03.444741 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8d86720-ff28-4773-966f-21968cb6d7f4-config-volume\") on node \"crc\" DevicePath \"\""
Apr 06 12:15:03 crc kubenswrapper[4790]: I0406 12:15:03.825752 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv" event={"ID":"f8d86720-ff28-4773-966f-21968cb6d7f4","Type":"ContainerDied","Data":"e9b9ea0245185881ad201a22aaa96056bc899fd9a2f926823c28d5e1b1372a13"}
Apr 06 12:15:03 crc kubenswrapper[4790]: I0406 12:15:03.825807 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9b9ea0245185881ad201a22aaa96056bc899fd9a2f926823c28d5e1b1372a13"
Apr 06 12:15:03 crc kubenswrapper[4790]: I0406 12:15:03.825850 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"
Apr 06 12:15:09 crc kubenswrapper[4790]: I0406 12:15:09.753551 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 06 12:15:09 crc kubenswrapper[4790]: I0406 12:15:09.754095 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 06 12:15:09 crc kubenswrapper[4790]: I0406 12:15:09.754151 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t"
Apr 06 12:15:09 crc kubenswrapper[4790]: I0406 12:15:09.754886 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"06fc909b8c3bafca8e931d4d1184054009f97417e48b2959e6ed15949f791d43"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Apr 06 12:15:09 crc kubenswrapper[4790]: I0406 12:15:09.754953 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://06fc909b8c3bafca8e931d4d1184054009f97417e48b2959e6ed15949f791d43" gracePeriod=600
Apr 06 12:15:10 crc kubenswrapper[4790]: I0406 12:15:10.905333 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="06fc909b8c3bafca8e931d4d1184054009f97417e48b2959e6ed15949f791d43" exitCode=0
Apr 06 12:15:10 crc kubenswrapper[4790]: I0406 12:15:10.905369 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"06fc909b8c3bafca8e931d4d1184054009f97417e48b2959e6ed15949f791d43"}
Apr 06 12:15:10 crc kubenswrapper[4790]: I0406 12:15:10.905874 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"cff05230f5546b5cc2b0d23c73b2cbf6f7260f5baa2011041047c1d9270e1dc6"}
Apr 06 12:15:10 crc kubenswrapper[4790]: I0406 12:15:10.905903 4790 scope.go:117] "RemoveContainer" containerID="a600eeb7976d392fa3a056d87383315f86d322ed123278808846f0172ac67622"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.022945 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bcc684c66-wv5tw"]
Apr 06 12:15:16 crc kubenswrapper[4790]: E0406 12:15:16.023801 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d86720-ff28-4773-966f-21968cb6d7f4" containerName="collect-profiles"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.023818 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d86720-ff28-4773-966f-21968cb6d7f4" containerName="collect-profiles"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.024006 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d86720-ff28-4773-966f-21968cb6d7f4" containerName="collect-profiles"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.024539 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5bcc684c66-wv5tw"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.028148 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vbbkj"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.044917 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-78674bbc6b-48jqq"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.045920 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-78674bbc6b-48jqq"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.054154 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-k4mn9"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.059682 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-58689c6fff-tm8b2"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.060669 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58689c6fff-tm8b2"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.067974 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bcc684c66-wv5tw"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.069901 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hnstm"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.078585 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-78674bbc6b-48jqq"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.103633 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58689c6fff-tm8b2"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.128896 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8566787df9-l8dhs"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.129799 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8566787df9-l8dhs"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.136914 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b5d8f8697-hwvv8"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.137744 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b5d8f8697-hwvv8"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.152429 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-fxwrr"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.152821 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-qxfwx"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.155961 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b5d8f8697-hwvv8"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.174076 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8566787df9-l8dhs"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.184179 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6c5d8948dc-288vm"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.185070 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6c5d8948dc-288vm"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.189549 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-zb5jt"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.202019 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.208419 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.213340 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-wgx97"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.213534 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.247719 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpgqd\" (UniqueName: \"kubernetes.io/projected/5878c1d4-78cf-447f-b442-f7a9aa1aee99-kube-api-access-cpgqd\") pod \"cinder-operator-controller-manager-78674bbc6b-48jqq\" (UID: \"5878c1d4-78cf-447f-b442-f7a9aa1aee99\") " pod="openstack-operators/cinder-operator-controller-manager-78674bbc6b-48jqq"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.247877 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4kpj\" (UniqueName: \"kubernetes.io/projected/9430e8f1-17ec-4eff-8d9c-d54553956f8d-kube-api-access-h4kpj\") pod \"infra-operator-controller-manager-88ccbfc66-9pp57\" (UID: \"9430e8f1-17ec-4eff-8d9c-d54553956f8d\") " pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.247929 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngbf8\" (UniqueName: \"kubernetes.io/projected/a8b6d51d-4671-471f-94c6-b0a4b2c4a27d-kube-api-access-ngbf8\") pod \"heat-operator-controller-manager-5b5d8f8697-hwvv8\" (UID: \"a8b6d51d-4671-471f-94c6-b0a4b2c4a27d\") " pod="openstack-operators/heat-operator-controller-manager-5b5d8f8697-hwvv8"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.247983 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcktb\" (UniqueName: \"kubernetes.io/projected/16027ea9-802c-43ef-80ac-e2f66a2cc36b-kube-api-access-zcktb\") pod \"glance-operator-controller-manager-8566787df9-l8dhs\" (UID: \"16027ea9-802c-43ef-80ac-e2f66a2cc36b\") " pod="openstack-operators/glance-operator-controller-manager-8566787df9-l8dhs"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.248021 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert\") pod \"infra-operator-controller-manager-88ccbfc66-9pp57\" (UID: \"9430e8f1-17ec-4eff-8d9c-d54553956f8d\") " pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.257172 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtnn5\" (UniqueName: \"kubernetes.io/projected/c2958357-3518-4e74-8326-cfe8cf23334f-kube-api-access-wtnn5\") pod \"horizon-operator-controller-manager-6c5d8948dc-288vm\" (UID: \"c2958357-3518-4e74-8326-cfe8cf23334f\") " pod="openstack-operators/horizon-operator-controller-manager-6c5d8948dc-288vm"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.257247 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5mqn\" (UniqueName: \"kubernetes.io/projected/fb176575-b24c-4da4-a0f7-c5117d2c2ed7-kube-api-access-w5mqn\") pod \"designate-operator-controller-manager-58689c6fff-tm8b2\" (UID: \"fb176575-b24c-4da4-a0f7-c5117d2c2ed7\") " pod="openstack-operators/designate-operator-controller-manager-58689c6fff-tm8b2"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.257343 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn5ws\"
(UniqueName: \"kubernetes.io/projected/95b18d6e-ec5a-45e7-89c0-0f4618e4eb97-kube-api-access-tn5ws\") pod \"barbican-operator-controller-manager-5bcc684c66-wv5tw\" (UID: \"95b18d6e-ec5a-45e7-89c0-0f4618e4eb97\") " pod="openstack-operators/barbican-operator-controller-manager-5bcc684c66-wv5tw" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.266961 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.274780 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.277285 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rvbdr" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.314793 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-dbf8bb784-6vkf8"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.317275 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-6vkf8" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.319974 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-dr25w" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.346241 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fddf8d98f-qrxjw"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.347168 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5fddf8d98f-qrxjw" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.351372 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-q8m2p" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.358611 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4kpj\" (UniqueName: \"kubernetes.io/projected/9430e8f1-17ec-4eff-8d9c-d54553956f8d-kube-api-access-h4kpj\") pod \"infra-operator-controller-manager-88ccbfc66-9pp57\" (UID: \"9430e8f1-17ec-4eff-8d9c-d54553956f8d\") " pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.358653 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngbf8\" (UniqueName: \"kubernetes.io/projected/a8b6d51d-4671-471f-94c6-b0a4b2c4a27d-kube-api-access-ngbf8\") pod \"heat-operator-controller-manager-5b5d8f8697-hwvv8\" (UID: \"a8b6d51d-4671-471f-94c6-b0a4b2c4a27d\") " pod="openstack-operators/heat-operator-controller-manager-5b5d8f8697-hwvv8" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.358687 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcktb\" (UniqueName: \"kubernetes.io/projected/16027ea9-802c-43ef-80ac-e2f66a2cc36b-kube-api-access-zcktb\") pod \"glance-operator-controller-manager-8566787df9-l8dhs\" (UID: \"16027ea9-802c-43ef-80ac-e2f66a2cc36b\") " pod="openstack-operators/glance-operator-controller-manager-8566787df9-l8dhs" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.358706 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert\") pod \"infra-operator-controller-manager-88ccbfc66-9pp57\" (UID: 
\"9430e8f1-17ec-4eff-8d9c-d54553956f8d\") " pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57" Apr 06 12:15:16 crc kubenswrapper[4790]: E0406 12:15:16.359811 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Apr 06 12:15:16 crc kubenswrapper[4790]: E0406 12:15:16.359898 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert podName:9430e8f1-17ec-4eff-8d9c-d54553956f8d nodeName:}" failed. No retries permitted until 2026-04-06 12:15:16.859881999 +0000 UTC m=+1095.847624865 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert") pod "infra-operator-controller-manager-88ccbfc66-9pp57" (UID: "9430e8f1-17ec-4eff-8d9c-d54553956f8d") : secret "infra-operator-webhook-server-cert" not found Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.360020 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdsc5\" (UniqueName: \"kubernetes.io/projected/ef128b3d-ea70-46cc-8928-0a557b6fbf5d-kube-api-access-hdsc5\") pod \"manila-operator-controller-manager-5fddf8d98f-qrxjw\" (UID: \"ef128b3d-ea70-46cc-8928-0a557b6fbf5d\") " pod="openstack-operators/manila-operator-controller-manager-5fddf8d98f-qrxjw" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.360116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnn5\" (UniqueName: \"kubernetes.io/projected/c2958357-3518-4e74-8326-cfe8cf23334f-kube-api-access-wtnn5\") pod \"horizon-operator-controller-manager-6c5d8948dc-288vm\" (UID: \"c2958357-3518-4e74-8326-cfe8cf23334f\") " pod="openstack-operators/horizon-operator-controller-manager-6c5d8948dc-288vm" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.360172 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5mqn\" (UniqueName: \"kubernetes.io/projected/fb176575-b24c-4da4-a0f7-c5117d2c2ed7-kube-api-access-w5mqn\") pod \"designate-operator-controller-manager-58689c6fff-tm8b2\" (UID: \"fb176575-b24c-4da4-a0f7-c5117d2c2ed7\") " pod="openstack-operators/designate-operator-controller-manager-58689c6fff-tm8b2" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.360260 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl5lm\" (UniqueName: \"kubernetes.io/projected/75ce52d5-3320-40e1-8d64-42d12e2fa4c8-kube-api-access-jl5lm\") pod \"ironic-operator-controller-manager-6b9c989bb6-z8t6s\" (UID: \"75ce52d5-3320-40e1-8d64-42d12e2fa4c8\") " pod="openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.360301 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn5ws\" (UniqueName: \"kubernetes.io/projected/95b18d6e-ec5a-45e7-89c0-0f4618e4eb97-kube-api-access-tn5ws\") pod \"barbican-operator-controller-manager-5bcc684c66-wv5tw\" (UID: \"95b18d6e-ec5a-45e7-89c0-0f4618e4eb97\") " pod="openstack-operators/barbican-operator-controller-manager-5bcc684c66-wv5tw" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.360327 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpgqd\" (UniqueName: \"kubernetes.io/projected/5878c1d4-78cf-447f-b442-f7a9aa1aee99-kube-api-access-cpgqd\") pod \"cinder-operator-controller-manager-78674bbc6b-48jqq\" (UID: \"5878c1d4-78cf-447f-b442-f7a9aa1aee99\") " pod="openstack-operators/cinder-operator-controller-manager-78674bbc6b-48jqq" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.360356 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7qfh\" (UniqueName: 
\"kubernetes.io/projected/30132a58-4c7d-4761-b73b-6d0ee27ea74e-kube-api-access-l7qfh\") pod \"keystone-operator-controller-manager-dbf8bb784-6vkf8\" (UID: \"30132a58-4c7d-4761-b73b-6d0ee27ea74e\") " pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-6vkf8" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.389695 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtnn5\" (UniqueName: \"kubernetes.io/projected/c2958357-3518-4e74-8326-cfe8cf23334f-kube-api-access-wtnn5\") pod \"horizon-operator-controller-manager-6c5d8948dc-288vm\" (UID: \"c2958357-3518-4e74-8326-cfe8cf23334f\") " pod="openstack-operators/horizon-operator-controller-manager-6c5d8948dc-288vm" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.390986 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngbf8\" (UniqueName: \"kubernetes.io/projected/a8b6d51d-4671-471f-94c6-b0a4b2c4a27d-kube-api-access-ngbf8\") pod \"heat-operator-controller-manager-5b5d8f8697-hwvv8\" (UID: \"a8b6d51d-4671-471f-94c6-b0a4b2c4a27d\") " pod="openstack-operators/heat-operator-controller-manager-5b5d8f8697-hwvv8" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.391578 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcktb\" (UniqueName: \"kubernetes.io/projected/16027ea9-802c-43ef-80ac-e2f66a2cc36b-kube-api-access-zcktb\") pod \"glance-operator-controller-manager-8566787df9-l8dhs\" (UID: \"16027ea9-802c-43ef-80ac-e2f66a2cc36b\") " pod="openstack-operators/glance-operator-controller-manager-8566787df9-l8dhs" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.394511 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4kpj\" (UniqueName: \"kubernetes.io/projected/9430e8f1-17ec-4eff-8d9c-d54553956f8d-kube-api-access-h4kpj\") pod \"infra-operator-controller-manager-88ccbfc66-9pp57\" (UID: \"9430e8f1-17ec-4eff-8d9c-d54553956f8d\") " 
pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.397733 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn5ws\" (UniqueName: \"kubernetes.io/projected/95b18d6e-ec5a-45e7-89c0-0f4618e4eb97-kube-api-access-tn5ws\") pod \"barbican-operator-controller-manager-5bcc684c66-wv5tw\" (UID: \"95b18d6e-ec5a-45e7-89c0-0f4618e4eb97\") " pod="openstack-operators/barbican-operator-controller-manager-5bcc684c66-wv5tw" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.397941 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpgqd\" (UniqueName: \"kubernetes.io/projected/5878c1d4-78cf-447f-b442-f7a9aa1aee99-kube-api-access-cpgqd\") pod \"cinder-operator-controller-manager-78674bbc6b-48jqq\" (UID: \"5878c1d4-78cf-447f-b442-f7a9aa1aee99\") " pod="openstack-operators/cinder-operator-controller-manager-78674bbc6b-48jqq" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.404015 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6c5d8948dc-288vm"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.416665 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.420553 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5mqn\" (UniqueName: \"kubernetes.io/projected/fb176575-b24c-4da4-a0f7-c5117d2c2ed7-kube-api-access-w5mqn\") pod \"designate-operator-controller-manager-58689c6fff-tm8b2\" (UID: \"fb176575-b24c-4da4-a0f7-c5117d2c2ed7\") " pod="openstack-operators/designate-operator-controller-manager-58689c6fff-tm8b2" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.423200 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-dbf8bb784-6vkf8"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.433019 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.434117 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.437249 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-z8lkb" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.447936 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.454820 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.461467 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5d6t\" (UniqueName: \"kubernetes.io/projected/b3d53b02-ceb5-46c3-820d-8a3b5c65f9e8-kube-api-access-f5d6t\") pod \"mariadb-operator-controller-manager-765cb856bd-7vfjz\" (UID: \"b3d53b02-ceb5-46c3-820d-8a3b5c65f9e8\") " pod="openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.461572 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdsc5\" (UniqueName: \"kubernetes.io/projected/ef128b3d-ea70-46cc-8928-0a557b6fbf5d-kube-api-access-hdsc5\") pod \"manila-operator-controller-manager-5fddf8d98f-qrxjw\" (UID: \"ef128b3d-ea70-46cc-8928-0a557b6fbf5d\") " 
pod="openstack-operators/manila-operator-controller-manager-5fddf8d98f-qrxjw" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.461619 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl5lm\" (UniqueName: \"kubernetes.io/projected/75ce52d5-3320-40e1-8d64-42d12e2fa4c8-kube-api-access-jl5lm\") pod \"ironic-operator-controller-manager-6b9c989bb6-z8t6s\" (UID: \"75ce52d5-3320-40e1-8d64-42d12e2fa4c8\") " pod="openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.461654 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7qfh\" (UniqueName: \"kubernetes.io/projected/30132a58-4c7d-4761-b73b-6d0ee27ea74e-kube-api-access-l7qfh\") pod \"keystone-operator-controller-manager-dbf8bb784-6vkf8\" (UID: \"30132a58-4c7d-4761-b73b-6d0ee27ea74e\") " pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-6vkf8" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.467051 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-9bdbb8fd8-r64xk"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.467949 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-9bdbb8fd8-r64xk" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.472746 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-64744474b-lxkq7"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.474707 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64744474b-lxkq7" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.478412 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-4mmpc" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.479328 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-5nrsg" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.481807 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fddf8d98f-qrxjw"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.483999 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8566787df9-l8dhs" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.486121 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b5d8f8697-hwvv8" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.490771 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl5lm\" (UniqueName: \"kubernetes.io/projected/75ce52d5-3320-40e1-8d64-42d12e2fa4c8-kube-api-access-jl5lm\") pod \"ironic-operator-controller-manager-6b9c989bb6-z8t6s\" (UID: \"75ce52d5-3320-40e1-8d64-42d12e2fa4c8\") " pod="openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.499178 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdsc5\" (UniqueName: \"kubernetes.io/projected/ef128b3d-ea70-46cc-8928-0a557b6fbf5d-kube-api-access-hdsc5\") pod \"manila-operator-controller-manager-5fddf8d98f-qrxjw\" (UID: \"ef128b3d-ea70-46cc-8928-0a557b6fbf5d\") " pod="openstack-operators/manila-operator-controller-manager-5fddf8d98f-qrxjw" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.499283 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64744474b-lxkq7"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.501350 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7qfh\" (UniqueName: \"kubernetes.io/projected/30132a58-4c7d-4761-b73b-6d0ee27ea74e-kube-api-access-l7qfh\") pod \"keystone-operator-controller-manager-dbf8bb784-6vkf8\" (UID: \"30132a58-4c7d-4761-b73b-6d0ee27ea74e\") " pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-6vkf8" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.502049 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6c5d8948dc-288vm" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.505984 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.507918 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.510179 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-zncnw" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.515561 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-9bdbb8fd8-r64xk"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.528404 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.537232 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.538249 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.540109 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.541259 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-dlpxw" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.562950 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnvs9\" (UniqueName: \"kubernetes.io/projected/f9c22535-24c4-416f-98ef-fcd0299921c4-kube-api-access-xnvs9\") pod \"openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw\" (UID: \"f9c22535-24c4-416f-98ef-fcd0299921c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.562997 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert\") pod \"openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw\" (UID: \"f9c22535-24c4-416f-98ef-fcd0299921c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.563032 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5d6t\" (UniqueName: \"kubernetes.io/projected/b3d53b02-ceb5-46c3-820d-8a3b5c65f9e8-kube-api-access-f5d6t\") pod \"mariadb-operator-controller-manager-765cb856bd-7vfjz\" (UID: \"b3d53b02-ceb5-46c3-820d-8a3b5c65f9e8\") " pod="openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz" Apr 06 12:15:16 crc 
kubenswrapper[4790]: I0406 12:15:16.563118 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fxzd\" (UniqueName: \"kubernetes.io/projected/6d9a1f3c-f00e-498c-ae3f-3af6c407d051-kube-api-access-8fxzd\") pod \"octavia-operator-controller-manager-7594f57946-dlxrd\" (UID: \"6d9a1f3c-f00e-498c-ae3f-3af6c407d051\") " pod="openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.563155 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d7p5\" (UniqueName: \"kubernetes.io/projected/82a102f6-1fb6-4f30-8f0e-d3c4352b187e-kube-api-access-4d7p5\") pod \"neutron-operator-controller-manager-9bdbb8fd8-r64xk\" (UID: \"82a102f6-1fb6-4f30-8f0e-d3c4352b187e\") " pod="openstack-operators/neutron-operator-controller-manager-9bdbb8fd8-r64xk" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.563175 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qhqp\" (UniqueName: \"kubernetes.io/projected/2f59b367-b3b9-467b-b190-5492ec84d98c-kube-api-access-8qhqp\") pod \"nova-operator-controller-manager-64744474b-lxkq7\" (UID: \"2f59b367-b3b9-467b-b190-5492ec84d98c\") " pod="openstack-operators/nova-operator-controller-manager-64744474b-lxkq7" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.566180 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-565fbbfdc9-msh7n"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.567319 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-565fbbfdc9-msh7n" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.569160 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-65qjq" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.575815 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.578426 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.580037 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-ngmv6" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.585355 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5d6t\" (UniqueName: \"kubernetes.io/projected/b3d53b02-ceb5-46c3-820d-8a3b5c65f9e8-kube-api-access-f5d6t\") pod \"mariadb-operator-controller-manager-765cb856bd-7vfjz\" (UID: \"b3d53b02-ceb5-46c3-820d-8a3b5c65f9e8\") " pod="openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz" Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.585413 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-565fbbfdc9-msh7n"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.594755 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw"] Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.610733 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c4dd9cdf6-kjg4j"] Apr 06 12:15:16 crc 
kubenswrapper[4790]: I0406 12:15:16.611906 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5c4dd9cdf6-kjg4j"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.617330 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.621534 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.626302 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-hrnhn"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.649167 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5bcc684c66-wv5tw"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.665936 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-78674bbc6b-48jqq"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.666682 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnvs9\" (UniqueName: \"kubernetes.io/projected/f9c22535-24c4-416f-98ef-fcd0299921c4-kube-api-access-xnvs9\") pod \"openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw\" (UID: \"f9c22535-24c4-416f-98ef-fcd0299921c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.666728 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl2p7\" (UniqueName: \"kubernetes.io/projected/30dc85a8-d293-4324-b4af-f3b7731a5060-kube-api-access-nl2p7\") pod \"ovn-operator-controller-manager-565fbbfdc9-msh7n\" (UID: \"30dc85a8-d293-4324-b4af-f3b7731a5060\") " pod="openstack-operators/ovn-operator-controller-manager-565fbbfdc9-msh7n"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.666756 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert\") pod \"openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw\" (UID: \"f9c22535-24c4-416f-98ef-fcd0299921c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.666807 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsltw\" (UniqueName: \"kubernetes.io/projected/96d39030-fa50-4568-a068-af079a592dc0-kube-api-access-qsltw\") pod \"placement-operator-controller-manager-559d8fdb6b-9t6v2\" (UID: \"96d39030-fa50-4568-a068-af079a592dc0\") " pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.666882 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fxzd\" (UniqueName: \"kubernetes.io/projected/6d9a1f3c-f00e-498c-ae3f-3af6c407d051-kube-api-access-8fxzd\") pod \"octavia-operator-controller-manager-7594f57946-dlxrd\" (UID: \"6d9a1f3c-f00e-498c-ae3f-3af6c407d051\") " pod="openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.666914 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv2g8\" (UniqueName: \"kubernetes.io/projected/c9e8a19f-ad0a-45ed-a45f-240d2e5d187b-kube-api-access-kv2g8\") pod \"swift-operator-controller-manager-5c4dd9cdf6-kjg4j\" (UID: \"c9e8a19f-ad0a-45ed-a45f-240d2e5d187b\") " pod="openstack-operators/swift-operator-controller-manager-5c4dd9cdf6-kjg4j"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.666960 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d7p5\" (UniqueName: \"kubernetes.io/projected/82a102f6-1fb6-4f30-8f0e-d3c4352b187e-kube-api-access-4d7p5\") pod \"neutron-operator-controller-manager-9bdbb8fd8-r64xk\" (UID: \"82a102f6-1fb6-4f30-8f0e-d3c4352b187e\") " pod="openstack-operators/neutron-operator-controller-manager-9bdbb8fd8-r64xk"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.666988 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qhqp\" (UniqueName: \"kubernetes.io/projected/2f59b367-b3b9-467b-b190-5492ec84d98c-kube-api-access-8qhqp\") pod \"nova-operator-controller-manager-64744474b-lxkq7\" (UID: \"2f59b367-b3b9-467b-b190-5492ec84d98c\") " pod="openstack-operators/nova-operator-controller-manager-64744474b-lxkq7"
Apr 06 12:15:16 crc kubenswrapper[4790]: E0406 12:15:16.667526 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Apr 06 12:15:16 crc kubenswrapper[4790]: E0406 12:15:16.667572 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert podName:f9c22535-24c4-416f-98ef-fcd0299921c4 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:17.16755685 +0000 UTC m=+1096.155299716 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert") pod "openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" (UID: "f9c22535-24c4-416f-98ef-fcd0299921c4") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.668883 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-6vkf8"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.669431 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5fddf8d98f-qrxjw"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.673313 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c4dd9cdf6-kjg4j"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.674608 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58689c6fff-tm8b2"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.693135 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qhqp\" (UniqueName: \"kubernetes.io/projected/2f59b367-b3b9-467b-b190-5492ec84d98c-kube-api-access-8qhqp\") pod \"nova-operator-controller-manager-64744474b-lxkq7\" (UID: \"2f59b367-b3b9-467b-b190-5492ec84d98c\") " pod="openstack-operators/nova-operator-controller-manager-64744474b-lxkq7"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.701261 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnvs9\" (UniqueName: \"kubernetes.io/projected/f9c22535-24c4-416f-98ef-fcd0299921c4-kube-api-access-xnvs9\") pod \"openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw\" (UID: \"f9c22535-24c4-416f-98ef-fcd0299921c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.722518 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d7p5\" (UniqueName: \"kubernetes.io/projected/82a102f6-1fb6-4f30-8f0e-d3c4352b187e-kube-api-access-4d7p5\") pod \"neutron-operator-controller-manager-9bdbb8fd8-r64xk\" (UID: \"82a102f6-1fb6-4f30-8f0e-d3c4352b187e\") " pod="openstack-operators/neutron-operator-controller-manager-9bdbb8fd8-r64xk"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.723333 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fxzd\" (UniqueName: \"kubernetes.io/projected/6d9a1f3c-f00e-498c-ae3f-3af6c407d051-kube-api-access-8fxzd\") pod \"octavia-operator-controller-manager-7594f57946-dlxrd\" (UID: \"6d9a1f3c-f00e-498c-ae3f-3af6c407d051\") " pod="openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.741071 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.742560 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.745991 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-fw49w"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.752204 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.761421 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.801004 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl2p7\" (UniqueName: \"kubernetes.io/projected/30dc85a8-d293-4324-b4af-f3b7731a5060-kube-api-access-nl2p7\") pod \"ovn-operator-controller-manager-565fbbfdc9-msh7n\" (UID: \"30dc85a8-d293-4324-b4af-f3b7731a5060\") " pod="openstack-operators/ovn-operator-controller-manager-565fbbfdc9-msh7n"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.801079 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsltw\" (UniqueName: \"kubernetes.io/projected/96d39030-fa50-4568-a068-af079a592dc0-kube-api-access-qsltw\") pod \"placement-operator-controller-manager-559d8fdb6b-9t6v2\" (UID: \"96d39030-fa50-4568-a068-af079a592dc0\") " pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.801162 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tdmh\" (UniqueName: \"kubernetes.io/projected/7a8fad23-b18e-4933-af57-3e06aee00225-kube-api-access-7tdmh\") pod \"telemetry-operator-controller-manager-569745d4d8-ddglf\" (UID: \"7a8fad23-b18e-4933-af57-3e06aee00225\") " pod="openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.801190 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv2g8\" (UniqueName: \"kubernetes.io/projected/c9e8a19f-ad0a-45ed-a45f-240d2e5d187b-kube-api-access-kv2g8\") pod \"swift-operator-controller-manager-5c4dd9cdf6-kjg4j\" (UID: \"c9e8a19f-ad0a-45ed-a45f-240d2e5d187b\") " pod="openstack-operators/swift-operator-controller-manager-5c4dd9cdf6-kjg4j"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.826621 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl2p7\" (UniqueName: \"kubernetes.io/projected/30dc85a8-d293-4324-b4af-f3b7731a5060-kube-api-access-nl2p7\") pod \"ovn-operator-controller-manager-565fbbfdc9-msh7n\" (UID: \"30dc85a8-d293-4324-b4af-f3b7731a5060\") " pod="openstack-operators/ovn-operator-controller-manager-565fbbfdc9-msh7n"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.830443 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv2g8\" (UniqueName: \"kubernetes.io/projected/c9e8a19f-ad0a-45ed-a45f-240d2e5d187b-kube-api-access-kv2g8\") pod \"swift-operator-controller-manager-5c4dd9cdf6-kjg4j\" (UID: \"c9e8a19f-ad0a-45ed-a45f-240d2e5d187b\") " pod="openstack-operators/swift-operator-controller-manager-5c4dd9cdf6-kjg4j"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.834028 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsltw\" (UniqueName: \"kubernetes.io/projected/96d39030-fa50-4568-a068-af079a592dc0-kube-api-access-qsltw\") pod \"placement-operator-controller-manager-559d8fdb6b-9t6v2\" (UID: \"96d39030-fa50-4568-a068-af079a592dc0\") " pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.860126 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.861207 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.864815 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-fdk8b"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.874984 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-9bdbb8fd8-r64xk"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.883527 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.885561 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64744474b-lxkq7"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.904343 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.905590 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert\") pod \"infra-operator-controller-manager-88ccbfc66-9pp57\" (UID: \"9430e8f1-17ec-4eff-8d9c-d54553956f8d\") " pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.905648 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tdmh\" (UniqueName: \"kubernetes.io/projected/7a8fad23-b18e-4933-af57-3e06aee00225-kube-api-access-7tdmh\") pod \"telemetry-operator-controller-manager-569745d4d8-ddglf\" (UID: \"7a8fad23-b18e-4933-af57-3e06aee00225\") " pod="openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf"
Apr 06 12:15:16 crc kubenswrapper[4790]: E0406 12:15:16.905872 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Apr 06 12:15:16 crc kubenswrapper[4790]: E0406 12:15:16.905937 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert podName:9430e8f1-17ec-4eff-8d9c-d54553956f8d nodeName:}" failed. No retries permitted until 2026-04-06 12:15:17.905915567 +0000 UTC m=+1096.893658483 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert") pod "infra-operator-controller-manager-88ccbfc66-9pp57" (UID: "9430e8f1-17ec-4eff-8d9c-d54553956f8d") : secret "infra-operator-webhook-server-cert" not found
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.910556 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.911647 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.914202 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-8d85j"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.931502 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tdmh\" (UniqueName: \"kubernetes.io/projected/7a8fad23-b18e-4933-af57-3e06aee00225-kube-api-access-7tdmh\") pod \"telemetry-operator-controller-manager-569745d4d8-ddglf\" (UID: \"7a8fad23-b18e-4933-af57-3e06aee00225\") " pod="openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.946422 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.954914 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-565fbbfdc9-msh7n"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.962461 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2"]
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.963730 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.968429 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rgw7m"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.968616 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.971134 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.974259 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Apr 06 12:15:16 crc kubenswrapper[4790]: I0406 12:15:16.996021 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2"]
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.007075 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz"]
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.008324 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.008607 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j78mq\" (UniqueName: \"kubernetes.io/projected/d4596119-7d30-4307-8815-4355fc5ee6eb-kube-api-access-j78mq\") pod \"test-operator-controller-manager-56bf57d759-dq7bm\" (UID: \"d4596119-7d30-4307-8815-4355fc5ee6eb\") " pod="openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.008700 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78cjd\" (UniqueName: \"kubernetes.io/projected/56eefd7b-c275-40a3-8772-03ffc350736e-kube-api-access-78cjd\") pod \"watcher-operator-controller-manager-75df5978c-vvf85\" (UID: \"56eefd7b-c275-40a3-8772-03ffc350736e\") " pod="openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.011931 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-kxn6d"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.013441 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5c4dd9cdf6-kjg4j"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.013618 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz"]
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.054308 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b5d8f8697-hwvv8"]
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.116988 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q58mk\" (UniqueName: \"kubernetes.io/projected/bb09d720-75af-43a7-90dd-e497d2933183-kube-api-access-q58mk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-d46gz\" (UID: \"bb09d720-75af-43a7-90dd-e497d2933183\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.117095 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j78mq\" (UniqueName: \"kubernetes.io/projected/d4596119-7d30-4307-8815-4355fc5ee6eb-kube-api-access-j78mq\") pod \"test-operator-controller-manager-56bf57d759-dq7bm\" (UID: \"d4596119-7d30-4307-8815-4355fc5ee6eb\") " pod="openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.117131 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b678b\" (UniqueName: \"kubernetes.io/projected/61665396-3382-4fa4-8d6a-706f47b2c5b0-kube-api-access-b678b\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.117947 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78cjd\" (UniqueName: \"kubernetes.io/projected/56eefd7b-c275-40a3-8772-03ffc350736e-kube-api-access-78cjd\") pod \"watcher-operator-controller-manager-75df5978c-vvf85\" (UID: \"56eefd7b-c275-40a3-8772-03ffc350736e\") " pod="openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.117985 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.118013 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.154223 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.154953 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j78mq\" (UniqueName: \"kubernetes.io/projected/d4596119-7d30-4307-8815-4355fc5ee6eb-kube-api-access-j78mq\") pod \"test-operator-controller-manager-56bf57d759-dq7bm\" (UID: \"d4596119-7d30-4307-8815-4355fc5ee6eb\") " pod="openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.155086 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78cjd\" (UniqueName: \"kubernetes.io/projected/56eefd7b-c275-40a3-8772-03ffc350736e-kube-api-access-78cjd\") pod \"watcher-operator-controller-manager-75df5978c-vvf85\" (UID: \"56eefd7b-c275-40a3-8772-03ffc350736e\") " pod="openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.196648 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s"]
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.219381 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.219428 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.219473 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert\") pod \"openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw\" (UID: \"f9c22535-24c4-416f-98ef-fcd0299921c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.219496 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q58mk\" (UniqueName: \"kubernetes.io/projected/bb09d720-75af-43a7-90dd-e497d2933183-kube-api-access-q58mk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-d46gz\" (UID: \"bb09d720-75af-43a7-90dd-e497d2933183\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.219553 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b678b\" (UniqueName: \"kubernetes.io/projected/61665396-3382-4fa4-8d6a-706f47b2c5b0-kube-api-access-b678b\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2"
Apr 06 12:15:17 crc kubenswrapper[4790]: E0406 12:15:17.220308 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Apr 06 12:15:17 crc kubenswrapper[4790]: E0406 12:15:17.220371 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs podName:61665396-3382-4fa4-8d6a-706f47b2c5b0 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:17.720356962 +0000 UTC m=+1096.708099828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs") pod "openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" (UID: "61665396-3382-4fa4-8d6a-706f47b2c5b0") : secret "metrics-server-cert" not found
Apr 06 12:15:17 crc kubenswrapper[4790]: E0406 12:15:17.220504 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Apr 06 12:15:17 crc kubenswrapper[4790]: E0406 12:15:17.220527 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs podName:61665396-3382-4fa4-8d6a-706f47b2c5b0 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:17.720520067 +0000 UTC m=+1096.708262933 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs") pod "openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" (UID: "61665396-3382-4fa4-8d6a-706f47b2c5b0") : secret "webhook-server-cert" not found
Apr 06 12:15:17 crc kubenswrapper[4790]: E0406 12:15:17.220562 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Apr 06 12:15:17 crc kubenswrapper[4790]: E0406 12:15:17.220582 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert podName:f9c22535-24c4-416f-98ef-fcd0299921c4 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:18.220575428 +0000 UTC m=+1097.208318294 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert") pod "openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" (UID: "f9c22535-24c4-416f-98ef-fcd0299921c4") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.236924 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.260386 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.269101 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8566787df9-l8dhs"]
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.274217 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q58mk\" (UniqueName: \"kubernetes.io/projected/bb09d720-75af-43a7-90dd-e497d2933183-kube-api-access-q58mk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-d46gz\" (UID: \"bb09d720-75af-43a7-90dd-e497d2933183\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.274487 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b678b\" (UniqueName: \"kubernetes.io/projected/61665396-3382-4fa4-8d6a-706f47b2c5b0-kube-api-access-b678b\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.285637 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6c5d8948dc-288vm"]
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.299183 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fddf8d98f-qrxjw"]
Apr 06 12:15:17 crc kubenswrapper[4790]: W0406 12:15:17.342247 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16027ea9_802c_43ef_80ac_e2f66a2cc36b.slice/crio-029a79dc1fceb9ce690e1bfad6b8421749256d3ee19c8924e5ee083f6752d944 WatchSource:0}: Error finding container 029a79dc1fceb9ce690e1bfad6b8421749256d3ee19c8924e5ee083f6752d944: Status 404 returned error can't find the container with id 029a79dc1fceb9ce690e1bfad6b8421749256d3ee19c8924e5ee083f6752d944
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.394714 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.522095 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bcc684c66-wv5tw"]
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.741327 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2"
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.741391 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2"
Apr 06 12:15:17 crc kubenswrapper[4790]: E0406 12:15:17.741621 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Apr 06 12:15:17 crc kubenswrapper[4790]: E0406 12:15:17.741678 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs podName:61665396-3382-4fa4-8d6a-706f47b2c5b0 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:18.741659699 +0000 UTC m=+1097.729402575 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs") pod "openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" (UID: "61665396-3382-4fa4-8d6a-706f47b2c5b0") : secret "webhook-server-cert" not found
Apr 06 12:15:17 crc kubenswrapper[4790]: E0406 12:15:17.742054 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Apr 06 12:15:17 crc kubenswrapper[4790]: E0406 12:15:17.742087 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs podName:61665396-3382-4fa4-8d6a-706f47b2c5b0 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:18.74207899 +0000 UTC m=+1097.729821856 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs") pod "openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" (UID: "61665396-3382-4fa4-8d6a-706f47b2c5b0") : secret "metrics-server-cert" not found
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.948451 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert\") pod \"infra-operator-controller-manager-88ccbfc66-9pp57\" (UID: \"9430e8f1-17ec-4eff-8d9c-d54553956f8d\") " pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57"
Apr 06 12:15:17 crc kubenswrapper[4790]: E0406 12:15:17.948649 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Apr 06 12:15:17 crc kubenswrapper[4790]: E0406 12:15:17.948706 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert podName:9430e8f1-17ec-4eff-8d9c-d54553956f8d nodeName:}" failed. No retries permitted until 2026-04-06 12:15:19.948689495 +0000 UTC m=+1098.936432361 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert") pod "infra-operator-controller-manager-88ccbfc66-9pp57" (UID: "9430e8f1-17ec-4eff-8d9c-d54553956f8d") : secret "infra-operator-webhook-server-cert" not found
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.955863 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64744474b-lxkq7"]
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.961199 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-dbf8bb784-6vkf8"]
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.967817 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58689c6fff-tm8b2"]
Apr 06 12:15:17 crc kubenswrapper[4790]: W0406 12:15:17.980353 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30132a58_4c7d_4761_b73b_6d0ee27ea74e.slice/crio-9e32cd86773c1063f9a7164d04acaaf0769ccf562b357f91ac9bd2f0146bb0b7 WatchSource:0}: Error finding container 9e32cd86773c1063f9a7164d04acaaf0769ccf562b357f91ac9bd2f0146bb0b7: Status 404 returned error can't find the container with id 9e32cd86773c1063f9a7164d04acaaf0769ccf562b357f91ac9bd2f0146bb0b7
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.980433 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bcc684c66-wv5tw" event={"ID":"95b18d6e-ec5a-45e7-89c0-0f4618e4eb97","Type":"ContainerStarted","Data":"a80e09df4008fbc4d357ad187cad164a812f807c24d741fd00b36c914b771023"}
Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.982204 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s"
event={"ID":"75ce52d5-3320-40e1-8d64-42d12e2fa4c8","Type":"ContainerStarted","Data":"a469c04313c15917eac79a22fc37a2f067ff3d0d777d75a5b7db4132d3e40d12"} Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.986453 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5fddf8d98f-qrxjw" event={"ID":"ef128b3d-ea70-46cc-8928-0a557b6fbf5d","Type":"ContainerStarted","Data":"55e5dd7cf0b08f5ecaa00b68fd0e9381ce5e53b9c6a5e3d74d8e3d41d9fe80ef"} Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.989029 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b5d8f8697-hwvv8" event={"ID":"a8b6d51d-4671-471f-94c6-b0a4b2c4a27d","Type":"ContainerStarted","Data":"3dc647b01e4979f38b5bc7afb6891163c996dd6437382d20ba18e5dbe7d410bd"} Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.990400 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64744474b-lxkq7" event={"ID":"2f59b367-b3b9-467b-b190-5492ec84d98c","Type":"ContainerStarted","Data":"46bf38869af771bb464020d0457041c2b6a4a45d0d8e91939436dd819871cbdb"} Apr 06 12:15:17 crc kubenswrapper[4790]: I0406 12:15:17.992356 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6c5d8948dc-288vm" event={"ID":"c2958357-3518-4e74-8326-cfe8cf23334f","Type":"ContainerStarted","Data":"2fb45a32411d0aa754489ecadcd4f6c2319ead7fe261dba445fce7b5d630c9c6"} Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.000092 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8566787df9-l8dhs" event={"ID":"16027ea9-802c-43ef-80ac-e2f66a2cc36b","Type":"ContainerStarted","Data":"029a79dc1fceb9ce690e1bfad6b8421749256d3ee19c8924e5ee083f6752d944"} Apr 06 12:15:18 crc kubenswrapper[4790]: W0406 12:15:18.027668 4790 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb176575_b24c_4da4_a0f7_c5117d2c2ed7.slice/crio-ab873a85a0bcb801cdc0fb1c22dc7a2b1b7c52b070f9733d946cf81039c2cd89 WatchSource:0}: Error finding container ab873a85a0bcb801cdc0fb1c22dc7a2b1b7c52b070f9733d946cf81039c2cd89: Status 404 returned error can't find the container with id ab873a85a0bcb801cdc0fb1c22dc7a2b1b7c52b070f9733d946cf81039c2cd89 Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.267185 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert\") pod \"openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw\" (UID: \"f9c22535-24c4-416f-98ef-fcd0299921c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" Apr 06 12:15:18 crc kubenswrapper[4790]: E0406 12:15:18.267393 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Apr 06 12:15:18 crc kubenswrapper[4790]: E0406 12:15:18.267603 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert podName:f9c22535-24c4-416f-98ef-fcd0299921c4 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:20.267578841 +0000 UTC m=+1099.255321697 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert") pod "openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" (UID: "f9c22535-24c4-416f-98ef-fcd0299921c4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.354980 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-9bdbb8fd8-r64xk"] Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.373000 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz"] Apr 06 12:15:18 crc kubenswrapper[4790]: W0406 12:15:18.394068 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9e8a19f_ad0a_45ed_a45f_240d2e5d187b.slice/crio-ff0b8ee6bde3bdc613011805f38ac3866cae83c2fd294007cdc05f7b015df795 WatchSource:0}: Error finding container ff0b8ee6bde3bdc613011805f38ac3866cae83c2fd294007cdc05f7b015df795: Status 404 returned error can't find the container with id ff0b8ee6bde3bdc613011805f38ac3866cae83c2fd294007cdc05f7b015df795 Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.396849 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-78674bbc6b-48jqq"] Apr 06 12:15:18 crc kubenswrapper[4790]: W0406 12:15:18.397705 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a8fad23_b18e_4933_af57_3e06aee00225.slice/crio-db6883d004858473c908a263f4989ec67bf8d6ae16cd355f1234f89ee794dd21 WatchSource:0}: Error finding container db6883d004858473c908a263f4989ec67bf8d6ae16cd355f1234f89ee794dd21: Status 404 returned error can't find the container with id db6883d004858473c908a263f4989ec67bf8d6ae16cd355f1234f89ee794dd21 Apr 06 
12:15:18 crc kubenswrapper[4790]: W0406 12:15:18.400361 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5878c1d4_78cf_447f_b442_f7a9aa1aee99.slice/crio-a4122b55b6415db7a78e144e96168d66c3b3ebb28e826daf74ebafa41b7b52f8 WatchSource:0}: Error finding container a4122b55b6415db7a78e144e96168d66c3b3ebb28e826daf74ebafa41b7b52f8: Status 404 returned error can't find the container with id a4122b55b6415db7a78e144e96168d66c3b3ebb28e826daf74ebafa41b7b52f8 Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.403325 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-565fbbfdc9-msh7n"] Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.408714 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c4dd9cdf6-kjg4j"] Apr 06 12:15:18 crc kubenswrapper[4790]: W0406 12:15:18.411013 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82a102f6_1fb6_4f30_8f0e_d3c4352b187e.slice/crio-0ca8c06f7298749a0d1714e99750e6829adf572397119e9fd289f86706dc4743 WatchSource:0}: Error finding container 0ca8c06f7298749a0d1714e99750e6829adf572397119e9fd289f86706dc4743: Status 404 returned error can't find the container with id 0ca8c06f7298749a0d1714e99750e6829adf572397119e9fd289f86706dc4743 Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.417720 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85"] Apr 06 12:15:18 crc kubenswrapper[4790]: W0406 12:15:18.425750 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30dc85a8_d293_4324_b4af_f3b7731a5060.slice/crio-de775dd3afd4a725a85f0c556363d102853939f9e10b76dbd3ff0efc8c23a3e2 WatchSource:0}: Error finding 
container de775dd3afd4a725a85f0c556363d102853939f9e10b76dbd3ff0efc8c23a3e2: Status 404 returned error can't find the container with id de775dd3afd4a725a85f0c556363d102853939f9e10b76dbd3ff0efc8c23a3e2 Apr 06 12:15:18 crc kubenswrapper[4790]: W0406 12:15:18.426171 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4596119_7d30_4307_8815_4355fc5ee6eb.slice/crio-8f1a01de75eb94580e1665b837c19ae5282636500d0bed37a9444ff84ec589bb WatchSource:0}: Error finding container 8f1a01de75eb94580e1665b837c19ae5282636500d0bed37a9444ff84ec589bb: Status 404 returned error can't find the container with id 8f1a01de75eb94580e1665b837c19ae5282636500d0bed37a9444ff84ec589bb Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.426806 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd"] Apr 06 12:15:18 crc kubenswrapper[4790]: W0406 12:15:18.433966 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56eefd7b_c275_40a3_8772_03ffc350736e.slice/crio-296b228e8468ab64b7a9173d400f9e008e6f8e95d91e4cd552dc154b5f1e5a5b WatchSource:0}: Error finding container 296b228e8468ab64b7a9173d400f9e008e6f8e95d91e4cd552dc154b5f1e5a5b: Status 404 returned error can't find the container with id 296b228e8468ab64b7a9173d400f9e008e6f8e95d91e4cd552dc154b5f1e5a5b Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.436588 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm"] Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.447958 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf"] Apr 06 12:15:18 crc kubenswrapper[4790]: E0406 12:15:18.448465 4790 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:7dc180b65038365b45bff4f2745bf00301b77da4338356e68caefadb2978f7d0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f5d6t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-765cb856bd-7vfjz_openstack-operators(b3d53b02-ceb5-46c3-820d-8a3b5c65f9e8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Apr 06 12:15:18 crc kubenswrapper[4790]: E0406 12:15:18.449440 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:b6d44a28b047f402b17b4cc07584f04cd6f1168d8742a9a8b17a9ce7c8550c5a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8fxzd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7594f57946-dlxrd_openstack-operators(6d9a1f3c-f00e-498c-ae3f-3af6c407d051): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Apr 06 12:15:18 crc kubenswrapper[4790]: E0406 12:15:18.449810 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz" podUID="b3d53b02-ceb5-46c3-820d-8a3b5c65f9e8" Apr 06 12:15:18 crc 
kubenswrapper[4790]: E0406 12:15:18.451016 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd" podUID="6d9a1f3c-f00e-498c-ae3f-3af6c407d051" Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.456939 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2"] Apr 06 12:15:18 crc kubenswrapper[4790]: W0406 12:15:18.462989 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96d39030_fa50_4568_a068_af079a592dc0.slice/crio-7ad72e877835177e35c99b1a84599d5bcdc67f460fdea7cd0c58ded8cb133b36 WatchSource:0}: Error finding container 7ad72e877835177e35c99b1a84599d5bcdc67f460fdea7cd0c58ded8cb133b36: Status 404 returned error can't find the container with id 7ad72e877835177e35c99b1a84599d5bcdc67f460fdea7cd0c58ded8cb133b36 Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.463079 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz"] Apr 06 12:15:18 crc kubenswrapper[4790]: E0406 12:15:18.470746 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:96eade4f229c073e64fb9ff9c5a8479c93078b1007469ac1ea7d8135e1d29946,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qsltw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-559d8fdb6b-9t6v2_openstack-operators(96d39030-fa50-4568-a068-af079a592dc0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Apr 06 12:15:18 crc kubenswrapper[4790]: E0406 12:15:18.472236 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2" podUID="96d39030-fa50-4568-a068-af079a592dc0" Apr 06 12:15:18 crc kubenswrapper[4790]: W0406 12:15:18.479906 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb09d720_75af_43a7_90dd_e497d2933183.slice/crio-998cfd21966ae6fe144e8c042c05efe767d355047cd2d7b4801208e32b278863 WatchSource:0}: Error finding container 998cfd21966ae6fe144e8c042c05efe767d355047cd2d7b4801208e32b278863: Status 404 returned error can't find the container with id 998cfd21966ae6fe144e8c042c05efe767d355047cd2d7b4801208e32b278863 Apr 06 12:15:18 crc kubenswrapper[4790]: E0406 12:15:18.490794 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q58mk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-d46gz_openstack-operators(bb09d720-75af-43a7-90dd-e497d2933183): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Apr 06 12:15:18 crc kubenswrapper[4790]: E0406 12:15:18.491988 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz" podUID="bb09d720-75af-43a7-90dd-e497d2933183" Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.781338 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" Apr 06 12:15:18 crc kubenswrapper[4790]: I0406 12:15:18.781388 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" Apr 06 12:15:18 crc kubenswrapper[4790]: E0406 12:15:18.781739 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Apr 06 12:15:18 crc kubenswrapper[4790]: E0406 12:15:18.781800 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs podName:61665396-3382-4fa4-8d6a-706f47b2c5b0 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:20.781782925 +0000 UTC m=+1099.769525791 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs") pod "openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" (UID: "61665396-3382-4fa4-8d6a-706f47b2c5b0") : secret "webhook-server-cert" not found Apr 06 12:15:18 crc kubenswrapper[4790]: E0406 12:15:18.782125 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Apr 06 12:15:18 crc kubenswrapper[4790]: E0406 12:15:18.782180 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs podName:61665396-3382-4fa4-8d6a-706f47b2c5b0 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:20.782165065 +0000 UTC m=+1099.769907921 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs") pod "openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" (UID: "61665396-3382-4fa4-8d6a-706f47b2c5b0") : secret "metrics-server-cert" not found Apr 06 12:15:19 crc kubenswrapper[4790]: I0406 12:15:19.010575 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2" event={"ID":"96d39030-fa50-4568-a068-af079a592dc0","Type":"ContainerStarted","Data":"7ad72e877835177e35c99b1a84599d5bcdc67f460fdea7cd0c58ded8cb133b36"} Apr 06 12:15:19 crc kubenswrapper[4790]: E0406 12:15:19.015625 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:96eade4f229c073e64fb9ff9c5a8479c93078b1007469ac1ea7d8135e1d29946\\\"\"" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2" podUID="96d39030-fa50-4568-a068-af079a592dc0" Apr 06 12:15:19 crc 
kubenswrapper[4790]: I0406 12:15:19.031505 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz" event={"ID":"bb09d720-75af-43a7-90dd-e497d2933183","Type":"ContainerStarted","Data":"998cfd21966ae6fe144e8c042c05efe767d355047cd2d7b4801208e32b278863"} Apr 06 12:15:19 crc kubenswrapper[4790]: E0406 12:15:19.033017 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz" podUID="bb09d720-75af-43a7-90dd-e497d2933183" Apr 06 12:15:19 crc kubenswrapper[4790]: I0406 12:15:19.042390 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-78674bbc6b-48jqq" event={"ID":"5878c1d4-78cf-447f-b442-f7a9aa1aee99","Type":"ContainerStarted","Data":"a4122b55b6415db7a78e144e96168d66c3b3ebb28e826daf74ebafa41b7b52f8"} Apr 06 12:15:19 crc kubenswrapper[4790]: I0406 12:15:19.054937 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm" event={"ID":"d4596119-7d30-4307-8815-4355fc5ee6eb","Type":"ContainerStarted","Data":"8f1a01de75eb94580e1665b837c19ae5282636500d0bed37a9444ff84ec589bb"} Apr 06 12:15:19 crc kubenswrapper[4790]: I0406 12:15:19.059627 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85" event={"ID":"56eefd7b-c275-40a3-8772-03ffc350736e","Type":"ContainerStarted","Data":"296b228e8468ab64b7a9173d400f9e008e6f8e95d91e4cd552dc154b5f1e5a5b"} Apr 06 12:15:19 crc kubenswrapper[4790]: I0406 12:15:19.062069 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-58689c6fff-tm8b2" event={"ID":"fb176575-b24c-4da4-a0f7-c5117d2c2ed7","Type":"ContainerStarted","Data":"ab873a85a0bcb801cdc0fb1c22dc7a2b1b7c52b070f9733d946cf81039c2cd89"} Apr 06 12:15:19 crc kubenswrapper[4790]: I0406 12:15:19.066397 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd" event={"ID":"6d9a1f3c-f00e-498c-ae3f-3af6c407d051","Type":"ContainerStarted","Data":"ae5933e2f2b8ccbff6621b375cb3c15c7970174c631807dd6a4d1cea2bd9f3b5"} Apr 06 12:15:19 crc kubenswrapper[4790]: I0406 12:15:19.069843 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5c4dd9cdf6-kjg4j" event={"ID":"c9e8a19f-ad0a-45ed-a45f-240d2e5d187b","Type":"ContainerStarted","Data":"ff0b8ee6bde3bdc613011805f38ac3866cae83c2fd294007cdc05f7b015df795"} Apr 06 12:15:19 crc kubenswrapper[4790]: E0406 12:15:19.070685 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:b6d44a28b047f402b17b4cc07584f04cd6f1168d8742a9a8b17a9ce7c8550c5a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd" podUID="6d9a1f3c-f00e-498c-ae3f-3af6c407d051" Apr 06 12:15:19 crc kubenswrapper[4790]: I0406 12:15:19.072293 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz" event={"ID":"b3d53b02-ceb5-46c3-820d-8a3b5c65f9e8","Type":"ContainerStarted","Data":"be4f9879d3d143b10832499f7b2c933b60cd1b697814992127237fa76a8745fb"} Apr 06 12:15:19 crc kubenswrapper[4790]: E0406 12:15:19.073526 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:7dc180b65038365b45bff4f2745bf00301b77da4338356e68caefadb2978f7d0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz" podUID="b3d53b02-ceb5-46c3-820d-8a3b5c65f9e8" Apr 06 12:15:19 crc kubenswrapper[4790]: I0406 12:15:19.076814 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-6vkf8" event={"ID":"30132a58-4c7d-4761-b73b-6d0ee27ea74e","Type":"ContainerStarted","Data":"9e32cd86773c1063f9a7164d04acaaf0769ccf562b357f91ac9bd2f0146bb0b7"} Apr 06 12:15:19 crc kubenswrapper[4790]: I0406 12:15:19.078798 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-565fbbfdc9-msh7n" event={"ID":"30dc85a8-d293-4324-b4af-f3b7731a5060","Type":"ContainerStarted","Data":"de775dd3afd4a725a85f0c556363d102853939f9e10b76dbd3ff0efc8c23a3e2"} Apr 06 12:15:19 crc kubenswrapper[4790]: I0406 12:15:19.089010 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf" event={"ID":"7a8fad23-b18e-4933-af57-3e06aee00225","Type":"ContainerStarted","Data":"db6883d004858473c908a263f4989ec67bf8d6ae16cd355f1234f89ee794dd21"} Apr 06 12:15:19 crc kubenswrapper[4790]: I0406 12:15:19.097998 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-9bdbb8fd8-r64xk" event={"ID":"82a102f6-1fb6-4f30-8f0e-d3c4352b187e","Type":"ContainerStarted","Data":"0ca8c06f7298749a0d1714e99750e6829adf572397119e9fd289f86706dc4743"} Apr 06 12:15:20 crc kubenswrapper[4790]: I0406 12:15:20.001817 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert\") pod \"infra-operator-controller-manager-88ccbfc66-9pp57\" (UID: \"9430e8f1-17ec-4eff-8d9c-d54553956f8d\") " 
pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57" Apr 06 12:15:20 crc kubenswrapper[4790]: E0406 12:15:20.002115 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Apr 06 12:15:20 crc kubenswrapper[4790]: E0406 12:15:20.002301 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert podName:9430e8f1-17ec-4eff-8d9c-d54553956f8d nodeName:}" failed. No retries permitted until 2026-04-06 12:15:24.002280671 +0000 UTC m=+1102.990023537 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert") pod "infra-operator-controller-manager-88ccbfc66-9pp57" (UID: "9430e8f1-17ec-4eff-8d9c-d54553956f8d") : secret "infra-operator-webhook-server-cert" not found Apr 06 12:15:20 crc kubenswrapper[4790]: E0406 12:15:20.111360 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:96eade4f229c073e64fb9ff9c5a8479c93078b1007469ac1ea7d8135e1d29946\\\"\"" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2" podUID="96d39030-fa50-4568-a068-af079a592dc0" Apr 06 12:15:20 crc kubenswrapper[4790]: E0406 12:15:20.112093 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz" podUID="bb09d720-75af-43a7-90dd-e497d2933183" Apr 06 12:15:20 crc kubenswrapper[4790]: E0406 12:15:20.112450 4790 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:7dc180b65038365b45bff4f2745bf00301b77da4338356e68caefadb2978f7d0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz" podUID="b3d53b02-ceb5-46c3-820d-8a3b5c65f9e8" Apr 06 12:15:20 crc kubenswrapper[4790]: E0406 12:15:20.112849 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:b6d44a28b047f402b17b4cc07584f04cd6f1168d8742a9a8b17a9ce7c8550c5a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd" podUID="6d9a1f3c-f00e-498c-ae3f-3af6c407d051" Apr 06 12:15:20 crc kubenswrapper[4790]: I0406 12:15:20.306585 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert\") pod \"openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw\" (UID: \"f9c22535-24c4-416f-98ef-fcd0299921c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" Apr 06 12:15:20 crc kubenswrapper[4790]: E0406 12:15:20.306863 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Apr 06 12:15:20 crc kubenswrapper[4790]: E0406 12:15:20.306969 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert podName:f9c22535-24c4-416f-98ef-fcd0299921c4 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:24.306949141 +0000 UTC m=+1103.294692017 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert") pod "openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" (UID: "f9c22535-24c4-416f-98ef-fcd0299921c4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Apr 06 12:15:20 crc kubenswrapper[4790]: I0406 12:15:20.818787 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" Apr 06 12:15:20 crc kubenswrapper[4790]: I0406 12:15:20.818866 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" Apr 06 12:15:20 crc kubenswrapper[4790]: E0406 12:15:20.819006 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Apr 06 12:15:20 crc kubenswrapper[4790]: E0406 12:15:20.819065 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs podName:61665396-3382-4fa4-8d6a-706f47b2c5b0 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:24.819050177 +0000 UTC m=+1103.806793043 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs") pod "openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" (UID: "61665396-3382-4fa4-8d6a-706f47b2c5b0") : secret "webhook-server-cert" not found Apr 06 12:15:20 crc kubenswrapper[4790]: E0406 12:15:20.819394 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Apr 06 12:15:20 crc kubenswrapper[4790]: E0406 12:15:20.819438 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs podName:61665396-3382-4fa4-8d6a-706f47b2c5b0 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:24.819429138 +0000 UTC m=+1103.807172004 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs") pod "openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" (UID: "61665396-3382-4fa4-8d6a-706f47b2c5b0") : secret "metrics-server-cert" not found Apr 06 12:15:24 crc kubenswrapper[4790]: I0406 12:15:24.069934 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert\") pod \"infra-operator-controller-manager-88ccbfc66-9pp57\" (UID: \"9430e8f1-17ec-4eff-8d9c-d54553956f8d\") " pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57" Apr 06 12:15:24 crc kubenswrapper[4790]: E0406 12:15:24.070107 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Apr 06 12:15:24 crc kubenswrapper[4790]: E0406 12:15:24.070414 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert 
podName:9430e8f1-17ec-4eff-8d9c-d54553956f8d nodeName:}" failed. No retries permitted until 2026-04-06 12:15:32.07039247 +0000 UTC m=+1111.058135346 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert") pod "infra-operator-controller-manager-88ccbfc66-9pp57" (UID: "9430e8f1-17ec-4eff-8d9c-d54553956f8d") : secret "infra-operator-webhook-server-cert" not found Apr 06 12:15:24 crc kubenswrapper[4790]: I0406 12:15:24.374480 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert\") pod \"openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw\" (UID: \"f9c22535-24c4-416f-98ef-fcd0299921c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" Apr 06 12:15:24 crc kubenswrapper[4790]: E0406 12:15:24.374643 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Apr 06 12:15:24 crc kubenswrapper[4790]: E0406 12:15:24.374686 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert podName:f9c22535-24c4-416f-98ef-fcd0299921c4 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:32.374674184 +0000 UTC m=+1111.362417050 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert") pod "openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" (UID: "f9c22535-24c4-416f-98ef-fcd0299921c4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Apr 06 12:15:24 crc kubenswrapper[4790]: I0406 12:15:24.883489 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" Apr 06 12:15:24 crc kubenswrapper[4790]: I0406 12:15:24.883546 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" Apr 06 12:15:24 crc kubenswrapper[4790]: E0406 12:15:24.883687 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Apr 06 12:15:24 crc kubenswrapper[4790]: E0406 12:15:24.883701 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Apr 06 12:15:24 crc kubenswrapper[4790]: E0406 12:15:24.883757 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs podName:61665396-3382-4fa4-8d6a-706f47b2c5b0 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:32.883737546 +0000 UTC m=+1111.871480412 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs") pod "openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" (UID: "61665396-3382-4fa4-8d6a-706f47b2c5b0") : secret "webhook-server-cert" not found Apr 06 12:15:24 crc kubenswrapper[4790]: E0406 12:15:24.883800 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs podName:61665396-3382-4fa4-8d6a-706f47b2c5b0 nodeName:}" failed. No retries permitted until 2026-04-06 12:15:32.883775687 +0000 UTC m=+1111.871518623 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs") pod "openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" (UID: "61665396-3382-4fa4-8d6a-706f47b2c5b0") : secret "metrics-server-cert" not found Apr 06 12:15:30 crc kubenswrapper[4790]: E0406 12:15:30.177894 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:a456dad72589b5fc79badb0ebeafc8287884bb157e2d160ffbd40cb3ed124e40" Apr 06 12:15:30 crc kubenswrapper[4790]: E0406 12:15:30.178588 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:a456dad72589b5fc79badb0ebeafc8287884bb157e2d160ffbd40cb3ed124e40,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7tdmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-569745d4d8-ddglf_openstack-operators(7a8fad23-b18e-4933-af57-3e06aee00225): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 06 12:15:30 crc kubenswrapper[4790]: E0406 12:15:30.179750 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf" podUID="7a8fad23-b18e-4933-af57-3e06aee00225" Apr 06 12:15:30 crc kubenswrapper[4790]: E0406 12:15:30.192127 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:a456dad72589b5fc79badb0ebeafc8287884bb157e2d160ffbd40cb3ed124e40\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf" podUID="7a8fad23-b18e-4933-af57-3e06aee00225" Apr 06 12:15:30 crc kubenswrapper[4790]: E0406 12:15:30.752410 4790 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:0cdbbab5ebee13d8a2c0a27e72811736875f48659121adae9fef3299b4a7b709" Apr 06 12:15:30 crc kubenswrapper[4790]: E0406 12:15:30.752741 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0cdbbab5ebee13d8a2c0a27e72811736875f48659121adae9fef3299b4a7b709,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j78mq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56bf57d759-dq7bm_openstack-operators(d4596119-7d30-4307-8815-4355fc5ee6eb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 06 12:15:30 crc kubenswrapper[4790]: E0406 12:15:30.753990 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm" podUID="d4596119-7d30-4307-8815-4355fc5ee6eb" Apr 06 12:15:31 crc kubenswrapper[4790]: E0406 12:15:31.199234 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0cdbbab5ebee13d8a2c0a27e72811736875f48659121adae9fef3299b4a7b709\\\"\"" pod="openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm" podUID="d4596119-7d30-4307-8815-4355fc5ee6eb" Apr 06 12:15:31 crc kubenswrapper[4790]: E0406 12:15:31.490389 4790 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:823d909a1b2a9dc1878805b14a6067bcacc70f3edd741ec6cd887ae389d2f2c3" Apr 06 12:15:31 crc kubenswrapper[4790]: E0406 12:15:31.490542 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:823d909a1b2a9dc1878805b14a6067bcacc70f3edd741ec6cd887ae389d2f2c3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jl5lm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6b9c989bb6-z8t6s_openstack-operators(75ce52d5-3320-40e1-8d64-42d12e2fa4c8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 06 12:15:31 crc kubenswrapper[4790]: E0406 12:15:31.491851 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s" podUID="75ce52d5-3320-40e1-8d64-42d12e2fa4c8" Apr 06 12:15:32 crc kubenswrapper[4790]: I0406 12:15:32.093725 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert\") pod \"infra-operator-controller-manager-88ccbfc66-9pp57\" (UID: \"9430e8f1-17ec-4eff-8d9c-d54553956f8d\") " pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57" Apr 06 12:15:32 crc kubenswrapper[4790]: I0406 12:15:32.100279 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/9430e8f1-17ec-4eff-8d9c-d54553956f8d-cert\") pod \"infra-operator-controller-manager-88ccbfc66-9pp57\" (UID: \"9430e8f1-17ec-4eff-8d9c-d54553956f8d\") " pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57" Apr 06 12:15:32 crc kubenswrapper[4790]: I0406 12:15:32.130710 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57" Apr 06 12:15:32 crc kubenswrapper[4790]: E0406 12:15:32.133324 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:c8da57bc36a20e2052e708ab23a28a47d066ecab46e556afb5e4d238ccecd414" Apr 06 12:15:32 crc kubenswrapper[4790]: E0406 12:15:32.133500 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:c8da57bc36a20e2052e708ab23a28a47d066ecab46e556afb5e4d238ccecd414,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8qhqp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-64744474b-lxkq7_openstack-operators(2f59b367-b3b9-467b-b190-5492ec84d98c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 06 12:15:32 crc kubenswrapper[4790]: E0406 12:15:32.134677 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-64744474b-lxkq7" podUID="2f59b367-b3b9-467b-b190-5492ec84d98c" Apr 06 12:15:32 crc kubenswrapper[4790]: E0406 12:15:32.199470 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.94:5001/openstack-k8s-operators/watcher-operator:9ae53cce0267b4096d64955b97bc1ba380834405" Apr 06 12:15:32 crc kubenswrapper[4790]: E0406 12:15:32.199535 4790 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.94:5001/openstack-k8s-operators/watcher-operator:9ae53cce0267b4096d64955b97bc1ba380834405" Apr 06 12:15:32 crc kubenswrapper[4790]: E0406 12:15:32.199693 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.94:5001/openstack-k8s-operators/watcher-operator:9ae53cce0267b4096d64955b97bc1ba380834405,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-78cjd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-75df5978c-vvf85_openstack-operators(56eefd7b-c275-40a3-8772-03ffc350736e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 06 12:15:32 crc kubenswrapper[4790]: E0406 12:15:32.201140 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85" podUID="56eefd7b-c275-40a3-8772-03ffc350736e" Apr 06 12:15:32 crc kubenswrapper[4790]: E0406 12:15:32.221591 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:c8da57bc36a20e2052e708ab23a28a47d066ecab46e556afb5e4d238ccecd414\\\"\"" pod="openstack-operators/nova-operator-controller-manager-64744474b-lxkq7" podUID="2f59b367-b3b9-467b-b190-5492ec84d98c" Apr 06 12:15:32 crc kubenswrapper[4790]: E0406 12:15:32.221638 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:823d909a1b2a9dc1878805b14a6067bcacc70f3edd741ec6cd887ae389d2f2c3\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s" podUID="75ce52d5-3320-40e1-8d64-42d12e2fa4c8" Apr 06 12:15:32 crc kubenswrapper[4790]: I0406 12:15:32.397414 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert\") pod \"openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw\" (UID: \"f9c22535-24c4-416f-98ef-fcd0299921c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" Apr 06 12:15:32 crc kubenswrapper[4790]: I0406 12:15:32.402806 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c22535-24c4-416f-98ef-fcd0299921c4-cert\") pod \"openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw\" (UID: \"f9c22535-24c4-416f-98ef-fcd0299921c4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" Apr 06 12:15:32 crc kubenswrapper[4790]: I0406 12:15:32.546031 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" Apr 06 12:15:32 crc kubenswrapper[4790]: I0406 12:15:32.905147 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" Apr 06 12:15:32 crc kubenswrapper[4790]: I0406 12:15:32.905634 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" Apr 06 12:15:32 crc kubenswrapper[4790]: I0406 12:15:32.914568 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-webhook-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" Apr 06 12:15:32 crc kubenswrapper[4790]: I0406 12:15:32.918518 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/61665396-3382-4fa4-8d6a-706f47b2c5b0-metrics-certs\") pod \"openstack-operator-controller-manager-5d8c8cd5bb-5w2d2\" (UID: \"61665396-3382-4fa4-8d6a-706f47b2c5b0\") " pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" Apr 06 12:15:32 crc kubenswrapper[4790]: I0406 12:15:32.983413 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.096142 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw"] Apr 06 12:15:33 crc kubenswrapper[4790]: W0406 12:15:33.118336 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9c22535_24c4_416f_98ef_fcd0299921c4.slice/crio-e8d36d3b5be87a9ddf7a5a1312252af62a9baebe0fa39e1df3c54163dde06904 WatchSource:0}: Error finding container e8d36d3b5be87a9ddf7a5a1312252af62a9baebe0fa39e1df3c54163dde06904: Status 404 returned error can't find the container with id e8d36d3b5be87a9ddf7a5a1312252af62a9baebe0fa39e1df3c54163dde06904 Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.218255 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" event={"ID":"f9c22535-24c4-416f-98ef-fcd0299921c4","Type":"ContainerStarted","Data":"e8d36d3b5be87a9ddf7a5a1312252af62a9baebe0fa39e1df3c54163dde06904"} Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.219414 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6c5d8948dc-288vm" event={"ID":"c2958357-3518-4e74-8326-cfe8cf23334f","Type":"ContainerStarted","Data":"edd353a83195d4c716d2c7d018a68d2ae6cf961e39a7ad538a13de20b6dd7cba"} Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.220196 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6c5d8948dc-288vm" Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.220345 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57"] Apr 06 12:15:33 crc 
kubenswrapper[4790]: I0406 12:15:33.278353 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8566787df9-l8dhs" event={"ID":"16027ea9-802c-43ef-80ac-e2f66a2cc36b","Type":"ContainerStarted","Data":"9ebc9206c889f8c8b19d3d5329142bd5486425ba1072ba0778f6300ef4bb5551"} Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.278797 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8566787df9-l8dhs" Apr 06 12:15:33 crc kubenswrapper[4790]: W0406 12:15:33.278859 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9430e8f1_17ec_4eff_8d9c_d54553956f8d.slice/crio-229dec2009870b9b84a24c03f87fb2c959a214c8d3d4d72a5790693844787388 WatchSource:0}: Error finding container 229dec2009870b9b84a24c03f87fb2c959a214c8d3d4d72a5790693844787388: Status 404 returned error can't find the container with id 229dec2009870b9b84a24c03f87fb2c959a214c8d3d4d72a5790693844787388 Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.280360 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-9bdbb8fd8-r64xk" event={"ID":"82a102f6-1fb6-4f30-8f0e-d3c4352b187e","Type":"ContainerStarted","Data":"b2521017c5ecd2719201def4b9b0578fd74f2bc5c1f9552effae6427f01caa1f"} Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.280668 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-9bdbb8fd8-r64xk" Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.290588 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6c5d8948dc-288vm" podStartSLOduration=2.473506802 podStartE2EDuration="17.290572633s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" 
firstStartedPulling="2026-04-06 12:15:17.359281398 +0000 UTC m=+1096.347024264" lastFinishedPulling="2026-04-06 12:15:32.176347229 +0000 UTC m=+1111.164090095" observedRunningTime="2026-04-06 12:15:33.288398597 +0000 UTC m=+1112.276141463" watchObservedRunningTime="2026-04-06 12:15:33.290572633 +0000 UTC m=+1112.278315499" Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.295227 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bcc684c66-wv5tw" event={"ID":"95b18d6e-ec5a-45e7-89c0-0f4618e4eb97","Type":"ContainerStarted","Data":"4cfb5df462bcdfe43106b97197568b5e9c8e573f263fb3c59c5dcfb9ef503adb"} Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.298409 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5bcc684c66-wv5tw" Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.307460 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5fddf8d98f-qrxjw" event={"ID":"ef128b3d-ea70-46cc-8928-0a557b6fbf5d","Type":"ContainerStarted","Data":"a20e76bb6eceebcdc34068ad612ae3309ffd65ea104f21818019c25b7f42b5b7"} Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.307575 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5fddf8d98f-qrxjw" Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.317595 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-9bdbb8fd8-r64xk" podStartSLOduration=3.017606568 podStartE2EDuration="17.317580098s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:18.417224768 +0000 UTC m=+1097.404967634" lastFinishedPulling="2026-04-06 12:15:32.717198298 +0000 UTC m=+1111.704941164" observedRunningTime="2026-04-06 12:15:33.314839089 +0000 
UTC m=+1112.302581965" watchObservedRunningTime="2026-04-06 12:15:33.317580098 +0000 UTC m=+1112.305322964" Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.349890 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5b5d8f8697-hwvv8" Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.350260 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8566787df9-l8dhs" podStartSLOduration=2.030447892 podStartE2EDuration="17.350238438s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:17.396758006 +0000 UTC m=+1096.384500872" lastFinishedPulling="2026-04-06 12:15:32.716548552 +0000 UTC m=+1111.704291418" observedRunningTime="2026-04-06 12:15:33.341891006 +0000 UTC m=+1112.329633882" watchObservedRunningTime="2026-04-06 12:15:33.350238438 +0000 UTC m=+1112.337981304" Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.381360 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5bcc684c66-wv5tw" podStartSLOduration=2.375571518 podStartE2EDuration="17.381342307s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:17.712467556 +0000 UTC m=+1096.700210422" lastFinishedPulling="2026-04-06 12:15:32.718238345 +0000 UTC m=+1111.705981211" observedRunningTime="2026-04-06 12:15:33.367532547 +0000 UTC m=+1112.355275413" watchObservedRunningTime="2026-04-06 12:15:33.381342307 +0000 UTC m=+1112.369085173" Apr 06 12:15:33 crc kubenswrapper[4790]: E0406 12:15:33.387017 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.94:5001/openstack-k8s-operators/watcher-operator:9ae53cce0267b4096d64955b97bc1ba380834405\\\"\"" 
pod="openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85" podUID="56eefd7b-c275-40a3-8772-03ffc350736e" Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.403735 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5fddf8d98f-qrxjw" podStartSLOduration=2.046273076 podStartE2EDuration="17.403717345s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:17.359134614 +0000 UTC m=+1096.346877480" lastFinishedPulling="2026-04-06 12:15:32.716578883 +0000 UTC m=+1111.704321749" observedRunningTime="2026-04-06 12:15:33.403429438 +0000 UTC m=+1112.391172304" watchObservedRunningTime="2026-04-06 12:15:33.403717345 +0000 UTC m=+1112.391460211" Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.449162 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5b5d8f8697-hwvv8" podStartSLOduration=1.874209988 podStartE2EDuration="17.449148368s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:17.141636793 +0000 UTC m=+1096.129379649" lastFinishedPulling="2026-04-06 12:15:32.716575163 +0000 UTC m=+1111.704318029" observedRunningTime="2026-04-06 12:15:33.447193179 +0000 UTC m=+1112.434936045" watchObservedRunningTime="2026-04-06 12:15:33.449148368 +0000 UTC m=+1112.436891234" Apr 06 12:15:33 crc kubenswrapper[4790]: I0406 12:15:33.735823 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2"] Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.401881 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57" event={"ID":"9430e8f1-17ec-4eff-8d9c-d54553956f8d","Type":"ContainerStarted","Data":"229dec2009870b9b84a24c03f87fb2c959a214c8d3d4d72a5790693844787388"} Apr 06 
12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.404519 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b5d8f8697-hwvv8" event={"ID":"a8b6d51d-4671-471f-94c6-b0a4b2c4a27d","Type":"ContainerStarted","Data":"6009f502446812f5f6ea4358a01de9ef6d78f3d85d8688bd80bebea250aeba3e"} Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.413561 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5c4dd9cdf6-kjg4j" event={"ID":"c9e8a19f-ad0a-45ed-a45f-240d2e5d187b","Type":"ContainerStarted","Data":"c6716889a36adf575f83043be6c3213fe6bbc48b4e3edf51dfbb50deda06232a"} Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.414392 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5c4dd9cdf6-kjg4j" Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.416248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-78674bbc6b-48jqq" event={"ID":"5878c1d4-78cf-447f-b442-f7a9aa1aee99","Type":"ContainerStarted","Data":"eabd2ec37c41a78b3764ed651bcb6a82f9fde44222428c74a59cf633c4ee45f5"} Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.416650 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-78674bbc6b-48jqq" Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.418054 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" event={"ID":"61665396-3382-4fa4-8d6a-706f47b2c5b0","Type":"ContainerStarted","Data":"4cb2497a5d3d9989c547dceed33a28add78a3394673221f6ff5478703acdca61"} Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.418080 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" event={"ID":"61665396-3382-4fa4-8d6a-706f47b2c5b0","Type":"ContainerStarted","Data":"236f94441f775f41f224f5dde5bb91f3abb09c88a3d95f554caf129c272d8932"} Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.418404 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.424294 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-6vkf8" event={"ID":"30132a58-4c7d-4761-b73b-6d0ee27ea74e","Type":"ContainerStarted","Data":"305471fde4781565c7ab6e58cd146b172286ea37f009217ef89640217122d0d6"} Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.424817 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-6vkf8" Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.426692 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58689c6fff-tm8b2" event={"ID":"fb176575-b24c-4da4-a0f7-c5117d2c2ed7","Type":"ContainerStarted","Data":"e385065832011e49bbc51d435dc5d10c38dca23e79baa69dc056984533a7c66c"} Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.426843 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-58689c6fff-tm8b2" Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.434607 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5c4dd9cdf6-kjg4j" podStartSLOduration=4.058336069 podStartE2EDuration="18.434595612s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:18.397283806 +0000 UTC m=+1097.385026672" 
lastFinishedPulling="2026-04-06 12:15:32.773543339 +0000 UTC m=+1111.761286215" observedRunningTime="2026-04-06 12:15:34.432791396 +0000 UTC m=+1113.420534252" watchObservedRunningTime="2026-04-06 12:15:34.434595612 +0000 UTC m=+1113.422338478" Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.448013 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-565fbbfdc9-msh7n" event={"ID":"30dc85a8-d293-4324-b4af-f3b7731a5060","Type":"ContainerStarted","Data":"d6b9ae205c0d3863962e921506caec69367ddcf557cd220d9c8508c7e80da23d"} Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.448690 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-565fbbfdc9-msh7n" Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.473881 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz" event={"ID":"b3d53b02-ceb5-46c3-820d-8a3b5c65f9e8","Type":"ContainerStarted","Data":"433e552e90f84069f3b41c9f52fa57e658989e63d4857a8f2cdf88b0e11c76b0"} Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.491955 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" podStartSLOduration=18.491933597 podStartE2EDuration="18.491933597s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:15:34.487150476 +0000 UTC m=+1113.474893342" watchObservedRunningTime="2026-04-06 12:15:34.491933597 +0000 UTC m=+1113.479676463" Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.510160 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-78674bbc6b-48jqq" podStartSLOduration=4.167692562 
podStartE2EDuration="18.510140419s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:18.413087865 +0000 UTC m=+1097.400830731" lastFinishedPulling="2026-04-06 12:15:32.755535712 +0000 UTC m=+1111.743278588" observedRunningTime="2026-04-06 12:15:34.507215655 +0000 UTC m=+1113.494958541" watchObservedRunningTime="2026-04-06 12:15:34.510140419 +0000 UTC m=+1113.497883285" Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.537351 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-58689c6fff-tm8b2" podStartSLOduration=4.3912845130000004 podStartE2EDuration="18.53733126s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:18.030117048 +0000 UTC m=+1097.017859914" lastFinishedPulling="2026-04-06 12:15:32.176163795 +0000 UTC m=+1111.163906661" observedRunningTime="2026-04-06 12:15:34.527257664 +0000 UTC m=+1113.515000530" watchObservedRunningTime="2026-04-06 12:15:34.53733126 +0000 UTC m=+1113.525074126" Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.568572 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-6vkf8" podStartSLOduration=3.778641448 podStartE2EDuration="18.568545172s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:17.983608274 +0000 UTC m=+1096.971351140" lastFinishedPulling="2026-04-06 12:15:32.773511998 +0000 UTC m=+1111.761254864" observedRunningTime="2026-04-06 12:15:34.555095321 +0000 UTC m=+1113.542838187" watchObservedRunningTime="2026-04-06 12:15:34.568545172 +0000 UTC m=+1113.556288038" Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.586069 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-565fbbfdc9-msh7n" podStartSLOduration=4.266888991 
podStartE2EDuration="18.586052996s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:18.435292199 +0000 UTC m=+1097.423035065" lastFinishedPulling="2026-04-06 12:15:32.754456204 +0000 UTC m=+1111.742199070" observedRunningTime="2026-04-06 12:15:34.581689696 +0000 UTC m=+1113.569432562" watchObservedRunningTime="2026-04-06 12:15:34.586052996 +0000 UTC m=+1113.573795862" Apr 06 12:15:34 crc kubenswrapper[4790]: I0406 12:15:34.601267 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz" podStartSLOduration=4.262897572 podStartE2EDuration="18.601251712s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:18.448263261 +0000 UTC m=+1097.436006127" lastFinishedPulling="2026-04-06 12:15:32.786617401 +0000 UTC m=+1111.774360267" observedRunningTime="2026-04-06 12:15:34.599176789 +0000 UTC m=+1113.586919665" watchObservedRunningTime="2026-04-06 12:15:34.601251712 +0000 UTC m=+1113.588994578" Apr 06 12:15:36 crc kubenswrapper[4790]: I0406 12:15:36.752910 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz" Apr 06 12:15:38 crc kubenswrapper[4790]: I0406 12:15:38.533235 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2" event={"ID":"96d39030-fa50-4568-a068-af079a592dc0","Type":"ContainerStarted","Data":"b5fb5f09ff492f07e88b2c1e906a20d9ed87162d846a0bf91a131ab0d45dce93"} Apr 06 12:15:38 crc kubenswrapper[4790]: I0406 12:15:38.533812 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2" Apr 06 12:15:38 crc kubenswrapper[4790]: I0406 12:15:38.535221 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57" event={"ID":"9430e8f1-17ec-4eff-8d9c-d54553956f8d","Type":"ContainerStarted","Data":"8ad3b686f95d566877fab1c6363ef189245c85f6a5814a939cb5de49e895abdf"} Apr 06 12:15:38 crc kubenswrapper[4790]: I0406 12:15:38.535290 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57" Apr 06 12:15:38 crc kubenswrapper[4790]: I0406 12:15:38.536885 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" event={"ID":"f9c22535-24c4-416f-98ef-fcd0299921c4","Type":"ContainerStarted","Data":"add356ce11f47a3183d2fe92bdf8dae12117957ef6665769297c194bafb2145c"} Apr 06 12:15:38 crc kubenswrapper[4790]: I0406 12:15:38.537172 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" Apr 06 12:15:38 crc kubenswrapper[4790]: I0406 12:15:38.562097 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2" podStartSLOduration=3.652614422 podStartE2EDuration="22.562079153s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:18.470554017 +0000 UTC m=+1097.458296883" lastFinishedPulling="2026-04-06 12:15:37.380018748 +0000 UTC m=+1116.367761614" observedRunningTime="2026-04-06 12:15:38.552666554 +0000 UTC m=+1117.540409440" watchObservedRunningTime="2026-04-06 12:15:38.562079153 +0000 UTC m=+1117.549822019" Apr 06 12:15:38 crc kubenswrapper[4790]: I0406 12:15:38.589750 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57" podStartSLOduration=18.508466586 podStartE2EDuration="22.589722664s" podCreationTimestamp="2026-04-06 
12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:33.306027165 +0000 UTC m=+1112.293770031" lastFinishedPulling="2026-04-06 12:15:37.387283243 +0000 UTC m=+1116.375026109" observedRunningTime="2026-04-06 12:15:38.57458334 +0000 UTC m=+1117.562326216" watchObservedRunningTime="2026-04-06 12:15:38.589722664 +0000 UTC m=+1117.577465530" Apr 06 12:15:38 crc kubenswrapper[4790]: I0406 12:15:38.605007 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" podStartSLOduration=18.321335437 podStartE2EDuration="22.604989802s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:33.121499741 +0000 UTC m=+1112.109242607" lastFinishedPulling="2026-04-06 12:15:37.405154106 +0000 UTC m=+1116.392896972" observedRunningTime="2026-04-06 12:15:38.604735785 +0000 UTC m=+1117.592478651" watchObservedRunningTime="2026-04-06 12:15:38.604989802 +0000 UTC m=+1117.592732668" Apr 06 12:15:42 crc kubenswrapper[4790]: I0406 12:15:42.136319 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-88ccbfc66-9pp57" Apr 06 12:15:42 crc kubenswrapper[4790]: I0406 12:15:42.553137 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw" Apr 06 12:15:42 crc kubenswrapper[4790]: I0406 12:15:42.989993 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5d8c8cd5bb-5w2d2" Apr 06 12:15:46 crc kubenswrapper[4790]: I0406 12:15:46.486478 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8566787df9-l8dhs" Apr 06 12:15:46 crc kubenswrapper[4790]: I0406 12:15:46.488627 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5b5d8f8697-hwvv8" Apr 06 12:15:46 crc kubenswrapper[4790]: I0406 12:15:46.504589 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6c5d8948dc-288vm" Apr 06 12:15:46 crc kubenswrapper[4790]: I0406 12:15:46.652138 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5bcc684c66-wv5tw" Apr 06 12:15:46 crc kubenswrapper[4790]: I0406 12:15:46.672928 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-dbf8bb784-6vkf8" Apr 06 12:15:46 crc kubenswrapper[4790]: I0406 12:15:46.674186 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-78674bbc6b-48jqq" Apr 06 12:15:46 crc kubenswrapper[4790]: I0406 12:15:46.674841 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5fddf8d98f-qrxjw" Apr 06 12:15:46 crc kubenswrapper[4790]: I0406 12:15:46.677122 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-58689c6fff-tm8b2" Apr 06 12:15:46 crc kubenswrapper[4790]: I0406 12:15:46.765177 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-765cb856bd-7vfjz" Apr 06 12:15:46 crc kubenswrapper[4790]: I0406 12:15:46.879937 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-9bdbb8fd8-r64xk" Apr 06 12:15:46 crc kubenswrapper[4790]: I0406 12:15:46.958846 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-565fbbfdc9-msh7n" Apr 06 12:15:46 crc kubenswrapper[4790]: I0406 12:15:46.974750 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-559d8fdb6b-9t6v2" Apr 06 12:15:47 crc kubenswrapper[4790]: I0406 12:15:47.018571 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5c4dd9cdf6-kjg4j" Apr 06 12:15:49 crc kubenswrapper[4790]: E0406 12:15:49.394084 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Apr 06 12:15:49 crc kubenswrapper[4790]: E0406 12:15:49.394500 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q58mk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-d46gz_openstack-operators(bb09d720-75af-43a7-90dd-e497d2933183): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 06 12:15:49 crc kubenswrapper[4790]: E0406 12:15:49.395697 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz" podUID="bb09d720-75af-43a7-90dd-e497d2933183" Apr 06 12:15:49 crc kubenswrapper[4790]: I0406 12:15:49.631519 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd" event={"ID":"6d9a1f3c-f00e-498c-ae3f-3af6c407d051","Type":"ContainerStarted","Data":"b4624580bd73df6b3859508dfacbba72ffa521416b4d628ddd85d17d526f87ae"} Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.640066 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf" event={"ID":"7a8fad23-b18e-4933-af57-3e06aee00225","Type":"ContainerStarted","Data":"fd33573db5b39cbcb403d8d41f9da1a6579134472d2d014a3c57d508f6b1310f"} Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.640553 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf" Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.642303 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64744474b-lxkq7" event={"ID":"2f59b367-b3b9-467b-b190-5492ec84d98c","Type":"ContainerStarted","Data":"0413fb8cb45f449c0f486f1defe0e05df46502697c0dead8962ff40895165b40"} Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.642655 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-64744474b-lxkq7" Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.644471 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s" event={"ID":"75ce52d5-3320-40e1-8d64-42d12e2fa4c8","Type":"ContainerStarted","Data":"6db6d172fa4535f5f65d6506dfdd7a32dbc87e529a4c26166454be38f66b4333"} Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.644791 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s" Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.645954 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm" event={"ID":"d4596119-7d30-4307-8815-4355fc5ee6eb","Type":"ContainerStarted","Data":"10fbd604ee9ee36b8e6d82e05fa708a4c0fba506ef7964378a34673840af8f57"} Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.646297 4790 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm" Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.647969 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85" event={"ID":"56eefd7b-c275-40a3-8772-03ffc350736e","Type":"ContainerStarted","Data":"d2c95f14b7e844172bab106260916781ec9a802e108e45e25d70a85d6ad4c093"} Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.648326 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85" Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.648404 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd" Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.665593 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf" podStartSLOduration=2.8928426910000002 podStartE2EDuration="34.665576514s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:18.416972261 +0000 UTC m=+1097.404715127" lastFinishedPulling="2026-04-06 12:15:50.189706084 +0000 UTC m=+1129.177448950" observedRunningTime="2026-04-06 12:15:50.657424067 +0000 UTC m=+1129.645166933" watchObservedRunningTime="2026-04-06 12:15:50.665576514 +0000 UTC m=+1129.653319380" Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.679033 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85" podStartSLOduration=3.126708525 podStartE2EDuration="34.679013575s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:18.438407893 +0000 UTC m=+1097.426150759" lastFinishedPulling="2026-04-06 
12:15:49.990712943 +0000 UTC m=+1128.978455809" observedRunningTime="2026-04-06 12:15:50.676630655 +0000 UTC m=+1129.664373511" watchObservedRunningTime="2026-04-06 12:15:50.679013575 +0000 UTC m=+1129.666756441" Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.692269 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-64744474b-lxkq7" podStartSLOduration=2.561836263 podStartE2EDuration="34.692253631s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:17.967331082 +0000 UTC m=+1096.955073948" lastFinishedPulling="2026-04-06 12:15:50.09774845 +0000 UTC m=+1129.085491316" observedRunningTime="2026-04-06 12:15:50.688850015 +0000 UTC m=+1129.676592871" watchObservedRunningTime="2026-04-06 12:15:50.692253631 +0000 UTC m=+1129.679996497" Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.710277 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm" podStartSLOduration=3.25522303 podStartE2EDuration="34.710262848s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:18.43461792 +0000 UTC m=+1097.422360786" lastFinishedPulling="2026-04-06 12:15:49.889657738 +0000 UTC m=+1128.877400604" observedRunningTime="2026-04-06 12:15:50.707104818 +0000 UTC m=+1129.694847684" watchObservedRunningTime="2026-04-06 12:15:50.710262848 +0000 UTC m=+1129.698005714" Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.733385 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd" podStartSLOduration=4.109320416 podStartE2EDuration="34.733369465s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:18.44859846 +0000 UTC m=+1097.436341326" lastFinishedPulling="2026-04-06 12:15:49.072647509 +0000 
UTC m=+1128.060390375" observedRunningTime="2026-04-06 12:15:50.732060212 +0000 UTC m=+1129.719803078" watchObservedRunningTime="2026-04-06 12:15:50.733369465 +0000 UTC m=+1129.721112331" Apr 06 12:15:50 crc kubenswrapper[4790]: I0406 12:15:50.749480 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s" podStartSLOduration=2.15117951 podStartE2EDuration="34.749462053s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:17.259298781 +0000 UTC m=+1096.247041637" lastFinishedPulling="2026-04-06 12:15:49.857581294 +0000 UTC m=+1128.845324180" observedRunningTime="2026-04-06 12:15:50.749392502 +0000 UTC m=+1129.737135368" watchObservedRunningTime="2026-04-06 12:15:50.749462053 +0000 UTC m=+1129.737204919" Apr 06 12:15:56 crc kubenswrapper[4790]: I0406 12:15:56.623945 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6b9c989bb6-z8t6s" Apr 06 12:15:56 crc kubenswrapper[4790]: I0406 12:15:56.888296 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-64744474b-lxkq7" Apr 06 12:15:56 crc kubenswrapper[4790]: I0406 12:15:56.910302 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7594f57946-dlxrd" Apr 06 12:15:57 crc kubenswrapper[4790]: I0406 12:15:57.158258 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-569745d4d8-ddglf" Apr 06 12:15:57 crc kubenswrapper[4790]: I0406 12:15:57.240555 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56bf57d759-dq7bm" Apr 06 12:15:57 crc kubenswrapper[4790]: I0406 12:15:57.263170 4790 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-75df5978c-vvf85" Apr 06 12:16:00 crc kubenswrapper[4790]: I0406 12:16:00.144155 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591296-klcbk"] Apr 06 12:16:00 crc kubenswrapper[4790]: I0406 12:16:00.145722 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591296-klcbk" Apr 06 12:16:00 crc kubenswrapper[4790]: I0406 12:16:00.147700 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:16:00 crc kubenswrapper[4790]: I0406 12:16:00.148246 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:16:00 crc kubenswrapper[4790]: I0406 12:16:00.148329 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:16:00 crc kubenswrapper[4790]: I0406 12:16:00.155350 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8rm5\" (UniqueName: \"kubernetes.io/projected/c7e2691a-d43d-4851-89f9-8d4aefbaa5cf-kube-api-access-x8rm5\") pod \"auto-csr-approver-29591296-klcbk\" (UID: \"c7e2691a-d43d-4851-89f9-8d4aefbaa5cf\") " pod="openshift-infra/auto-csr-approver-29591296-klcbk" Apr 06 12:16:00 crc kubenswrapper[4790]: I0406 12:16:00.159488 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591296-klcbk"] Apr 06 12:16:00 crc kubenswrapper[4790]: I0406 12:16:00.256464 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8rm5\" (UniqueName: \"kubernetes.io/projected/c7e2691a-d43d-4851-89f9-8d4aefbaa5cf-kube-api-access-x8rm5\") pod \"auto-csr-approver-29591296-klcbk\" (UID: 
\"c7e2691a-d43d-4851-89f9-8d4aefbaa5cf\") " pod="openshift-infra/auto-csr-approver-29591296-klcbk" Apr 06 12:16:00 crc kubenswrapper[4790]: I0406 12:16:00.275647 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8rm5\" (UniqueName: \"kubernetes.io/projected/c7e2691a-d43d-4851-89f9-8d4aefbaa5cf-kube-api-access-x8rm5\") pod \"auto-csr-approver-29591296-klcbk\" (UID: \"c7e2691a-d43d-4851-89f9-8d4aefbaa5cf\") " pod="openshift-infra/auto-csr-approver-29591296-klcbk" Apr 06 12:16:00 crc kubenswrapper[4790]: I0406 12:16:00.461612 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591296-klcbk" Apr 06 12:16:00 crc kubenswrapper[4790]: I0406 12:16:00.685901 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591296-klcbk"] Apr 06 12:16:00 crc kubenswrapper[4790]: W0406 12:16:00.690714 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7e2691a_d43d_4851_89f9_8d4aefbaa5cf.slice/crio-f1ba19540a9657267e8ab24c02de51c2a16c2ed3ced43e06936ff2fcec9e5a26 WatchSource:0}: Error finding container f1ba19540a9657267e8ab24c02de51c2a16c2ed3ced43e06936ff2fcec9e5a26: Status 404 returned error can't find the container with id f1ba19540a9657267e8ab24c02de51c2a16c2ed3ced43e06936ff2fcec9e5a26 Apr 06 12:16:00 crc kubenswrapper[4790]: I0406 12:16:00.719677 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591296-klcbk" event={"ID":"c7e2691a-d43d-4851-89f9-8d4aefbaa5cf","Type":"ContainerStarted","Data":"f1ba19540a9657267e8ab24c02de51c2a16c2ed3ced43e06936ff2fcec9e5a26"} Apr 06 12:16:02 crc kubenswrapper[4790]: I0406 12:16:02.745623 4790 generic.go:334] "Generic (PLEG): container finished" podID="c7e2691a-d43d-4851-89f9-8d4aefbaa5cf" containerID="6bc7ebb49b088808465084372b5a7d7304c3f8ced1fab0a7415fd96383ec1aae" exitCode=0 
Apr 06 12:16:02 crc kubenswrapper[4790]: I0406 12:16:02.745729 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591296-klcbk" event={"ID":"c7e2691a-d43d-4851-89f9-8d4aefbaa5cf","Type":"ContainerDied","Data":"6bc7ebb49b088808465084372b5a7d7304c3f8ced1fab0a7415fd96383ec1aae"} Apr 06 12:16:03 crc kubenswrapper[4790]: E0406 12:16:03.677330 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz" podUID="bb09d720-75af-43a7-90dd-e497d2933183" Apr 06 12:16:03 crc kubenswrapper[4790]: I0406 12:16:03.999267 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591296-klcbk" Apr 06 12:16:04 crc kubenswrapper[4790]: I0406 12:16:04.107547 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8rm5\" (UniqueName: \"kubernetes.io/projected/c7e2691a-d43d-4851-89f9-8d4aefbaa5cf-kube-api-access-x8rm5\") pod \"c7e2691a-d43d-4851-89f9-8d4aefbaa5cf\" (UID: \"c7e2691a-d43d-4851-89f9-8d4aefbaa5cf\") " Apr 06 12:16:04 crc kubenswrapper[4790]: I0406 12:16:04.113579 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e2691a-d43d-4851-89f9-8d4aefbaa5cf-kube-api-access-x8rm5" (OuterVolumeSpecName: "kube-api-access-x8rm5") pod "c7e2691a-d43d-4851-89f9-8d4aefbaa5cf" (UID: "c7e2691a-d43d-4851-89f9-8d4aefbaa5cf"). InnerVolumeSpecName "kube-api-access-x8rm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:16:04 crc kubenswrapper[4790]: I0406 12:16:04.209454 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8rm5\" (UniqueName: \"kubernetes.io/projected/c7e2691a-d43d-4851-89f9-8d4aefbaa5cf-kube-api-access-x8rm5\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:04 crc kubenswrapper[4790]: I0406 12:16:04.763906 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591296-klcbk" event={"ID":"c7e2691a-d43d-4851-89f9-8d4aefbaa5cf","Type":"ContainerDied","Data":"f1ba19540a9657267e8ab24c02de51c2a16c2ed3ced43e06936ff2fcec9e5a26"} Apr 06 12:16:04 crc kubenswrapper[4790]: I0406 12:16:04.763954 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1ba19540a9657267e8ab24c02de51c2a16c2ed3ced43e06936ff2fcec9e5a26" Apr 06 12:16:04 crc kubenswrapper[4790]: I0406 12:16:04.764011 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591296-klcbk" Apr 06 12:16:05 crc kubenswrapper[4790]: I0406 12:16:05.074428 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591290-mqqrk"] Apr 06 12:16:05 crc kubenswrapper[4790]: I0406 12:16:05.082416 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591290-mqqrk"] Apr 06 12:16:05 crc kubenswrapper[4790]: I0406 12:16:05.688716 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84ecda30-93ff-4f79-b211-a1e22749a64f" path="/var/lib/kubelet/pods/84ecda30-93ff-4f79-b211-a1e22749a64f/volumes" Apr 06 12:16:14 crc kubenswrapper[4790]: I0406 12:16:14.677165 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 12:16:15 crc kubenswrapper[4790]: I0406 12:16:15.865046 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz" event={"ID":"bb09d720-75af-43a7-90dd-e497d2933183","Type":"ContainerStarted","Data":"b89fe085cc5c2a3592b041adf0f1f4b1ad8828990b70bdd1086efd535f7f4e08"} Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.731369 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-d46gz" podStartSLOduration=19.061079142 podStartE2EDuration="1m15.731349083s" podCreationTimestamp="2026-04-06 12:15:16 +0000 UTC" firstStartedPulling="2026-04-06 12:15:18.490644693 +0000 UTC m=+1097.478387559" lastFinishedPulling="2026-04-06 12:16:15.160914624 +0000 UTC m=+1154.148657500" observedRunningTime="2026-04-06 12:16:15.890201996 +0000 UTC m=+1154.877944882" watchObservedRunningTime="2026-04-06 12:16:31.731349083 +0000 UTC m=+1170.719091949" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.739238 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b8c7d564c-mddlq"] Apr 06 12:16:31 crc kubenswrapper[4790]: E0406 12:16:31.739666 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e2691a-d43d-4851-89f9-8d4aefbaa5cf" containerName="oc" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.739691 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e2691a-d43d-4851-89f9-8d4aefbaa5cf" containerName="oc" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.739901 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e2691a-d43d-4851-89f9-8d4aefbaa5cf" containerName="oc" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.740798 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b8c7d564c-mddlq" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.746471 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jj7xb" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.746783 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.746986 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.747160 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.752012 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b8c7d564c-mddlq"] Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.765105 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae4513e6-7add-478c-8f4c-8baf2b773b53-config\") pod \"dnsmasq-dns-5b8c7d564c-mddlq\" (UID: \"ae4513e6-7add-478c-8f4c-8baf2b773b53\") " pod="openstack/dnsmasq-dns-5b8c7d564c-mddlq" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.765229 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5nb6\" (UniqueName: \"kubernetes.io/projected/ae4513e6-7add-478c-8f4c-8baf2b773b53-kube-api-access-n5nb6\") pod \"dnsmasq-dns-5b8c7d564c-mddlq\" (UID: \"ae4513e6-7add-478c-8f4c-8baf2b773b53\") " pod="openstack/dnsmasq-dns-5b8c7d564c-mddlq" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.812112 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85b7c785df-v2tqz"] Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.815508 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.818762 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.844080 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85b7c785df-v2tqz"] Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.866522 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5nb6\" (UniqueName: \"kubernetes.io/projected/ae4513e6-7add-478c-8f4c-8baf2b773b53-kube-api-access-n5nb6\") pod \"dnsmasq-dns-5b8c7d564c-mddlq\" (UID: \"ae4513e6-7add-478c-8f4c-8baf2b773b53\") " pod="openstack/dnsmasq-dns-5b8c7d564c-mddlq" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.866906 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae4513e6-7add-478c-8f4c-8baf2b773b53-config\") pod \"dnsmasq-dns-5b8c7d564c-mddlq\" (UID: \"ae4513e6-7add-478c-8f4c-8baf2b773b53\") " pod="openstack/dnsmasq-dns-5b8c7d564c-mddlq" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.866933 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksr2q\" (UniqueName: \"kubernetes.io/projected/5b945c7e-d215-4068-9892-44a4228a19bf-kube-api-access-ksr2q\") pod \"dnsmasq-dns-85b7c785df-v2tqz\" (UID: \"5b945c7e-d215-4068-9892-44a4228a19bf\") " pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.867668 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae4513e6-7add-478c-8f4c-8baf2b773b53-config\") pod \"dnsmasq-dns-5b8c7d564c-mddlq\" (UID: \"ae4513e6-7add-478c-8f4c-8baf2b773b53\") " pod="openstack/dnsmasq-dns-5b8c7d564c-mddlq" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 
12:16:31.867749 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b945c7e-d215-4068-9892-44a4228a19bf-dns-svc\") pod \"dnsmasq-dns-85b7c785df-v2tqz\" (UID: \"5b945c7e-d215-4068-9892-44a4228a19bf\") " pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.867804 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b945c7e-d215-4068-9892-44a4228a19bf-config\") pod \"dnsmasq-dns-85b7c785df-v2tqz\" (UID: \"5b945c7e-d215-4068-9892-44a4228a19bf\") " pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.885790 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5nb6\" (UniqueName: \"kubernetes.io/projected/ae4513e6-7add-478c-8f4c-8baf2b773b53-kube-api-access-n5nb6\") pod \"dnsmasq-dns-5b8c7d564c-mddlq\" (UID: \"ae4513e6-7add-478c-8f4c-8baf2b773b53\") " pod="openstack/dnsmasq-dns-5b8c7d564c-mddlq" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.968217 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksr2q\" (UniqueName: \"kubernetes.io/projected/5b945c7e-d215-4068-9892-44a4228a19bf-kube-api-access-ksr2q\") pod \"dnsmasq-dns-85b7c785df-v2tqz\" (UID: \"5b945c7e-d215-4068-9892-44a4228a19bf\") " pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.968274 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b945c7e-d215-4068-9892-44a4228a19bf-dns-svc\") pod \"dnsmasq-dns-85b7c785df-v2tqz\" (UID: \"5b945c7e-d215-4068-9892-44a4228a19bf\") " pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.968305 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b945c7e-d215-4068-9892-44a4228a19bf-config\") pod \"dnsmasq-dns-85b7c785df-v2tqz\" (UID: \"5b945c7e-d215-4068-9892-44a4228a19bf\") " pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.969188 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b945c7e-d215-4068-9892-44a4228a19bf-config\") pod \"dnsmasq-dns-85b7c785df-v2tqz\" (UID: \"5b945c7e-d215-4068-9892-44a4228a19bf\") " pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.969378 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b945c7e-d215-4068-9892-44a4228a19bf-dns-svc\") pod \"dnsmasq-dns-85b7c785df-v2tqz\" (UID: \"5b945c7e-d215-4068-9892-44a4228a19bf\") " pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" Apr 06 12:16:31 crc kubenswrapper[4790]: I0406 12:16:31.985635 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksr2q\" (UniqueName: \"kubernetes.io/projected/5b945c7e-d215-4068-9892-44a4228a19bf-kube-api-access-ksr2q\") pod \"dnsmasq-dns-85b7c785df-v2tqz\" (UID: \"5b945c7e-d215-4068-9892-44a4228a19bf\") " pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" Apr 06 12:16:32 crc kubenswrapper[4790]: I0406 12:16:32.058232 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b8c7d564c-mddlq" Apr 06 12:16:32 crc kubenswrapper[4790]: I0406 12:16:32.144303 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" Apr 06 12:16:32 crc kubenswrapper[4790]: I0406 12:16:32.520173 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b8c7d564c-mddlq"] Apr 06 12:16:32 crc kubenswrapper[4790]: W0406 12:16:32.523109 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae4513e6_7add_478c_8f4c_8baf2b773b53.slice/crio-25365c0b22447614170d3f79b697b1041b418104c71348507e1e2336ab0f71ba WatchSource:0}: Error finding container 25365c0b22447614170d3f79b697b1041b418104c71348507e1e2336ab0f71ba: Status 404 returned error can't find the container with id 25365c0b22447614170d3f79b697b1041b418104c71348507e1e2336ab0f71ba Apr 06 12:16:32 crc kubenswrapper[4790]: I0406 12:16:32.591754 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85b7c785df-v2tqz"] Apr 06 12:16:32 crc kubenswrapper[4790]: W0406 12:16:32.595068 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b945c7e_d215_4068_9892_44a4228a19bf.slice/crio-b54188c1a042b9e38bd07b455c30af34b08fc369cd073d81251eecd9fc0cb5fb WatchSource:0}: Error finding container b54188c1a042b9e38bd07b455c30af34b08fc369cd073d81251eecd9fc0cb5fb: Status 404 returned error can't find the container with id b54188c1a042b9e38bd07b455c30af34b08fc369cd073d81251eecd9fc0cb5fb Apr 06 12:16:33 crc kubenswrapper[4790]: I0406 12:16:33.022157 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8c7d564c-mddlq" event={"ID":"ae4513e6-7add-478c-8f4c-8baf2b773b53","Type":"ContainerStarted","Data":"25365c0b22447614170d3f79b697b1041b418104c71348507e1e2336ab0f71ba"} Apr 06 12:16:33 crc kubenswrapper[4790]: I0406 12:16:33.024120 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" 
event={"ID":"5b945c7e-d215-4068-9892-44a4228a19bf","Type":"ContainerStarted","Data":"b54188c1a042b9e38bd07b455c30af34b08fc369cd073d81251eecd9fc0cb5fb"} Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.614672 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b8c7d564c-mddlq"] Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.644128 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b849c865-mdxv2"] Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.648487 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b849c865-mdxv2" Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.661088 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b849c865-mdxv2"] Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.821389 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d195497c-d743-4fef-a50e-b2b600bcc34f-config\") pod \"dnsmasq-dns-67b849c865-mdxv2\" (UID: \"d195497c-d743-4fef-a50e-b2b600bcc34f\") " pod="openstack/dnsmasq-dns-67b849c865-mdxv2" Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.821427 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d195497c-d743-4fef-a50e-b2b600bcc34f-dns-svc\") pod \"dnsmasq-dns-67b849c865-mdxv2\" (UID: \"d195497c-d743-4fef-a50e-b2b600bcc34f\") " pod="openstack/dnsmasq-dns-67b849c865-mdxv2" Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.821515 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45thz\" (UniqueName: \"kubernetes.io/projected/d195497c-d743-4fef-a50e-b2b600bcc34f-kube-api-access-45thz\") pod \"dnsmasq-dns-67b849c865-mdxv2\" (UID: \"d195497c-d743-4fef-a50e-b2b600bcc34f\") " 
pod="openstack/dnsmasq-dns-67b849c865-mdxv2" Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.909287 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85b7c785df-v2tqz"] Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.922918 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d195497c-d743-4fef-a50e-b2b600bcc34f-config\") pod \"dnsmasq-dns-67b849c865-mdxv2\" (UID: \"d195497c-d743-4fef-a50e-b2b600bcc34f\") " pod="openstack/dnsmasq-dns-67b849c865-mdxv2" Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.922965 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d195497c-d743-4fef-a50e-b2b600bcc34f-dns-svc\") pod \"dnsmasq-dns-67b849c865-mdxv2\" (UID: \"d195497c-d743-4fef-a50e-b2b600bcc34f\") " pod="openstack/dnsmasq-dns-67b849c865-mdxv2" Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.923175 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45thz\" (UniqueName: \"kubernetes.io/projected/d195497c-d743-4fef-a50e-b2b600bcc34f-kube-api-access-45thz\") pod \"dnsmasq-dns-67b849c865-mdxv2\" (UID: \"d195497c-d743-4fef-a50e-b2b600bcc34f\") " pod="openstack/dnsmasq-dns-67b849c865-mdxv2" Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.940658 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d195497c-d743-4fef-a50e-b2b600bcc34f-dns-svc\") pod \"dnsmasq-dns-67b849c865-mdxv2\" (UID: \"d195497c-d743-4fef-a50e-b2b600bcc34f\") " pod="openstack/dnsmasq-dns-67b849c865-mdxv2" Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.943904 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d195497c-d743-4fef-a50e-b2b600bcc34f-config\") pod \"dnsmasq-dns-67b849c865-mdxv2\" 
(UID: \"d195497c-d743-4fef-a50e-b2b600bcc34f\") " pod="openstack/dnsmasq-dns-67b849c865-mdxv2" Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.957161 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66b9c96c4c-zksjk"] Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.960231 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.977550 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45thz\" (UniqueName: \"kubernetes.io/projected/d195497c-d743-4fef-a50e-b2b600bcc34f-kube-api-access-45thz\") pod \"dnsmasq-dns-67b849c865-mdxv2\" (UID: \"d195497c-d743-4fef-a50e-b2b600bcc34f\") " pod="openstack/dnsmasq-dns-67b849c865-mdxv2" Apr 06 12:16:35 crc kubenswrapper[4790]: I0406 12:16:35.984161 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66b9c96c4c-zksjk"] Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.025618 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7474f479-358e-4ca8-8930-8ba9a763bb8c-dns-svc\") pod \"dnsmasq-dns-66b9c96c4c-zksjk\" (UID: \"7474f479-358e-4ca8-8930-8ba9a763bb8c\") " pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.025664 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgg7l\" (UniqueName: \"kubernetes.io/projected/7474f479-358e-4ca8-8930-8ba9a763bb8c-kube-api-access-lgg7l\") pod \"dnsmasq-dns-66b9c96c4c-zksjk\" (UID: \"7474f479-358e-4ca8-8930-8ba9a763bb8c\") " pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.025742 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7474f479-358e-4ca8-8930-8ba9a763bb8c-config\") pod \"dnsmasq-dns-66b9c96c4c-zksjk\" (UID: \"7474f479-358e-4ca8-8930-8ba9a763bb8c\") " pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.127021 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7474f479-358e-4ca8-8930-8ba9a763bb8c-config\") pod \"dnsmasq-dns-66b9c96c4c-zksjk\" (UID: \"7474f479-358e-4ca8-8930-8ba9a763bb8c\") " pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.127153 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7474f479-358e-4ca8-8930-8ba9a763bb8c-dns-svc\") pod \"dnsmasq-dns-66b9c96c4c-zksjk\" (UID: \"7474f479-358e-4ca8-8930-8ba9a763bb8c\") " pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.127189 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgg7l\" (UniqueName: \"kubernetes.io/projected/7474f479-358e-4ca8-8930-8ba9a763bb8c-kube-api-access-lgg7l\") pod \"dnsmasq-dns-66b9c96c4c-zksjk\" (UID: \"7474f479-358e-4ca8-8930-8ba9a763bb8c\") " pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.128275 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7474f479-358e-4ca8-8930-8ba9a763bb8c-config\") pod \"dnsmasq-dns-66b9c96c4c-zksjk\" (UID: \"7474f479-358e-4ca8-8930-8ba9a763bb8c\") " pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.128359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7474f479-358e-4ca8-8930-8ba9a763bb8c-dns-svc\") pod 
\"dnsmasq-dns-66b9c96c4c-zksjk\" (UID: \"7474f479-358e-4ca8-8930-8ba9a763bb8c\") " pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.153546 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgg7l\" (UniqueName: \"kubernetes.io/projected/7474f479-358e-4ca8-8930-8ba9a763bb8c-kube-api-access-lgg7l\") pod \"dnsmasq-dns-66b9c96c4c-zksjk\" (UID: \"7474f479-358e-4ca8-8930-8ba9a763bb8c\") " pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.224398 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b9c96c4c-zksjk"] Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.224985 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.244353 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fccb695c7-r6qkc"] Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.245643 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.252358 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fccb695c7-r6qkc"] Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.267696 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b849c865-mdxv2" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.431128 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/290daa37-5ba2-47fc-841b-d89aed9745d2-dns-svc\") pod \"dnsmasq-dns-5fccb695c7-r6qkc\" (UID: \"290daa37-5ba2-47fc-841b-d89aed9745d2\") " pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.431208 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/290daa37-5ba2-47fc-841b-d89aed9745d2-config\") pod \"dnsmasq-dns-5fccb695c7-r6qkc\" (UID: \"290daa37-5ba2-47fc-841b-d89aed9745d2\") " pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.431234 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjfmh\" (UniqueName: \"kubernetes.io/projected/290daa37-5ba2-47fc-841b-d89aed9745d2-kube-api-access-tjfmh\") pod \"dnsmasq-dns-5fccb695c7-r6qkc\" (UID: \"290daa37-5ba2-47fc-841b-d89aed9745d2\") " pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.532734 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/290daa37-5ba2-47fc-841b-d89aed9745d2-dns-svc\") pod \"dnsmasq-dns-5fccb695c7-r6qkc\" (UID: \"290daa37-5ba2-47fc-841b-d89aed9745d2\") " pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.532812 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/290daa37-5ba2-47fc-841b-d89aed9745d2-config\") pod \"dnsmasq-dns-5fccb695c7-r6qkc\" (UID: \"290daa37-5ba2-47fc-841b-d89aed9745d2\") " 
pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.532855 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjfmh\" (UniqueName: \"kubernetes.io/projected/290daa37-5ba2-47fc-841b-d89aed9745d2-kube-api-access-tjfmh\") pod \"dnsmasq-dns-5fccb695c7-r6qkc\" (UID: \"290daa37-5ba2-47fc-841b-d89aed9745d2\") " pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.533544 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/290daa37-5ba2-47fc-841b-d89aed9745d2-dns-svc\") pod \"dnsmasq-dns-5fccb695c7-r6qkc\" (UID: \"290daa37-5ba2-47fc-841b-d89aed9745d2\") " pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.533613 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/290daa37-5ba2-47fc-841b-d89aed9745d2-config\") pod \"dnsmasq-dns-5fccb695c7-r6qkc\" (UID: \"290daa37-5ba2-47fc-841b-d89aed9745d2\") " pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.557845 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjfmh\" (UniqueName: \"kubernetes.io/projected/290daa37-5ba2-47fc-841b-d89aed9745d2-kube-api-access-tjfmh\") pod \"dnsmasq-dns-5fccb695c7-r6qkc\" (UID: \"290daa37-5ba2-47fc-841b-d89aed9745d2\") " pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.563375 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.784219 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.785647 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.794051 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.794182 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.794270 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.794199 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.794412 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-m6qf8" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.794461 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.794506 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.803099 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.937911 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" 
(UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.937957 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75bcb75a-939b-4fda-b4ed-a66707bb16d7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.937985 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx8kl\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-kube-api-access-rx8kl\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.938031 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75bcb75a-939b-4fda-b4ed-a66707bb16d7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.938073 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.938099 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.938117 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.938136 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.938169 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-config-data\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.938267 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:36 crc kubenswrapper[4790]: I0406 12:16:36.938305 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 
12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.040205 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.040257 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.040282 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.040319 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-config-data\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.040337 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.040359 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.040438 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75bcb75a-939b-4fda-b4ed-a66707bb16d7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.040458 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.040480 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx8kl\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-kube-api-access-rx8kl\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.040554 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75bcb75a-939b-4fda-b4ed-a66707bb16d7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.040596 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " 
pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.042053 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.042083 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.043931 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.044465 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-config-data\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.044891 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.046005 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75bcb75a-939b-4fda-b4ed-a66707bb16d7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.046224 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.046950 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.048454 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.049187 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75bcb75a-939b-4fda-b4ed-a66707bb16d7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.057866 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx8kl\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-kube-api-access-rx8kl\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " 
pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.066514 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") " pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.091251 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.092646 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.098074 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.098283 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.098409 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f9sl4" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.098519 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.098633 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.098730 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.101886 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.109107 4790 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.119353 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.243787 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8b8a855-8cd2-4a9a-b804-c78641506883-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.243848 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.244011 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.244059 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.244079 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.244158 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.244190 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.244215 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.244292 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8b8a855-8cd2-4a9a-b804-c78641506883-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.244322 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-s2dwm\" (UniqueName: \"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-kube-api-access-s2dwm\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.244371 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.349586 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.349639 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8b8a855-8cd2-4a9a-b804-c78641506883-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.349667 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.349725 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.349750 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.349771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.349804 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.349823 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.349866 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.349948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8b8a855-8cd2-4a9a-b804-c78641506883-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.349973 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwm\" (UniqueName: \"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-kube-api-access-s2dwm\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.350979 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.353967 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.354943 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8b8a855-8cd2-4a9a-b804-c78641506883-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc 
kubenswrapper[4790]: I0406 12:16:37.356018 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.356126 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.356813 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.358099 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.365893 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.368003 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " 
pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.368304 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.369017 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8b8a855-8cd2-4a9a-b804-c78641506883-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.380883 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.382685 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-plugins-conf" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.382876 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.383139 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-server-dockercfg-d7vdl" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.383265 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-server-conf" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.383373 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"notifications-rabbitmq-config-data" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.383638 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"notifications-rabbitmq-default-user" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.386263 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.398752 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-notifications-rabbitmq-svc" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.399133 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"notifications-rabbitmq-erlang-cookie" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.411569 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwm\" (UniqueName: \"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-kube-api-access-s2dwm\") pod \"rabbitmq-cell1-server-0\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.437545 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.557881 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/586bc227-b2c5-4ead-88f4-fe18c5c28d41-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.557942 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/586bc227-b2c5-4ead-88f4-fe18c5c28d41-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.557977 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/586bc227-b2c5-4ead-88f4-fe18c5c28d41-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.558006 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/586bc227-b2c5-4ead-88f4-fe18c5c28d41-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.558027 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/586bc227-b2c5-4ead-88f4-fe18c5c28d41-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.558090 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/586bc227-b2c5-4ead-88f4-fe18c5c28d41-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.558130 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.558149 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/586bc227-b2c5-4ead-88f4-fe18c5c28d41-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.558201 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/586bc227-b2c5-4ead-88f4-fe18c5c28d41-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.558234 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/586bc227-b2c5-4ead-88f4-fe18c5c28d41-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.558280 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7nsl\" (UniqueName: \"kubernetes.io/projected/586bc227-b2c5-4ead-88f4-fe18c5c28d41-kube-api-access-v7nsl\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.660046 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.660413 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/586bc227-b2c5-4ead-88f4-fe18c5c28d41-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.660444 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/586bc227-b2c5-4ead-88f4-fe18c5c28d41-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.660474 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/586bc227-b2c5-4ead-88f4-fe18c5c28d41-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.660518 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7nsl\" (UniqueName: \"kubernetes.io/projected/586bc227-b2c5-4ead-88f4-fe18c5c28d41-kube-api-access-v7nsl\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.660558 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/586bc227-b2c5-4ead-88f4-fe18c5c28d41-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.660597 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/586bc227-b2c5-4ead-88f4-fe18c5c28d41-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.660618 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/586bc227-b2c5-4ead-88f4-fe18c5c28d41-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.660645 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/586bc227-b2c5-4ead-88f4-fe18c5c28d41-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.660663 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/586bc227-b2c5-4ead-88f4-fe18c5c28d41-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.660731 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/586bc227-b2c5-4ead-88f4-fe18c5c28d41-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.661274 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.662382 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/586bc227-b2c5-4ead-88f4-fe18c5c28d41-server-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.663066 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/586bc227-b2c5-4ead-88f4-fe18c5c28d41-config-data\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.667363 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/586bc227-b2c5-4ead-88f4-fe18c5c28d41-rabbitmq-confd\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.667451 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/586bc227-b2c5-4ead-88f4-fe18c5c28d41-erlang-cookie-secret\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.667758 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/586bc227-b2c5-4ead-88f4-fe18c5c28d41-rabbitmq-plugins\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.667772 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/586bc227-b2c5-4ead-88f4-fe18c5c28d41-rabbitmq-erlang-cookie\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.668935 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/586bc227-b2c5-4ead-88f4-fe18c5c28d41-plugins-conf\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.670361 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/586bc227-b2c5-4ead-88f4-fe18c5c28d41-rabbitmq-tls\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.674116 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/586bc227-b2c5-4ead-88f4-fe18c5c28d41-pod-info\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.679793 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7nsl\" (UniqueName: \"kubernetes.io/projected/586bc227-b2c5-4ead-88f4-fe18c5c28d41-kube-api-access-v7nsl\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.685586 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"notifications-rabbitmq-server-0\" (UID: \"586bc227-b2c5-4ead-88f4-fe18c5c28d41\") " pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:37 crc kubenswrapper[4790]: I0406 12:16:37.793654 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.613974 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.615561 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.620870 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.621043 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-z264f" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.622018 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.622180 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.626317 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.635231 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.779994 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-config-data-default\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.780041 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.780134 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.780167 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.780492 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.780603 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-kolla-config\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.781283 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.781311 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrtp6\" (UniqueName: \"kubernetes.io/projected/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-kube-api-access-hrtp6\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.883808 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.883881 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.883902 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.883926 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.883941 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.883955 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrtp6\" (UniqueName: \"kubernetes.io/projected/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-kube-api-access-hrtp6\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.884004 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-config-data-default\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.884026 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.884517 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Apr 06 12:16:38 
crc kubenswrapper[4790]: I0406 12:16:38.885981 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.886085 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-config-data-default\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.886092 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-kolla-config\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.887300 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.892000 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0" Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.901674 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0"
Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.917384 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrtp6\" (UniqueName: \"kubernetes.io/projected/6d6dc6ce-5627-454a-af1c-7a20bed8bfc4-kube-api-access-hrtp6\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0"
Apr 06 12:16:38 crc kubenswrapper[4790]: I0406 12:16:38.924837 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4\") " pod="openstack/openstack-galera-0"
Apr 06 12:16:39 crc kubenswrapper[4790]: I0406 12:16:39.233235 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.013172 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.014441 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.017448 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.017808 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-hbkpj"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.018106 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.018161 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.029205 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.106212 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69e97903-5aa8-4523-ae3c-3f10b031ad20-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.106329 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e97903-5aa8-4523-ae3c-3f10b031ad20-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.106367 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69e97903-5aa8-4523-ae3c-3f10b031ad20-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.106403 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69e97903-5aa8-4523-ae3c-3f10b031ad20-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.106429 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.106451 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69e97903-5aa8-4523-ae3c-3f10b031ad20-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.106673 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhvpz\" (UniqueName: \"kubernetes.io/projected/69e97903-5aa8-4523-ae3c-3f10b031ad20-kube-api-access-lhvpz\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.106706 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e97903-5aa8-4523-ae3c-3f10b031ad20-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.208401 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e97903-5aa8-4523-ae3c-3f10b031ad20-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.208460 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69e97903-5aa8-4523-ae3c-3f10b031ad20-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.208500 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69e97903-5aa8-4523-ae3c-3f10b031ad20-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.208524 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.208544 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69e97903-5aa8-4523-ae3c-3f10b031ad20-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.208600 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhvpz\" (UniqueName: \"kubernetes.io/projected/69e97903-5aa8-4523-ae3c-3f10b031ad20-kube-api-access-lhvpz\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.208625 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e97903-5aa8-4523-ae3c-3f10b031ad20-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.208736 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69e97903-5aa8-4523-ae3c-3f10b031ad20-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.209175 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.209519 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69e97903-5aa8-4523-ae3c-3f10b031ad20-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.209706 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69e97903-5aa8-4523-ae3c-3f10b031ad20-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.210332 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69e97903-5aa8-4523-ae3c-3f10b031ad20-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.210609 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69e97903-5aa8-4523-ae3c-3f10b031ad20-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.213729 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e97903-5aa8-4523-ae3c-3f10b031ad20-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.217552 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e97903-5aa8-4523-ae3c-3f10b031ad20-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.233336 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhvpz\" (UniqueName: \"kubernetes.io/projected/69e97903-5aa8-4523-ae3c-3f10b031ad20-kube-api-access-lhvpz\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.234859 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"69e97903-5aa8-4523-ae3c-3f10b031ad20\") " pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.365957 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.367244 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.371104 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.371132 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-d22rb"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.371338 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.380138 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.388012 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.426238 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-kolla-config\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.427006 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.427157 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.427207 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pxct\" (UniqueName: \"kubernetes.io/projected/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-kube-api-access-2pxct\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.427361 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-config-data\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.529113 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.529162 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pxct\" (UniqueName: \"kubernetes.io/projected/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-kube-api-access-2pxct\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.529205 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-config-data\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.529243 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-kolla-config\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.529286 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.530685 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-config-data\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.532023 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-kolla-config\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.536568 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.536545 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.551264 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pxct\" (UniqueName: \"kubernetes.io/projected/ace98862-e7bc-4eb8-93ae-b38dcbd98a55-kube-api-access-2pxct\") pod \"memcached-0\" (UID: \"ace98862-e7bc-4eb8-93ae-b38dcbd98a55\") " pod="openstack/memcached-0"
Apr 06 12:16:40 crc kubenswrapper[4790]: I0406 12:16:40.687222 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Apr 06 12:16:42 crc kubenswrapper[4790]: I0406 12:16:42.582289 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Apr 06 12:16:42 crc kubenswrapper[4790]: I0406 12:16:42.583839 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Apr 06 12:16:42 crc kubenswrapper[4790]: I0406 12:16:42.586793 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-lmnfl"
Apr 06 12:16:42 crc kubenswrapper[4790]: I0406 12:16:42.594908 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Apr 06 12:16:42 crc kubenswrapper[4790]: I0406 12:16:42.666451 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bqt5\" (UniqueName: \"kubernetes.io/projected/e2ad9a9e-f3a7-4749-9838-be88d29a7494-kube-api-access-5bqt5\") pod \"kube-state-metrics-0\" (UID: \"e2ad9a9e-f3a7-4749-9838-be88d29a7494\") " pod="openstack/kube-state-metrics-0"
Apr 06 12:16:42 crc kubenswrapper[4790]: I0406 12:16:42.767646 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bqt5\" (UniqueName: \"kubernetes.io/projected/e2ad9a9e-f3a7-4749-9838-be88d29a7494-kube-api-access-5bqt5\") pod \"kube-state-metrics-0\" (UID: \"e2ad9a9e-f3a7-4749-9838-be88d29a7494\") " pod="openstack/kube-state-metrics-0"
Apr 06 12:16:42 crc kubenswrapper[4790]: I0406 12:16:42.808171 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bqt5\" (UniqueName: \"kubernetes.io/projected/e2ad9a9e-f3a7-4749-9838-be88d29a7494-kube-api-access-5bqt5\") pod \"kube-state-metrics-0\" (UID: \"e2ad9a9e-f3a7-4749-9838-be88d29a7494\") " pod="openstack/kube-state-metrics-0"
Apr 06 12:16:42 crc kubenswrapper[4790]: I0406 12:16:42.903008 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.802334 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.908318 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.912620 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.920253 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.922082 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-gtk7n"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.922164 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.922374 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.922634 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.922705 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.922650 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.922787 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.936443 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.992108 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.992160 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.992210 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxgfx\" (UniqueName: \"kubernetes.io/projected/06b0fb18-4ad0-470e-b436-56d1530a118f-kube-api-access-vxgfx\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.992231 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.992269 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-config\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.992307 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.992329 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.992444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/06b0fb18-4ad0-470e-b436-56d1530a118f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.992643 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:43 crc kubenswrapper[4790]: I0406 12:16:43.992763 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/06b0fb18-4ad0-470e-b436-56d1530a118f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.106197 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.106563 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.107215 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxgfx\" (UniqueName: \"kubernetes.io/projected/06b0fb18-4ad0-470e-b436-56d1530a118f-kube-api-access-vxgfx\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.107567 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.107696 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-config\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.108268 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.108365 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.108443 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/06b0fb18-4ad0-470e-b436-56d1530a118f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.108575 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.108803 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/06b0fb18-4ad0-470e-b436-56d1530a118f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.108207 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.107120 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.106971 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.112677 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.112761 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2657789c2cd73a331030693f07b47cf0a3bc578270043bd3878090c8357096a6/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.114490 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-config\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.116457 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/06b0fb18-4ad0-470e-b436-56d1530a118f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.117285 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.118623 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.120235 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/06b0fb18-4ad0-470e-b436-56d1530a118f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.120617 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Apr 06 12:16:44 crc kubenswrapper[4790]: E0406 12:16:44.121781 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-vxgfx prometheus-metric-storage-db], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/prometheus-metric-storage-0" podUID="06b0fb18-4ad0-470e-b436-56d1530a118f"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.127365 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxgfx\" (UniqueName: \"kubernetes.io/projected/06b0fb18-4ad0-470e-b436-56d1530a118f-kube-api-access-vxgfx\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.148987 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") " pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.152559 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.193564 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.312138 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"06b0fb18-4ad0-470e-b436-56d1530a118f\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") "
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.312192 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-config\") pod \"06b0fb18-4ad0-470e-b436-56d1530a118f\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") "
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.312304 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxgfx\" (UniqueName: \"kubernetes.io/projected/06b0fb18-4ad0-470e-b436-56d1530a118f-kube-api-access-vxgfx\") pod \"06b0fb18-4ad0-470e-b436-56d1530a118f\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") "
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.312330 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-0\") pod \"06b0fb18-4ad0-470e-b436-56d1530a118f\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") "
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.312371 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-1\") pod \"06b0fb18-4ad0-470e-b436-56d1530a118f\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") "
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.312409 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/06b0fb18-4ad0-470e-b436-56d1530a118f-config-out\") pod \"06b0fb18-4ad0-470e-b436-56d1530a118f\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") "
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.312430 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-thanos-prometheus-http-client-file\") pod \"06b0fb18-4ad0-470e-b436-56d1530a118f\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") "
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.312466 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-2\") pod \"06b0fb18-4ad0-470e-b436-56d1530a118f\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") "
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.312510 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-web-config\") pod \"06b0fb18-4ad0-470e-b436-56d1530a118f\" (UID: \"06b0fb18-4ad0-470e-b436-56d1530a118f\") "
Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.312563 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/06b0fb18-4ad0-470e-b436-56d1530a118f-tls-assets\") pod \"06b0fb18-4ad0-470e-b436-56d1530a118f\" (UID:
\"06b0fb18-4ad0-470e-b436-56d1530a118f\") " Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.314342 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "06b0fb18-4ad0-470e-b436-56d1530a118f" (UID: "06b0fb18-4ad0-470e-b436-56d1530a118f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.314367 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "06b0fb18-4ad0-470e-b436-56d1530a118f" (UID: "06b0fb18-4ad0-470e-b436-56d1530a118f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.315066 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "06b0fb18-4ad0-470e-b436-56d1530a118f" (UID: "06b0fb18-4ad0-470e-b436-56d1530a118f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.316444 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06b0fb18-4ad0-470e-b436-56d1530a118f-config-out" (OuterVolumeSpecName: "config-out") pod "06b0fb18-4ad0-470e-b436-56d1530a118f" (UID: "06b0fb18-4ad0-470e-b436-56d1530a118f"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.317176 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b0fb18-4ad0-470e-b436-56d1530a118f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "06b0fb18-4ad0-470e-b436-56d1530a118f" (UID: "06b0fb18-4ad0-470e-b436-56d1530a118f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.317328 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-web-config" (OuterVolumeSpecName: "web-config") pod "06b0fb18-4ad0-470e-b436-56d1530a118f" (UID: "06b0fb18-4ad0-470e-b436-56d1530a118f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.317685 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "06b0fb18-4ad0-470e-b436-56d1530a118f" (UID: "06b0fb18-4ad0-470e-b436-56d1530a118f"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.320417 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-config" (OuterVolumeSpecName: "config") pod "06b0fb18-4ad0-470e-b436-56d1530a118f" (UID: "06b0fb18-4ad0-470e-b436-56d1530a118f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.324900 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b0fb18-4ad0-470e-b436-56d1530a118f-kube-api-access-vxgfx" (OuterVolumeSpecName: "kube-api-access-vxgfx") pod "06b0fb18-4ad0-470e-b436-56d1530a118f" (UID: "06b0fb18-4ad0-470e-b436-56d1530a118f"). InnerVolumeSpecName "kube-api-access-vxgfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.333708 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "06b0fb18-4ad0-470e-b436-56d1530a118f" (UID: "06b0fb18-4ad0-470e-b436-56d1530a118f"). InnerVolumeSpecName "pvc-428e608e-3b0f-419c-8722-244ca6b44799". PluginName "kubernetes.io/csi", VolumeGidValue "" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.413837 4790 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-web-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.413868 4790 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/06b0fb18-4ad0-470e-b436-56d1530a118f-tls-assets\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.413909 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") on node \"crc\" " Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.413922 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.413932 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxgfx\" (UniqueName: \"kubernetes.io/projected/06b0fb18-4ad0-470e-b436-56d1530a118f-kube-api-access-vxgfx\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.413941 4790 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.413952 4790 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.413961 4790 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/06b0fb18-4ad0-470e-b436-56d1530a118f-config-out\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.413970 4790 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/06b0fb18-4ad0-470e-b436-56d1530a118f-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.413981 4790 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/06b0fb18-4ad0-470e-b436-56d1530a118f-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.430926 4790 csi_attacher.go:630] kubernetes.io/csi: 
attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.431070 4790 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-428e608e-3b0f-419c-8722-244ca6b44799" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799") on node "crc" Apr 06 12:16:44 crc kubenswrapper[4790]: I0406 12:16:44.515038 4790 reconciler_common.go:293] "Volume detached for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.161658 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.226129 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.230442 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.248405 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.251632 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.254228 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.254310 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.254492 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.254930 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.255059 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.258485 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.258708 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-gtk7n" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.259743 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.261434 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.328715 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.328874 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrkwn\" (UniqueName: \"kubernetes.io/projected/4080ebfd-9998-4f7d-803a-bf407d0adef0-kube-api-access-rrkwn\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.328943 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.329045 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4080ebfd-9998-4f7d-803a-bf407d0adef0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.329127 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 
12:16:45.329188 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.329243 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.329276 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-config\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.329318 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.329341 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4080ebfd-9998-4f7d-803a-bf407d0adef0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " 
pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.431643 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.431776 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.431816 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.431886 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-config\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.431940 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: 
\"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.431965 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4080ebfd-9998-4f7d-803a-bf407d0adef0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.432040 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.432109 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrkwn\" (UniqueName: \"kubernetes.io/projected/4080ebfd-9998-4f7d-803a-bf407d0adef0-kube-api-access-rrkwn\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.432154 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.432222 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4080ebfd-9998-4f7d-803a-bf407d0adef0-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.432410 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.432716 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.433248 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.435296 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.435339 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2657789c2cd73a331030693f07b47cf0a3bc578270043bd3878090c8357096a6/globalmount\"" pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.437891 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4080ebfd-9998-4f7d-803a-bf407d0adef0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.437915 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.437915 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-config\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.444679 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-thanos-prometheus-http-client-file\") pod 
\"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.448803 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4080ebfd-9998-4f7d-803a-bf407d0adef0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.451487 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrkwn\" (UniqueName: \"kubernetes.io/projected/4080ebfd-9998-4f7d-803a-bf407d0adef0-kube-api-access-rrkwn\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.480134 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.593775 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.663549 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-pr4b9"] Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.664623 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.672070 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-4xbzr"] Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.672612 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.673642 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.687341 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06b0fb18-4ad0-470e-b436-56d1530a118f" path="/var/lib/kubelet/pods/06b0fb18-4ad0-470e-b436-56d1530a118f/volumes" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.719587 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.721772 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-8rpmh" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.733109 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pr4b9"] Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.737251 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3c164024-c78c-461b-90fc-afbac0b3a682-var-lib\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.737302 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3c164024-c78c-461b-90fc-afbac0b3a682-etc-ovs\") pod 
\"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.737362 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51949d72-301c-4426-8397-273f6b2ecabd-scripts\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.737397 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c164024-c78c-461b-90fc-afbac0b3a682-scripts\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.737421 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/51949d72-301c-4426-8397-273f6b2ecabd-var-log-ovn\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.737464 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3c164024-c78c-461b-90fc-afbac0b3a682-var-log\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.737492 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/51949d72-301c-4426-8397-273f6b2ecabd-ovn-controller-tls-certs\") pod \"ovn-controller-pr4b9\" (UID: 
\"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.737592 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/51949d72-301c-4426-8397-273f6b2ecabd-var-run-ovn\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.737622 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gvhh\" (UniqueName: \"kubernetes.io/projected/3c164024-c78c-461b-90fc-afbac0b3a682-kube-api-access-5gvhh\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.737659 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwxkk\" (UniqueName: \"kubernetes.io/projected/51949d72-301c-4426-8397-273f6b2ecabd-kube-api-access-dwxkk\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.737679 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3c164024-c78c-461b-90fc-afbac0b3a682-var-run\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.737702 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/51949d72-301c-4426-8397-273f6b2ecabd-var-run\") pod \"ovn-controller-pr4b9\" (UID: 
\"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.737727 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51949d72-301c-4426-8397-273f6b2ecabd-combined-ca-bundle\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.748646 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4xbzr"] Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.845014 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3c164024-c78c-461b-90fc-afbac0b3a682-etc-ovs\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.845076 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3c164024-c78c-461b-90fc-afbac0b3a682-var-lib\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.845113 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51949d72-301c-4426-8397-273f6b2ecabd-scripts\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.845136 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c164024-c78c-461b-90fc-afbac0b3a682-scripts\") pod 
\"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.845667 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3c164024-c78c-461b-90fc-afbac0b3a682-etc-ovs\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.845817 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3c164024-c78c-461b-90fc-afbac0b3a682-var-lib\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.847300 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51949d72-301c-4426-8397-273f6b2ecabd-scripts\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.847366 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/51949d72-301c-4426-8397-273f6b2ecabd-var-log-ovn\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.847415 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3c164024-c78c-461b-90fc-afbac0b3a682-var-log\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.847436 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/51949d72-301c-4426-8397-273f6b2ecabd-ovn-controller-tls-certs\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.847503 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/51949d72-301c-4426-8397-273f6b2ecabd-var-run-ovn\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.847523 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gvhh\" (UniqueName: \"kubernetes.io/projected/3c164024-c78c-461b-90fc-afbac0b3a682-kube-api-access-5gvhh\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.847540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwxkk\" (UniqueName: \"kubernetes.io/projected/51949d72-301c-4426-8397-273f6b2ecabd-kube-api-access-dwxkk\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.847558 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3c164024-c78c-461b-90fc-afbac0b3a682-var-run\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.847575 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/51949d72-301c-4426-8397-273f6b2ecabd-var-run\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.847593 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51949d72-301c-4426-8397-273f6b2ecabd-combined-ca-bundle\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.848266 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/51949d72-301c-4426-8397-273f6b2ecabd-var-run-ovn\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.848397 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3c164024-c78c-461b-90fc-afbac0b3a682-var-log\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.848448 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/51949d72-301c-4426-8397-273f6b2ecabd-var-log-ovn\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.848490 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/51949d72-301c-4426-8397-273f6b2ecabd-var-run\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " 
pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.848499 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3c164024-c78c-461b-90fc-afbac0b3a682-var-run\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.849117 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c164024-c78c-461b-90fc-afbac0b3a682-scripts\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.850460 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51949d72-301c-4426-8397-273f6b2ecabd-combined-ca-bundle\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.866110 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/51949d72-301c-4426-8397-273f6b2ecabd-ovn-controller-tls-certs\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.867625 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwxkk\" (UniqueName: \"kubernetes.io/projected/51949d72-301c-4426-8397-273f6b2ecabd-kube-api-access-dwxkk\") pod \"ovn-controller-pr4b9\" (UID: \"51949d72-301c-4426-8397-273f6b2ecabd\") " pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.869918 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5gvhh\" (UniqueName: \"kubernetes.io/projected/3c164024-c78c-461b-90fc-afbac0b3a682-kube-api-access-5gvhh\") pod \"ovn-controller-ovs-4xbzr\" (UID: \"3c164024-c78c-461b-90fc-afbac0b3a682\") " pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.883601 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.885340 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.895936 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.896184 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.896397 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-zwlkl" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.896529 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.899454 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.925131 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.954708 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6192fb44-8c5c-4bee-a190-cb14bce3fa94-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " 
pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.954791 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6192fb44-8c5c-4bee-a190-cb14bce3fa94-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.954910 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6192fb44-8c5c-4bee-a190-cb14bce3fa94-config\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.955021 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6192fb44-8c5c-4bee-a190-cb14bce3fa94-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.955075 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.955113 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6192fb44-8c5c-4bee-a190-cb14bce3fa94-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.955139 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpm5s\" (UniqueName: \"kubernetes.io/projected/6192fb44-8c5c-4bee-a190-cb14bce3fa94-kube-api-access-lpm5s\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:45 crc kubenswrapper[4790]: I0406 12:16:45.955179 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6192fb44-8c5c-4bee-a190-cb14bce3fa94-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.036016 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pr4b9" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.056517 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6192fb44-8c5c-4bee-a190-cb14bce3fa94-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.056557 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6192fb44-8c5c-4bee-a190-cb14bce3fa94-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.056590 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6192fb44-8c5c-4bee-a190-cb14bce3fa94-config\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc 
kubenswrapper[4790]: I0406 12:16:46.056623 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6192fb44-8c5c-4bee-a190-cb14bce3fa94-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.056647 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.056665 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6192fb44-8c5c-4bee-a190-cb14bce3fa94-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.057110 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.057524 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6192fb44-8c5c-4bee-a190-cb14bce3fa94-config\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.057578 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpm5s\" (UniqueName: 
\"kubernetes.io/projected/6192fb44-8c5c-4bee-a190-cb14bce3fa94-kube-api-access-lpm5s\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.057606 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6192fb44-8c5c-4bee-a190-cb14bce3fa94-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.058017 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6192fb44-8c5c-4bee-a190-cb14bce3fa94-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.058385 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6192fb44-8c5c-4bee-a190-cb14bce3fa94-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.062978 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6192fb44-8c5c-4bee-a190-cb14bce3fa94-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.064076 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.065161 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6192fb44-8c5c-4bee-a190-cb14bce3fa94-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.087359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6192fb44-8c5c-4bee-a190-cb14bce3fa94-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.091597 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpm5s\" (UniqueName: \"kubernetes.io/projected/6192fb44-8c5c-4bee-a190-cb14bce3fa94-kube-api-access-lpm5s\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.109092 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6192fb44-8c5c-4bee-a190-cb14bce3fa94\") " pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:46 crc kubenswrapper[4790]: I0406 12:16:46.247651 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.059334 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.062676 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.068462 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.068948 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-c7fcv" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.069332 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.069953 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.072657 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.126586 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.126668 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d2vp\" (UniqueName: \"kubernetes.io/projected/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-kube-api-access-2d2vp\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.126728 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-combined-ca-bundle\") pod 
\"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.126930 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.126976 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.127010 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.127084 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-config\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.127134 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 
crc kubenswrapper[4790]: I0406 12:16:50.228179 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d2vp\" (UniqueName: \"kubernetes.io/projected/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-kube-api-access-2d2vp\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.228236 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.228320 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.228349 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.228374 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.228397 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-config\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.228425 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.228471 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.228994 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.229089 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.229894 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc 
kubenswrapper[4790]: I0406 12:16:50.229960 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-config\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.235891 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.236455 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.245046 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.254585 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d2vp\" (UniqueName: \"kubernetes.io/projected/b7fb6737-ce1d-42b7-96e4-f1ea27883d05-kube-api-access-2d2vp\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.275540 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b7fb6737-ce1d-42b7-96e4-f1ea27883d05\") " pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:50 crc kubenswrapper[4790]: I0406 12:16:50.393040 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Apr 06 12:16:51 crc kubenswrapper[4790]: E0406 12:16:51.034256 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.94:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Apr 06 12:16:51 crc kubenswrapper[4790]: E0406 12:16:51.034494 4790 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.94:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Apr 06 12:16:51 crc kubenswrapper[4790]: E0406 12:16:51.034623 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.94:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n5nb6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5b8c7d564c-mddlq_openstack(ae4513e6-7add-478c-8f4c-8baf2b773b53): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 06 12:16:51 crc kubenswrapper[4790]: E0406 12:16:51.037037 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-5b8c7d564c-mddlq" podUID="ae4513e6-7add-478c-8f4c-8baf2b773b53" Apr 06 12:16:51 crc kubenswrapper[4790]: E0406 12:16:51.046048 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.94:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Apr 06 12:16:51 crc kubenswrapper[4790]: E0406 12:16:51.046097 4790 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.94:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Apr 06 12:16:51 crc kubenswrapper[4790]: E0406 12:16:51.046204 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.94:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ksr2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-85b7c785df-v2tqz_openstack(5b945c7e-d215-4068-9892-44a4228a19bf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 06 12:16:51 crc kubenswrapper[4790]: E0406 12:16:51.049978 4790 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" podUID="5b945c7e-d215-4068-9892-44a4228a19bf" Apr 06 12:16:51 crc kubenswrapper[4790]: I0406 12:16:51.244946 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4","Type":"ContainerStarted","Data":"1d112cfa30d5287dbcbee898c09f232a595375c7273c8c5dcf26ee9e909046f1"} Apr 06 12:16:51 crc kubenswrapper[4790]: I0406 12:16:51.659458 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b849c865-mdxv2"] Apr 06 12:16:51 crc kubenswrapper[4790]: I0406 12:16:51.862283 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Apr 06 12:16:51 crc kubenswrapper[4790]: I0406 12:16:51.874195 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Apr 06 12:16:51 crc kubenswrapper[4790]: I0406 12:16:51.880945 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b9c96c4c-zksjk"] Apr 06 12:16:51 crc kubenswrapper[4790]: I0406 12:16:51.888556 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fccb695c7-r6qkc"] Apr 06 12:16:51 crc kubenswrapper[4790]: I0406 12:16:51.926269 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b8c7d564c-mddlq" Apr 06 12:16:51 crc kubenswrapper[4790]: I0406 12:16:51.930647 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.055372 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5nb6\" (UniqueName: \"kubernetes.io/projected/ae4513e6-7add-478c-8f4c-8baf2b773b53-kube-api-access-n5nb6\") pod \"ae4513e6-7add-478c-8f4c-8baf2b773b53\" (UID: \"ae4513e6-7add-478c-8f4c-8baf2b773b53\") " Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.056123 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b945c7e-d215-4068-9892-44a4228a19bf-dns-svc\") pod \"5b945c7e-d215-4068-9892-44a4228a19bf\" (UID: \"5b945c7e-d215-4068-9892-44a4228a19bf\") " Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.056201 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b945c7e-d215-4068-9892-44a4228a19bf-config\") pod \"5b945c7e-d215-4068-9892-44a4228a19bf\" (UID: \"5b945c7e-d215-4068-9892-44a4228a19bf\") " Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.056293 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksr2q\" (UniqueName: \"kubernetes.io/projected/5b945c7e-d215-4068-9892-44a4228a19bf-kube-api-access-ksr2q\") pod \"5b945c7e-d215-4068-9892-44a4228a19bf\" (UID: \"5b945c7e-d215-4068-9892-44a4228a19bf\") " Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.056371 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae4513e6-7add-478c-8f4c-8baf2b773b53-config\") pod \"ae4513e6-7add-478c-8f4c-8baf2b773b53\" (UID: \"ae4513e6-7add-478c-8f4c-8baf2b773b53\") " Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.056650 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5b945c7e-d215-4068-9892-44a4228a19bf-config" (OuterVolumeSpecName: "config") pod "5b945c7e-d215-4068-9892-44a4228a19bf" (UID: "5b945c7e-d215-4068-9892-44a4228a19bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.057132 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b945c7e-d215-4068-9892-44a4228a19bf-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.057263 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae4513e6-7add-478c-8f4c-8baf2b773b53-config" (OuterVolumeSpecName: "config") pod "ae4513e6-7add-478c-8f4c-8baf2b773b53" (UID: "ae4513e6-7add-478c-8f4c-8baf2b773b53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.057341 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b945c7e-d215-4068-9892-44a4228a19bf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b945c7e-d215-4068-9892-44a4228a19bf" (UID: "5b945c7e-d215-4068-9892-44a4228a19bf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.062846 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b945c7e-d215-4068-9892-44a4228a19bf-kube-api-access-ksr2q" (OuterVolumeSpecName: "kube-api-access-ksr2q") pod "5b945c7e-d215-4068-9892-44a4228a19bf" (UID: "5b945c7e-d215-4068-9892-44a4228a19bf"). InnerVolumeSpecName "kube-api-access-ksr2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.063395 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae4513e6-7add-478c-8f4c-8baf2b773b53-kube-api-access-n5nb6" (OuterVolumeSpecName: "kube-api-access-n5nb6") pod "ae4513e6-7add-478c-8f4c-8baf2b773b53" (UID: "ae4513e6-7add-478c-8f4c-8baf2b773b53"). InnerVolumeSpecName "kube-api-access-n5nb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.159201 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksr2q\" (UniqueName: \"kubernetes.io/projected/5b945c7e-d215-4068-9892-44a4228a19bf-kube-api-access-ksr2q\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.159641 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae4513e6-7add-478c-8f4c-8baf2b773b53-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.159665 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5nb6\" (UniqueName: \"kubernetes.io/projected/ae4513e6-7add-478c-8f4c-8baf2b773b53-kube-api-access-n5nb6\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.159678 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b945c7e-d215-4068-9892-44a4228a19bf-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.257719 4790 generic.go:334] "Generic (PLEG): container finished" podID="7474f479-358e-4ca8-8930-8ba9a763bb8c" containerID="375cae6b036c406fd74eb7520845bbd71ac9683918e5d5794b332bea7d838e44" exitCode=0 Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.257792 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" event={"ID":"7474f479-358e-4ca8-8930-8ba9a763bb8c","Type":"ContainerDied","Data":"375cae6b036c406fd74eb7520845bbd71ac9683918e5d5794b332bea7d838e44"} Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.257823 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" event={"ID":"7474f479-358e-4ca8-8930-8ba9a763bb8c","Type":"ContainerStarted","Data":"d5daee3191e50c8cd7305559ad9b9f13e58426379cca2246a3763f28b83e449c"} Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.268022 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8c7d564c-mddlq" event={"ID":"ae4513e6-7add-478c-8f4c-8baf2b773b53","Type":"ContainerDied","Data":"25365c0b22447614170d3f79b697b1041b418104c71348507e1e2336ab0f71ba"} Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.268530 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b8c7d564c-mddlq" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.297166 4790 generic.go:334] "Generic (PLEG): container finished" podID="d195497c-d743-4fef-a50e-b2b600bcc34f" containerID="6d01777a861172441197b1fcba1fa40215e4e8782458d0f95ce9be73c7b87850" exitCode=0 Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.297248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b849c865-mdxv2" event={"ID":"d195497c-d743-4fef-a50e-b2b600bcc34f","Type":"ContainerDied","Data":"6d01777a861172441197b1fcba1fa40215e4e8782458d0f95ce9be73c7b87850"} Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.297278 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b849c865-mdxv2" event={"ID":"d195497c-d743-4fef-a50e-b2b600bcc34f","Type":"ContainerStarted","Data":"45d19d0993496db587b7c4b68bb6b68c213de127450e4f3f5c43f5555597739f"} Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.312000 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" event={"ID":"5b945c7e-d215-4068-9892-44a4228a19bf","Type":"ContainerDied","Data":"b54188c1a042b9e38bd07b455c30af34b08fc369cd073d81251eecd9fc0cb5fb"} Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.312075 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85b7c785df-v2tqz" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.326441 4790 generic.go:334] "Generic (PLEG): container finished" podID="290daa37-5ba2-47fc-841b-d89aed9745d2" containerID="d48711771439ae9afd30448567fe5dda4ba844f7fe8e28979ea64c831b48f211" exitCode=0 Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.326514 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" event={"ID":"290daa37-5ba2-47fc-841b-d89aed9745d2","Type":"ContainerDied","Data":"d48711771439ae9afd30448567fe5dda4ba844f7fe8e28979ea64c831b48f211"} Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.326543 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" event={"ID":"290daa37-5ba2-47fc-841b-d89aed9745d2","Type":"ContainerStarted","Data":"ab94057a987b8ee442f7e97faec709d1fa2fd2d911503e7488583c3f539feb8e"} Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.333443 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.333487 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ace98862-e7bc-4eb8-93ae-b38dcbd98a55","Type":"ContainerStarted","Data":"c4636214afcdf1116375aff844fbcdea2accf722614e6c8588382d4a58b2ee96"} Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.340503 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"75bcb75a-939b-4fda-b4ed-a66707bb16d7","Type":"ContainerStarted","Data":"41b5be6ba4b634574a3925e82cf926675350310d9e5119ffcf7f54a9441850f5"} Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.341873 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.349488 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pr4b9"] Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.372055 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/notifications-rabbitmq-server-0"] Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.386459 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.501907 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85b7c785df-v2tqz"] Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.513571 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85b7c785df-v2tqz"] Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.520357 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4xbzr"] Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.534968 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b8c7d564c-mddlq"] Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.541622 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b8c7d564c-mddlq"] Apr 06 12:16:52 crc kubenswrapper[4790]: E0406 12:16:52.607118 4790 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Apr 06 12:16:52 crc kubenswrapper[4790]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/d195497c-d743-4fef-a50e-b2b600bcc34f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file 
or directory Apr 06 12:16:52 crc kubenswrapper[4790]: > podSandboxID="45d19d0993496db587b7c4b68bb6b68c213de127450e4f3f5c43f5555597739f" Apr 06 12:16:52 crc kubenswrapper[4790]: E0406 12:16:52.607616 4790 kuberuntime_manager.go:1274] "Unhandled Error" err=< Apr 06 12:16:52 crc kubenswrapper[4790]: container &Container{Name:dnsmasq-dns,Image:38.102.83.94:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-45thz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-67b849c865-mdxv2_openstack(d195497c-d743-4fef-a50e-b2b600bcc34f): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/d195497c-d743-4fef-a50e-b2b600bcc34f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Apr 06 12:16:52 crc kubenswrapper[4790]: > logger="UnhandledError" Apr 06 12:16:52 crc kubenswrapper[4790]: E0406 12:16:52.609495 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/d195497c-d743-4fef-a50e-b2b600bcc34f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-67b849c865-mdxv2" podUID="d195497c-d743-4fef-a50e-b2b600bcc34f" Apr 06 12:16:52 crc kubenswrapper[4790]: W0406 12:16:52.652671 4790 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c164024_c78c_461b_90fc_afbac0b3a682.slice/crio-579c1d4079f12c5d1f3704d27957c59621ef728d4c0c20022d2bc9a56a8878c5 WatchSource:0}: Error finding container 579c1d4079f12c5d1f3704d27957c59621ef728d4c0c20022d2bc9a56a8878c5: Status 404 returned error can't find the container with id 579c1d4079f12c5d1f3704d27957c59621ef728d4c0c20022d2bc9a56a8878c5 Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.772540 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.773238 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7474f479-358e-4ca8-8930-8ba9a763bb8c-config\") pod \"7474f479-358e-4ca8-8930-8ba9a763bb8c\" (UID: \"7474f479-358e-4ca8-8930-8ba9a763bb8c\") " Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.773419 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgg7l\" (UniqueName: \"kubernetes.io/projected/7474f479-358e-4ca8-8930-8ba9a763bb8c-kube-api-access-lgg7l\") pod \"7474f479-358e-4ca8-8930-8ba9a763bb8c\" (UID: \"7474f479-358e-4ca8-8930-8ba9a763bb8c\") " Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.773450 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7474f479-358e-4ca8-8930-8ba9a763bb8c-dns-svc\") pod \"7474f479-358e-4ca8-8930-8ba9a763bb8c\" (UID: \"7474f479-358e-4ca8-8930-8ba9a763bb8c\") " Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.778107 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7474f479-358e-4ca8-8930-8ba9a763bb8c-kube-api-access-lgg7l" (OuterVolumeSpecName: "kube-api-access-lgg7l") pod 
"7474f479-358e-4ca8-8930-8ba9a763bb8c" (UID: "7474f479-358e-4ca8-8930-8ba9a763bb8c"). InnerVolumeSpecName "kube-api-access-lgg7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.808674 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7474f479-358e-4ca8-8930-8ba9a763bb8c-config" (OuterVolumeSpecName: "config") pod "7474f479-358e-4ca8-8930-8ba9a763bb8c" (UID: "7474f479-358e-4ca8-8930-8ba9a763bb8c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.816379 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7474f479-358e-4ca8-8930-8ba9a763bb8c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7474f479-358e-4ca8-8930-8ba9a763bb8c" (UID: "7474f479-358e-4ca8-8930-8ba9a763bb8c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.876155 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7474f479-358e-4ca8-8930-8ba9a763bb8c-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.876184 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgg7l\" (UniqueName: \"kubernetes.io/projected/7474f479-358e-4ca8-8930-8ba9a763bb8c-kube-api-access-lgg7l\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:52 crc kubenswrapper[4790]: I0406 12:16:52.876194 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7474f479-358e-4ca8-8930-8ba9a763bb8c-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.353789 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4xbzr" 
event={"ID":"3c164024-c78c-461b-90fc-afbac0b3a682","Type":"ContainerStarted","Data":"579c1d4079f12c5d1f3704d27957c59621ef728d4c0c20022d2bc9a56a8878c5"} Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.354897 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8b8a855-8cd2-4a9a-b804-c78641506883","Type":"ContainerStarted","Data":"234ca7ad56e805696dfb4d0628670f15c2961765c587a343becbd710eb66ddd6"} Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.355018 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.358591 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" event={"ID":"290daa37-5ba2-47fc-841b-d89aed9745d2","Type":"ContainerStarted","Data":"97ce254a16fadb96f09f2e5f8a11b38851cf0a23193367b1682e4b4f9eaa3d07"} Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.358780 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.360712 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"69e97903-5aa8-4523-ae3c-3f10b031ad20","Type":"ContainerStarted","Data":"8514a307007c12fb20c7d6ae048d88ecdb35401a6f596c57f34ef251c732030b"} Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.370718 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e2ad9a9e-f3a7-4749-9838-be88d29a7494","Type":"ContainerStarted","Data":"809658d936309e1e08e768c1d749d29951bdece531257ecf895d70d745a4c0ef"} Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.379764 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pr4b9" 
event={"ID":"51949d72-301c-4426-8397-273f6b2ecabd","Type":"ContainerStarted","Data":"631ce1c66d7f97b10e2f1c4a66ccc11bf25e58a9e295455c3040a2690a7796f2"} Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.382723 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.382723 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b9c96c4c-zksjk" event={"ID":"7474f479-358e-4ca8-8930-8ba9a763bb8c","Type":"ContainerDied","Data":"d5daee3191e50c8cd7305559ad9b9f13e58426379cca2246a3763f28b83e449c"} Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.382873 4790 scope.go:117] "RemoveContainer" containerID="375cae6b036c406fd74eb7520845bbd71ac9683918e5d5794b332bea7d838e44" Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.384108 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" podStartSLOduration=17.384074967 podStartE2EDuration="17.384074967s" podCreationTimestamp="2026-04-06 12:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:16:53.379672783 +0000 UTC m=+1192.367415649" watchObservedRunningTime="2026-04-06 12:16:53.384074967 +0000 UTC m=+1192.371817833" Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.385667 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"586bc227-b2c5-4ead-88f4-fe18c5c28d41","Type":"ContainerStarted","Data":"2a84ddef9b052cb4b7994292a071bdc3be75818aa38512fbec3b86c5a423eb0e"} Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.452089 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b9c96c4c-zksjk"] Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.462364 4790 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-66b9c96c4c-zksjk"] Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.511192 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.693340 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b945c7e-d215-4068-9892-44a4228a19bf" path="/var/lib/kubelet/pods/5b945c7e-d215-4068-9892-44a4228a19bf/volumes" Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.694035 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7474f479-358e-4ca8-8930-8ba9a763bb8c" path="/var/lib/kubelet/pods/7474f479-358e-4ca8-8930-8ba9a763bb8c/volumes" Apr 06 12:16:53 crc kubenswrapper[4790]: I0406 12:16:53.694759 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae4513e6-7add-478c-8f4c-8baf2b773b53" path="/var/lib/kubelet/pods/ae4513e6-7add-478c-8f4c-8baf2b773b53/volumes" Apr 06 12:16:53 crc kubenswrapper[4790]: W0406 12:16:53.930324 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4080ebfd_9998_4f7d_803a_bf407d0adef0.slice/crio-cdf63db51a6bd98baad2baa2d27d5429f9e4d2ae6c3b4ec281f826c9d5cb462a WatchSource:0}: Error finding container cdf63db51a6bd98baad2baa2d27d5429f9e4d2ae6c3b4ec281f826c9d5cb462a: Status 404 returned error can't find the container with id cdf63db51a6bd98baad2baa2d27d5429f9e4d2ae6c3b4ec281f826c9d5cb462a Apr 06 12:16:54 crc kubenswrapper[4790]: I0406 12:16:54.394316 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4080ebfd-9998-4f7d-803a-bf407d0adef0","Type":"ContainerStarted","Data":"cdf63db51a6bd98baad2baa2d27d5429f9e4d2ae6c3b4ec281f826c9d5cb462a"} Apr 06 12:16:54 crc kubenswrapper[4790]: I0406 12:16:54.395550 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"6192fb44-8c5c-4bee-a190-cb14bce3fa94","Type":"ContainerStarted","Data":"101cf5040d0238a134869e67bba814067955420abcc3a9e582cfedc3c7fc0ae6"} Apr 06 12:16:54 crc kubenswrapper[4790]: I0406 12:16:54.427269 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Apr 06 12:16:56 crc kubenswrapper[4790]: I0406 12:16:56.414573 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b7fb6737-ce1d-42b7-96e4-f1ea27883d05","Type":"ContainerStarted","Data":"1cbcd8a8d81ed59c9105eceea27ac1059d2be83319a07f163e92d42bb317a3da"} Apr 06 12:17:00 crc kubenswrapper[4790]: I0406 12:17:00.457179 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b849c865-mdxv2" event={"ID":"d195497c-d743-4fef-a50e-b2b600bcc34f","Type":"ContainerStarted","Data":"51973b938085bbc120ea5ca808c60f2a000e0129ec70acf3d3b1e20554b5254c"} Apr 06 12:17:00 crc kubenswrapper[4790]: I0406 12:17:00.457934 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b849c865-mdxv2" Apr 06 12:17:00 crc kubenswrapper[4790]: I0406 12:17:00.460260 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ace98862-e7bc-4eb8-93ae-b38dcbd98a55","Type":"ContainerStarted","Data":"7982bc073679f388fc2b1d864f4600aa0ba6c5ddd0b2ca9d9e09e7dcdf0b75fd"} Apr 06 12:17:00 crc kubenswrapper[4790]: I0406 12:17:00.460599 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Apr 06 12:17:00 crc kubenswrapper[4790]: I0406 12:17:00.465669 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4","Type":"ContainerStarted","Data":"85475fe50b07c3341a36c8eac26b68d83314151c8b24e94c5f02557dcacce4c7"} Apr 06 12:17:00 crc kubenswrapper[4790]: I0406 12:17:00.478279 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-67b849c865-mdxv2" podStartSLOduration=25.421947856 podStartE2EDuration="25.478259237s" podCreationTimestamp="2026-04-06 12:16:35 +0000 UTC" firstStartedPulling="2026-04-06 12:16:51.660436321 +0000 UTC m=+1190.648179207" lastFinishedPulling="2026-04-06 12:16:51.716747722 +0000 UTC m=+1190.704490588" observedRunningTime="2026-04-06 12:17:00.474125661 +0000 UTC m=+1199.461868537" watchObservedRunningTime="2026-04-06 12:17:00.478259237 +0000 UTC m=+1199.466002103" Apr 06 12:17:00 crc kubenswrapper[4790]: I0406 12:17:00.499044 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.083064927 podStartE2EDuration="20.499022652s" podCreationTimestamp="2026-04-06 12:16:40 +0000 UTC" firstStartedPulling="2026-04-06 12:16:51.899331158 +0000 UTC m=+1190.887074024" lastFinishedPulling="2026-04-06 12:16:59.315288873 +0000 UTC m=+1198.303031749" observedRunningTime="2026-04-06 12:17:00.490687108 +0000 UTC m=+1199.478429974" watchObservedRunningTime="2026-04-06 12:17:00.499022652 +0000 UTC m=+1199.486765518" Apr 06 12:17:01 crc kubenswrapper[4790]: I0406 12:17:01.477708 4790 generic.go:334] "Generic (PLEG): container finished" podID="3c164024-c78c-461b-90fc-afbac0b3a682" containerID="7b21fc4615804cd43ea3d8dbce5db89a497158c34d3b1f6cc21bacd34725e87a" exitCode=0 Apr 06 12:17:01 crc kubenswrapper[4790]: I0406 12:17:01.477804 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4xbzr" event={"ID":"3c164024-c78c-461b-90fc-afbac0b3a682","Type":"ContainerDied","Data":"7b21fc4615804cd43ea3d8dbce5db89a497158c34d3b1f6cc21bacd34725e87a"} Apr 06 12:17:01 crc kubenswrapper[4790]: I0406 12:17:01.480419 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b7fb6737-ce1d-42b7-96e4-f1ea27883d05","Type":"ContainerStarted","Data":"b2a19269116bf8839952eb85a397a182445d00260a2567b46eedbd1d2c00b2e6"} Apr 06 12:17:01 crc 
kubenswrapper[4790]: I0406 12:17:01.483357 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pr4b9" event={"ID":"51949d72-301c-4426-8397-273f6b2ecabd","Type":"ContainerStarted","Data":"cdfd64ccb69558687f6f3e40aa3f4fe9863b72aef1b1f3e68d15ba54d0da4880"} Apr 06 12:17:01 crc kubenswrapper[4790]: I0406 12:17:01.483769 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-pr4b9" Apr 06 12:17:01 crc kubenswrapper[4790]: I0406 12:17:01.536498 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-pr4b9" podStartSLOduration=9.559806491 podStartE2EDuration="16.536473503s" podCreationTimestamp="2026-04-06 12:16:45 +0000 UTC" firstStartedPulling="2026-04-06 12:16:52.352092358 +0000 UTC m=+1191.339835224" lastFinishedPulling="2026-04-06 12:16:59.32875937 +0000 UTC m=+1198.316502236" observedRunningTime="2026-04-06 12:17:01.52939042 +0000 UTC m=+1200.517133326" watchObservedRunningTime="2026-04-06 12:17:01.536473503 +0000 UTC m=+1200.524216379" Apr 06 12:17:01 crc kubenswrapper[4790]: I0406 12:17:01.565093 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:17:01 crc kubenswrapper[4790]: I0406 12:17:01.657573 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b849c865-mdxv2"] Apr 06 12:17:02 crc kubenswrapper[4790]: I0406 12:17:02.493535 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8b8a855-8cd2-4a9a-b804-c78641506883","Type":"ContainerStarted","Data":"78b7293a7efb77c91c086543685b34cfc61985d3c200c40f264a9855e2d510f4"} Apr 06 12:17:02 crc kubenswrapper[4790]: I0406 12:17:02.495639 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"69e97903-5aa8-4523-ae3c-3f10b031ad20","Type":"ContainerStarted","Data":"b4c8d43b79adba42ec6dbcfb4527bf9652a90a30634ef7576d04b76547c7df25"} Apr 06 12:17:02 crc kubenswrapper[4790]: I0406 12:17:02.498899 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e2ad9a9e-f3a7-4749-9838-be88d29a7494","Type":"ContainerStarted","Data":"3d9de21481224150864c525cf0720a94ee01cfcf245f69c9e55e4f8ccc97e649"} Apr 06 12:17:02 crc kubenswrapper[4790]: I0406 12:17:02.499491 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Apr 06 12:17:02 crc kubenswrapper[4790]: I0406 12:17:02.500632 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6192fb44-8c5c-4bee-a190-cb14bce3fa94","Type":"ContainerStarted","Data":"dee7473314fd4bc8254ed4f8cf9a17e4ab7d71120362cd5ce9e8b7bf1b2fb65d"} Apr 06 12:17:02 crc kubenswrapper[4790]: I0406 12:17:02.502618 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"586bc227-b2c5-4ead-88f4-fe18c5c28d41","Type":"ContainerStarted","Data":"d3886c30796f5336ed1952d510714c84938def6712de8559f323ea3f1672c8ab"} Apr 06 12:17:02 crc kubenswrapper[4790]: I0406 12:17:02.502685 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b849c865-mdxv2" podUID="d195497c-d743-4fef-a50e-b2b600bcc34f" containerName="dnsmasq-dns" containerID="cri-o://51973b938085bbc120ea5ca808c60f2a000e0129ec70acf3d3b1e20554b5254c" gracePeriod=10 Apr 06 12:17:02 crc kubenswrapper[4790]: I0406 12:17:02.553750 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.619422723 podStartE2EDuration="20.553727302s" podCreationTimestamp="2026-04-06 12:16:42 +0000 UTC" firstStartedPulling="2026-04-06 12:16:52.423124419 +0000 UTC m=+1191.410867285" 
lastFinishedPulling="2026-04-06 12:17:01.357429008 +0000 UTC m=+1200.345171864" observedRunningTime="2026-04-06 12:17:02.545988933 +0000 UTC m=+1201.533731799" watchObservedRunningTime="2026-04-06 12:17:02.553727302 +0000 UTC m=+1201.541470168" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.159173 4790 scope.go:117] "RemoveContainer" containerID="fb1881b537ea1c340d1ef7a2c8bcfbf79af6e760ad216f4829c53d80dcf9f0ce" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.375674 4790 scope.go:117] "RemoveContainer" containerID="c4e20f5c8f56f0ff852d036487346ea149f46091693034720b9e6c7d67f4e90e" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.490434 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b849c865-mdxv2" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.502498 4790 scope.go:117] "RemoveContainer" containerID="36e2aee716adf26bb016189bd4e8acf3662664d07dc7de84253a260da5586ce5" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.551664 4790 scope.go:117] "RemoveContainer" containerID="5039e752e9e54f5677c91b0a5014118c1596eb393a678217f8db86f133c6ba8e" Apr 06 12:17:03 crc kubenswrapper[4790]: E0406 12:17:03.555680 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36e2aee716adf26bb016189bd4e8acf3662664d07dc7de84253a260da5586ce5\": container with ID starting with 36e2aee716adf26bb016189bd4e8acf3662664d07dc7de84253a260da5586ce5 not found: ID does not exist" containerID="36e2aee716adf26bb016189bd4e8acf3662664d07dc7de84253a260da5586ce5" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.560749 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4080ebfd-9998-4f7d-803a-bf407d0adef0","Type":"ContainerStarted","Data":"b76c9ae59236ea315c6cccadf1328f37ec5447b747f22a2a7693f56a1f29a416"} Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.563641 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4xbzr" event={"ID":"3c164024-c78c-461b-90fc-afbac0b3a682","Type":"ContainerStarted","Data":"b90862c7e2e385b2c22d4b3ec3967a7cdfb3997e6bebee0224ed593a50f812eb"} Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.566655 4790 generic.go:334] "Generic (PLEG): container finished" podID="d195497c-d743-4fef-a50e-b2b600bcc34f" containerID="51973b938085bbc120ea5ca808c60f2a000e0129ec70acf3d3b1e20554b5254c" exitCode=0 Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.566715 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b849c865-mdxv2" event={"ID":"d195497c-d743-4fef-a50e-b2b600bcc34f","Type":"ContainerDied","Data":"51973b938085bbc120ea5ca808c60f2a000e0129ec70acf3d3b1e20554b5254c"} Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.566725 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b849c865-mdxv2" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.566741 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b849c865-mdxv2" event={"ID":"d195497c-d743-4fef-a50e-b2b600bcc34f","Type":"ContainerDied","Data":"45d19d0993496db587b7c4b68bb6b68c213de127450e4f3f5c43f5555597739f"} Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.566760 4790 scope.go:117] "RemoveContainer" containerID="51973b938085bbc120ea5ca808c60f2a000e0129ec70acf3d3b1e20554b5254c" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.572176 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75bcb75a-939b-4fda-b4ed-a66707bb16d7","Type":"ContainerStarted","Data":"89e7c2013a112850fef605fc76ff072ff42360b8046fdd873c4a3e7e29e22b9d"} Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.602980 4790 scope.go:117] "RemoveContainer" containerID="6d01777a861172441197b1fcba1fa40215e4e8782458d0f95ce9be73c7b87850" Apr 06 12:17:03 crc 
kubenswrapper[4790]: I0406 12:17:03.630274 4790 scope.go:117] "RemoveContainer" containerID="51973b938085bbc120ea5ca808c60f2a000e0129ec70acf3d3b1e20554b5254c" Apr 06 12:17:03 crc kubenswrapper[4790]: E0406 12:17:03.630628 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51973b938085bbc120ea5ca808c60f2a000e0129ec70acf3d3b1e20554b5254c\": container with ID starting with 51973b938085bbc120ea5ca808c60f2a000e0129ec70acf3d3b1e20554b5254c not found: ID does not exist" containerID="51973b938085bbc120ea5ca808c60f2a000e0129ec70acf3d3b1e20554b5254c" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.630655 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51973b938085bbc120ea5ca808c60f2a000e0129ec70acf3d3b1e20554b5254c"} err="failed to get container status \"51973b938085bbc120ea5ca808c60f2a000e0129ec70acf3d3b1e20554b5254c\": rpc error: code = NotFound desc = could not find container \"51973b938085bbc120ea5ca808c60f2a000e0129ec70acf3d3b1e20554b5254c\": container with ID starting with 51973b938085bbc120ea5ca808c60f2a000e0129ec70acf3d3b1e20554b5254c not found: ID does not exist" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.630677 4790 scope.go:117] "RemoveContainer" containerID="6d01777a861172441197b1fcba1fa40215e4e8782458d0f95ce9be73c7b87850" Apr 06 12:17:03 crc kubenswrapper[4790]: E0406 12:17:03.631160 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d01777a861172441197b1fcba1fa40215e4e8782458d0f95ce9be73c7b87850\": container with ID starting with 6d01777a861172441197b1fcba1fa40215e4e8782458d0f95ce9be73c7b87850 not found: ID does not exist" containerID="6d01777a861172441197b1fcba1fa40215e4e8782458d0f95ce9be73c7b87850" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.631210 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6d01777a861172441197b1fcba1fa40215e4e8782458d0f95ce9be73c7b87850"} err="failed to get container status \"6d01777a861172441197b1fcba1fa40215e4e8782458d0f95ce9be73c7b87850\": rpc error: code = NotFound desc = could not find container \"6d01777a861172441197b1fcba1fa40215e4e8782458d0f95ce9be73c7b87850\": container with ID starting with 6d01777a861172441197b1fcba1fa40215e4e8782458d0f95ce9be73c7b87850 not found: ID does not exist" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.674571 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45thz\" (UniqueName: \"kubernetes.io/projected/d195497c-d743-4fef-a50e-b2b600bcc34f-kube-api-access-45thz\") pod \"d195497c-d743-4fef-a50e-b2b600bcc34f\" (UID: \"d195497c-d743-4fef-a50e-b2b600bcc34f\") " Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.674643 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d195497c-d743-4fef-a50e-b2b600bcc34f-dns-svc\") pod \"d195497c-d743-4fef-a50e-b2b600bcc34f\" (UID: \"d195497c-d743-4fef-a50e-b2b600bcc34f\") " Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.674730 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d195497c-d743-4fef-a50e-b2b600bcc34f-config\") pod \"d195497c-d743-4fef-a50e-b2b600bcc34f\" (UID: \"d195497c-d743-4fef-a50e-b2b600bcc34f\") " Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.686972 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d195497c-d743-4fef-a50e-b2b600bcc34f-kube-api-access-45thz" (OuterVolumeSpecName: "kube-api-access-45thz") pod "d195497c-d743-4fef-a50e-b2b600bcc34f" (UID: "d195497c-d743-4fef-a50e-b2b600bcc34f"). InnerVolumeSpecName "kube-api-access-45thz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.723070 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d195497c-d743-4fef-a50e-b2b600bcc34f-config" (OuterVolumeSpecName: "config") pod "d195497c-d743-4fef-a50e-b2b600bcc34f" (UID: "d195497c-d743-4fef-a50e-b2b600bcc34f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.723300 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d195497c-d743-4fef-a50e-b2b600bcc34f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d195497c-d743-4fef-a50e-b2b600bcc34f" (UID: "d195497c-d743-4fef-a50e-b2b600bcc34f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.777678 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45thz\" (UniqueName: \"kubernetes.io/projected/d195497c-d743-4fef-a50e-b2b600bcc34f-kube-api-access-45thz\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.777714 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d195497c-d743-4fef-a50e-b2b600bcc34f-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.777726 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d195497c-d743-4fef-a50e-b2b600bcc34f-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.907130 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b849c865-mdxv2"] Apr 06 12:17:03 crc kubenswrapper[4790]: I0406 12:17:03.910450 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-67b849c865-mdxv2"] Apr 06 12:17:04 crc kubenswrapper[4790]: I0406 12:17:04.583391 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b7fb6737-ce1d-42b7-96e4-f1ea27883d05","Type":"ContainerStarted","Data":"fd1cf1e7b9c4bce1e9919689051ae6cd926f38216092b28ad43bbcf401ff32f7"} Apr 06 12:17:04 crc kubenswrapper[4790]: I0406 12:17:04.587552 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6192fb44-8c5c-4bee-a190-cb14bce3fa94","Type":"ContainerStarted","Data":"c51db3b55c4d76e5723ff7c1b3f6a9bba39d505f3c3d6fbddedf2f07db97ecb9"} Apr 06 12:17:04 crc kubenswrapper[4790]: I0406 12:17:04.591086 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4xbzr" event={"ID":"3c164024-c78c-461b-90fc-afbac0b3a682","Type":"ContainerStarted","Data":"1d5fe16afc67e725c6f909135948fcb0c15277f1b36e2177b2157552cd528a03"} Apr 06 12:17:04 crc kubenswrapper[4790]: I0406 12:17:04.591297 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:17:04 crc kubenswrapper[4790]: I0406 12:17:04.591529 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:17:04 crc kubenswrapper[4790]: I0406 12:17:04.609149 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.414483805 podStartE2EDuration="15.609128149s" podCreationTimestamp="2026-04-06 12:16:49 +0000 UTC" firstStartedPulling="2026-04-06 12:16:55.573010026 +0000 UTC m=+1194.560752892" lastFinishedPulling="2026-04-06 12:17:03.76765437 +0000 UTC m=+1202.755397236" observedRunningTime="2026-04-06 12:17:04.607769354 +0000 UTC m=+1203.595512260" watchObservedRunningTime="2026-04-06 12:17:04.609128149 +0000 UTC m=+1203.596871025" Apr 06 12:17:04 crc kubenswrapper[4790]: I0406 12:17:04.634076 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-4xbzr" podStartSLOduration=12.961051408 podStartE2EDuration="19.634048492s" podCreationTimestamp="2026-04-06 12:16:45 +0000 UTC" firstStartedPulling="2026-04-06 12:16:52.655078948 +0000 UTC m=+1191.642821814" lastFinishedPulling="2026-04-06 12:16:59.328076032 +0000 UTC m=+1198.315818898" observedRunningTime="2026-04-06 12:17:04.62311859 +0000 UTC m=+1203.610861476" watchObservedRunningTime="2026-04-06 12:17:04.634048492 +0000 UTC m=+1203.621791398" Apr 06 12:17:04 crc kubenswrapper[4790]: I0406 12:17:04.656784 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.829177614 podStartE2EDuration="20.656754317s" podCreationTimestamp="2026-04-06 12:16:44 +0000 UTC" firstStartedPulling="2026-04-06 12:16:53.940370685 +0000 UTC m=+1192.928113551" lastFinishedPulling="2026-04-06 12:17:03.767947388 +0000 UTC m=+1202.755690254" observedRunningTime="2026-04-06 12:17:04.650251569 +0000 UTC m=+1203.637994435" watchObservedRunningTime="2026-04-06 12:17:04.656754317 +0000 UTC m=+1203.644497223" Apr 06 12:17:05 crc kubenswrapper[4790]: I0406 12:17:05.393542 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Apr 06 12:17:05 crc kubenswrapper[4790]: I0406 12:17:05.437644 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Apr 06 12:17:05 crc kubenswrapper[4790]: I0406 12:17:05.600004 4790 generic.go:334] "Generic (PLEG): container finished" podID="6d6dc6ce-5627-454a-af1c-7a20bed8bfc4" containerID="85475fe50b07c3341a36c8eac26b68d83314151c8b24e94c5f02557dcacce4c7" exitCode=0 Apr 06 12:17:05 crc kubenswrapper[4790]: I0406 12:17:05.600199 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4","Type":"ContainerDied","Data":"85475fe50b07c3341a36c8eac26b68d83314151c8b24e94c5f02557dcacce4c7"} Apr 06 12:17:05 crc kubenswrapper[4790]: I0406 12:17:05.600929 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Apr 06 12:17:05 crc kubenswrapper[4790]: I0406 12:17:05.691072 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d195497c-d743-4fef-a50e-b2b600bcc34f" path="/var/lib/kubelet/pods/d195497c-d743-4fef-a50e-b2b600bcc34f/volumes" Apr 06 12:17:05 crc kubenswrapper[4790]: I0406 12:17:05.692147 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Apr 06 12:17:06 crc kubenswrapper[4790]: I0406 12:17:06.248274 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Apr 06 12:17:06 crc kubenswrapper[4790]: I0406 12:17:06.611570 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6d6dc6ce-5627-454a-af1c-7a20bed8bfc4","Type":"ContainerStarted","Data":"4d9adde36346d4cdff463c67a5661cbe8ab0365c312521b065983177ed118258"} Apr 06 12:17:06 crc kubenswrapper[4790]: I0406 12:17:06.635180 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.528323543 podStartE2EDuration="29.63516201s" podCreationTimestamp="2026-04-06 12:16:37 +0000 UTC" firstStartedPulling="2026-04-06 12:16:50.222129368 +0000 UTC m=+1189.209872234" lastFinishedPulling="2026-04-06 12:16:59.328967815 +0000 UTC m=+1198.316710701" observedRunningTime="2026-04-06 12:17:06.632268516 +0000 UTC m=+1205.620011382" watchObservedRunningTime="2026-04-06 12:17:06.63516201 +0000 UTC m=+1205.622904876" Apr 06 12:17:07 crc kubenswrapper[4790]: I0406 12:17:07.247879 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Apr 06 
12:17:07 crc kubenswrapper[4790]: I0406 12:17:07.287030 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Apr 06 12:17:07 crc kubenswrapper[4790]: I0406 12:17:07.621174 4790 generic.go:334] "Generic (PLEG): container finished" podID="69e97903-5aa8-4523-ae3c-3f10b031ad20" containerID="b4c8d43b79adba42ec6dbcfb4527bf9652a90a30634ef7576d04b76547c7df25" exitCode=0 Apr 06 12:17:07 crc kubenswrapper[4790]: I0406 12:17:07.621266 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"69e97903-5aa8-4523-ae3c-3f10b031ad20","Type":"ContainerDied","Data":"b4c8d43b79adba42ec6dbcfb4527bf9652a90a30634ef7576d04b76547c7df25"} Apr 06 12:17:09 crc kubenswrapper[4790]: I0406 12:17:09.233570 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Apr 06 12:17:09 crc kubenswrapper[4790]: I0406 12:17:09.235334 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.100008 4790 generic.go:334] "Generic (PLEG): container finished" podID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerID="b76c9ae59236ea315c6cccadf1328f37ec5447b747f22a2a7693f56a1f29a416" exitCode=0 Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.100089 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4080ebfd-9998-4f7d-803a-bf407d0adef0","Type":"ContainerDied","Data":"b76c9ae59236ea315c6cccadf1328f37ec5447b747f22a2a7693f56a1f29a416"} Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.102213 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"69e97903-5aa8-4523-ae3c-3f10b031ad20","Type":"ContainerStarted","Data":"4e3a0cefe63b54a9430bd562f3128df1d9035c4242ffc995dafad52a5aacc391"} Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 
12:17:10.145493 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=25.002250823 podStartE2EDuration="32.145477738s" podCreationTimestamp="2026-04-06 12:16:38 +0000 UTC" firstStartedPulling="2026-04-06 12:16:52.372972166 +0000 UTC m=+1191.360715032" lastFinishedPulling="2026-04-06 12:16:59.516199081 +0000 UTC m=+1198.503941947" observedRunningTime="2026-04-06 12:17:10.142641465 +0000 UTC m=+1209.130384371" watchObservedRunningTime="2026-04-06 12:17:10.145477738 +0000 UTC m=+1209.133220604" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.388750 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.388806 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.463857 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.722950 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5678ff6c5-2qpww"] Apr 06 12:17:10 crc kubenswrapper[4790]: E0406 12:17:10.723671 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d195497c-d743-4fef-a50e-b2b600bcc34f" containerName="init" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.723732 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d195497c-d743-4fef-a50e-b2b600bcc34f" containerName="init" Apr 06 12:17:10 crc kubenswrapper[4790]: E0406 12:17:10.723803 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d195497c-d743-4fef-a50e-b2b600bcc34f" containerName="dnsmasq-dns" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.723873 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d195497c-d743-4fef-a50e-b2b600bcc34f" 
containerName="dnsmasq-dns" Apr 06 12:17:10 crc kubenswrapper[4790]: E0406 12:17:10.723940 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7474f479-358e-4ca8-8930-8ba9a763bb8c" containerName="init" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.723991 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7474f479-358e-4ca8-8930-8ba9a763bb8c" containerName="init" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.724196 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7474f479-358e-4ca8-8930-8ba9a763bb8c" containerName="init" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.724258 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d195497c-d743-4fef-a50e-b2b600bcc34f" containerName="dnsmasq-dns" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.725136 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.727382 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.734093 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5678ff6c5-2qpww"] Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.827170 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-lw6ch"] Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.828412 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.830517 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.839085 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lw6ch"] Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.884528 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-dns-svc\") pod \"dnsmasq-dns-5678ff6c5-2qpww\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.884644 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czslp\" (UniqueName: \"kubernetes.io/projected/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-kube-api-access-czslp\") pod \"dnsmasq-dns-5678ff6c5-2qpww\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.884700 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-ovsdbserver-sb\") pod \"dnsmasq-dns-5678ff6c5-2qpww\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.884876 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-config\") pod \"dnsmasq-dns-5678ff6c5-2qpww\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " 
pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.986588 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-dns-svc\") pod \"dnsmasq-dns-5678ff6c5-2qpww\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.986646 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbqbz\" (UniqueName: \"kubernetes.io/projected/77333973-0908-43d8-8105-0c3b3e5cdecb-kube-api-access-qbqbz\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.986672 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czslp\" (UniqueName: \"kubernetes.io/projected/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-kube-api-access-czslp\") pod \"dnsmasq-dns-5678ff6c5-2qpww\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.986703 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77333973-0908-43d8-8105-0c3b3e5cdecb-config\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.986725 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-ovsdbserver-sb\") pod \"dnsmasq-dns-5678ff6c5-2qpww\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " 
pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.986779 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/77333973-0908-43d8-8105-0c3b3e5cdecb-ovn-rundir\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.986818 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/77333973-0908-43d8-8105-0c3b3e5cdecb-ovs-rundir\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.986851 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-config\") pod \"dnsmasq-dns-5678ff6c5-2qpww\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.986875 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77333973-0908-43d8-8105-0c3b3e5cdecb-combined-ca-bundle\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.986898 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/77333973-0908-43d8-8105-0c3b3e5cdecb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lw6ch\" (UID: 
\"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.987625 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-ovsdbserver-sb\") pod \"dnsmasq-dns-5678ff6c5-2qpww\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.987821 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-config\") pod \"dnsmasq-dns-5678ff6c5-2qpww\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:10 crc kubenswrapper[4790]: I0406 12:17:10.988213 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-dns-svc\") pod \"dnsmasq-dns-5678ff6c5-2qpww\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.004458 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czslp\" (UniqueName: \"kubernetes.io/projected/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-kube-api-access-czslp\") pod \"dnsmasq-dns-5678ff6c5-2qpww\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.041414 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.088708 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/77333973-0908-43d8-8105-0c3b3e5cdecb-ovs-rundir\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.088757 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77333973-0908-43d8-8105-0c3b3e5cdecb-combined-ca-bundle\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.088783 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/77333973-0908-43d8-8105-0c3b3e5cdecb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.088855 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbqbz\" (UniqueName: \"kubernetes.io/projected/77333973-0908-43d8-8105-0c3b3e5cdecb-kube-api-access-qbqbz\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.088886 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77333973-0908-43d8-8105-0c3b3e5cdecb-config\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " 
pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.088923 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/77333973-0908-43d8-8105-0c3b3e5cdecb-ovn-rundir\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.089209 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/77333973-0908-43d8-8105-0c3b3e5cdecb-ovn-rundir\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.089225 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/77333973-0908-43d8-8105-0c3b3e5cdecb-ovs-rundir\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.089607 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77333973-0908-43d8-8105-0c3b3e5cdecb-config\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.096729 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/77333973-0908-43d8-8105-0c3b3e5cdecb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.096886 
4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77333973-0908-43d8-8105-0c3b3e5cdecb-combined-ca-bundle\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.107464 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbqbz\" (UniqueName: \"kubernetes.io/projected/77333973-0908-43d8-8105-0c3b3e5cdecb-kube-api-access-qbqbz\") pod \"ovn-controller-metrics-lw6ch\" (UID: \"77333973-0908-43d8-8105-0c3b3e5cdecb\") " pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.141853 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5678ff6c5-2qpww"] Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.183157 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-lw6ch" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.187109 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98b8f585c-j8r48"] Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.192452 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.194570 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98b8f585c-j8r48"] Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.196296 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.291547 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-dns-svc\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.291622 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-ovsdbserver-sb\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.291667 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-config\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.291693 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-ovsdbserver-nb\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 
12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.291791 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwg9j\" (UniqueName: \"kubernetes.io/projected/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-kube-api-access-hwg9j\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.394427 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-dns-svc\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.394498 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-ovsdbserver-sb\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.394535 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-config\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.394558 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-ovsdbserver-nb\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.396606 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwg9j\" (UniqueName: \"kubernetes.io/projected/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-kube-api-access-hwg9j\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.398088 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-dns-svc\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.398664 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-ovsdbserver-sb\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.400650 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-ovsdbserver-nb\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.401064 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-config\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.422731 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwg9j\" (UniqueName: 
\"kubernetes.io/projected/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-kube-api-access-hwg9j\") pod \"dnsmasq-dns-98b8f585c-j8r48\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.523138 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.579911 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5678ff6c5-2qpww"] Apr 06 12:17:11 crc kubenswrapper[4790]: E0406 12:17:11.583343 4790 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.146:37472->38.102.83.146:40713: write tcp 38.102.83.146:37472->38.102.83.146:40713: write: broken pipe Apr 06 12:17:11 crc kubenswrapper[4790]: W0406 12:17:11.586404 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod331c58ba_dbbe_4073_b5d4_dbe4e2bf64e3.slice/crio-7e5b89d899953f9518c2f7991d6f94c5069854f6eddc3480a2a7abe6ac30ed6d WatchSource:0}: Error finding container 7e5b89d899953f9518c2f7991d6f94c5069854f6eddc3480a2a7abe6ac30ed6d: Status 404 returned error can't find the container with id 7e5b89d899953f9518c2f7991d6f94c5069854f6eddc3480a2a7abe6ac30ed6d Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.744878 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lw6ch"] Apr 06 12:17:11 crc kubenswrapper[4790]: I0406 12:17:11.984102 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98b8f585c-j8r48"] Apr 06 12:17:11 crc kubenswrapper[4790]: W0406 12:17:11.986333 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4edbdb1c_e527_4f04_a2ec_34b3e3699c68.slice/crio-8f154e12283fcf88b04fdf4ad36fb89d25dc8f1b547b24ab10b32bff3e5bd1a5 WatchSource:0}: Error finding container 8f154e12283fcf88b04fdf4ad36fb89d25dc8f1b547b24ab10b32bff3e5bd1a5: Status 404 returned error can't find the container with id 8f154e12283fcf88b04fdf4ad36fb89d25dc8f1b547b24ab10b32bff3e5bd1a5 Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.124727 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98b8f585c-j8r48" event={"ID":"4edbdb1c-e527-4f04-a2ec-34b3e3699c68","Type":"ContainerStarted","Data":"8f154e12283fcf88b04fdf4ad36fb89d25dc8f1b547b24ab10b32bff3e5bd1a5"} Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.128243 4790 generic.go:334] "Generic (PLEG): container finished" podID="331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3" containerID="a25108d562f0aafdc774b4755b11caa0796d55579283d1d0b1c4a57f4b62d0e4" exitCode=0 Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.128323 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" event={"ID":"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3","Type":"ContainerDied","Data":"a25108d562f0aafdc774b4755b11caa0796d55579283d1d0b1c4a57f4b62d0e4"} Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.128350 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" event={"ID":"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3","Type":"ContainerStarted","Data":"7e5b89d899953f9518c2f7991d6f94c5069854f6eddc3480a2a7abe6ac30ed6d"} Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.131440 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lw6ch" event={"ID":"77333973-0908-43d8-8105-0c3b3e5cdecb","Type":"ContainerStarted","Data":"4004b7f3d2cb2ded5b40a718c5ab321a4ab3026107f6400e9858856b969af157"} Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.131463 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lw6ch" event={"ID":"77333973-0908-43d8-8105-0c3b3e5cdecb","Type":"ContainerStarted","Data":"69012a89d424b7e637792bd66ef2708f22337d6ce998c86cee0b0962f1361070"} Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.175024 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-lw6ch" podStartSLOduration=2.175005228 podStartE2EDuration="2.175005228s" podCreationTimestamp="2026-04-06 12:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:17:12.164204379 +0000 UTC m=+1211.151947245" watchObservedRunningTime="2026-04-06 12:17:12.175005228 +0000 UTC m=+1211.162748094" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.501663 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.615717 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-dns-svc\") pod \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.615900 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czslp\" (UniqueName: \"kubernetes.io/projected/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-kube-api-access-czslp\") pod \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.615974 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-ovsdbserver-sb\") pod 
\"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.616001 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-config\") pod \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\" (UID: \"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3\") " Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.621066 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-kube-api-access-czslp" (OuterVolumeSpecName: "kube-api-access-czslp") pod "331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3" (UID: "331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3"). InnerVolumeSpecName "kube-api-access-czslp". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.640608 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-config" (OuterVolumeSpecName: "config") pod "331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3" (UID: "331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.644255 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3" (UID: "331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.645602 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3" (UID: "331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.718445 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.718477 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czslp\" (UniqueName: \"kubernetes.io/projected/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-kube-api-access-czslp\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.718489 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.718499 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.890710 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98b8f585c-j8r48"] Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.908555 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.931898 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-756fdd77c5-2qxts"] Apr 06 12:17:12 crc kubenswrapper[4790]: E0406 12:17:12.932282 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3" containerName="init" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.932298 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3" containerName="init" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.932479 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3" containerName="init" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.933770 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:12 crc kubenswrapper[4790]: I0406 12:17:12.970270 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-756fdd77c5-2qxts"] Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.024810 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqw8f\" (UniqueName: \"kubernetes.io/projected/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-kube-api-access-vqw8f\") pod \"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.024897 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-dns-svc\") pod \"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.024947 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-ovsdbserver-sb\") pod \"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.024996 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-config\") pod \"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.025015 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-ovsdbserver-nb\") pod \"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.127229 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-config\") pod \"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.127289 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-ovsdbserver-nb\") pod \"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.127348 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqw8f\" (UniqueName: 
\"kubernetes.io/projected/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-kube-api-access-vqw8f\") pod \"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.127389 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-dns-svc\") pod \"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.127432 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-ovsdbserver-sb\") pod \"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.128399 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-ovsdbserver-sb\") pod \"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.128893 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-dns-svc\") pod \"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.129409 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-config\") pod 
\"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.130066 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-ovsdbserver-nb\") pod \"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.164906 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" event={"ID":"331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3","Type":"ContainerDied","Data":"7e5b89d899953f9518c2f7991d6f94c5069854f6eddc3480a2a7abe6ac30ed6d"} Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.164974 4790 scope.go:117] "RemoveContainer" containerID="a25108d562f0aafdc774b4755b11caa0796d55579283d1d0b1c4a57f4b62d0e4" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.165160 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5678ff6c5-2qpww" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.167856 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqw8f\" (UniqueName: \"kubernetes.io/projected/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-kube-api-access-vqw8f\") pod \"dnsmasq-dns-756fdd77c5-2qxts\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.182063 4790 generic.go:334] "Generic (PLEG): container finished" podID="4edbdb1c-e527-4f04-a2ec-34b3e3699c68" containerID="8942ea26ec40c2b285bc6cc20e4183c7021396210b7d807d9bf67043117765c3" exitCode=0 Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.182128 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98b8f585c-j8r48" event={"ID":"4edbdb1c-e527-4f04-a2ec-34b3e3699c68","Type":"ContainerDied","Data":"8942ea26ec40c2b285bc6cc20e4183c7021396210b7d807d9bf67043117765c3"} Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.242410 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5678ff6c5-2qpww"] Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.250091 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5678ff6c5-2qpww"] Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.257239 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.351636 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Apr 06 12:17:13 crc kubenswrapper[4790]: E0406 12:17:13.527009 4790 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Apr 06 12:17:13 crc kubenswrapper[4790]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/4edbdb1c-e527-4f04-a2ec-34b3e3699c68/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Apr 06 12:17:13 crc kubenswrapper[4790]: > podSandboxID="8f154e12283fcf88b04fdf4ad36fb89d25dc8f1b547b24ab10b32bff3e5bd1a5" Apr 06 12:17:13 crc kubenswrapper[4790]: E0406 12:17:13.527542 4790 kuberuntime_manager.go:1274] "Unhandled Error" err=< Apr 06 12:17:13 crc kubenswrapper[4790]: container &Container{Name:dnsmasq-dns,Image:38.102.83.94:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d4h5c7h55bh7ch5f4hd7h687h5dch586hc4h59bh668h66fh67h64dh696hd9h7h654h645h88h5h697hdfh685hc8h64fh5c5h698hc7h687h5fbq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hwg9j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-98b8f585c-j8r48_openstack(4edbdb1c-e527-4f04-a2ec-34b3e3699c68): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/4edbdb1c-e527-4f04-a2ec-34b3e3699c68/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Apr 06 12:17:13 crc kubenswrapper[4790]: > logger="UnhandledError" Apr 06 12:17:13 crc kubenswrapper[4790]: E0406 12:17:13.529430 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/4edbdb1c-e527-4f04-a2ec-34b3e3699c68/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-98b8f585c-j8r48" podUID="4edbdb1c-e527-4f04-a2ec-34b3e3699c68" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.569630 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.685614 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3" path="/var/lib/kubelet/pods/331c58ba-dbbe-4073-b5d4-dbe4e2bf64e3/volumes" Apr 06 12:17:13 crc kubenswrapper[4790]: I0406 12:17:13.774671 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-756fdd77c5-2qxts"] Apr 06 12:17:13 crc kubenswrapper[4790]: W0406 12:17:13.790057 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c17f148_f3a8_4e44_b9a0_ea95774a05f1.slice/crio-6f7aaa2be1d4d445ce70daf6a5be39057c593681e7b519b9a662a2cb03e18104 WatchSource:0}: Error finding container 6f7aaa2be1d4d445ce70daf6a5be39057c593681e7b519b9a662a2cb03e18104: Status 404 returned error can't find the container with id 6f7aaa2be1d4d445ce70daf6a5be39057c593681e7b519b9a662a2cb03e18104 Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.063520 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.069608 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.071683 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.072048 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.072372 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.073626 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-t8wpn" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.089121 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.151207 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dmdc\" (UniqueName: \"kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-kube-api-access-8dmdc\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.151261 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a39c210-842a-4286-8770-a84bbfec54a0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.151287 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3a39c210-842a-4286-8770-a84bbfec54a0-lock\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " 
pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.151381 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.151498 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.151558 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3a39c210-842a-4286-8770-a84bbfec54a0-cache\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.192663 4790 generic.go:334] "Generic (PLEG): container finished" podID="5c17f148-f3a8-4e44-b9a0-ea95774a05f1" containerID="60391508f559b446e6a82980edd2d11946bf8c76e11164d97fb3336cf9cbb67a" exitCode=0 Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.192739 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" event={"ID":"5c17f148-f3a8-4e44-b9a0-ea95774a05f1","Type":"ContainerDied","Data":"60391508f559b446e6a82980edd2d11946bf8c76e11164d97fb3336cf9cbb67a"} Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.192765 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" 
event={"ID":"5c17f148-f3a8-4e44-b9a0-ea95774a05f1","Type":"ContainerStarted","Data":"6f7aaa2be1d4d445ce70daf6a5be39057c593681e7b519b9a662a2cb03e18104"} Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.252543 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a39c210-842a-4286-8770-a84bbfec54a0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.252838 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3a39c210-842a-4286-8770-a84bbfec54a0-lock\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.252953 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.253093 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.253244 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3a39c210-842a-4286-8770-a84bbfec54a0-cache\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.253344 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8dmdc\" (UniqueName: \"kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-kube-api-access-8dmdc\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: E0406 12:17:14.253751 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Apr 06 12:17:14 crc kubenswrapper[4790]: E0406 12:17:14.253851 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Apr 06 12:17:14 crc kubenswrapper[4790]: E0406 12:17:14.253991 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift podName:3a39c210-842a-4286-8770-a84bbfec54a0 nodeName:}" failed. No retries permitted until 2026-04-06 12:17:14.753974803 +0000 UTC m=+1213.741717669 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift") pod "swift-storage-0" (UID: "3a39c210-842a-4286-8770-a84bbfec54a0") : configmap "swift-ring-files" not found Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.254548 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3a39c210-842a-4286-8770-a84bbfec54a0-lock\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.255149 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.257678 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3a39c210-842a-4286-8770-a84bbfec54a0-cache\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.258378 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a39c210-842a-4286-8770-a84bbfec54a0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.274820 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dmdc\" (UniqueName: \"kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-kube-api-access-8dmdc\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " 
pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.303592 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.511667 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.660139 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-config\") pod \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.660214 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-dns-svc\") pod \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.660261 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwg9j\" (UniqueName: \"kubernetes.io/projected/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-kube-api-access-hwg9j\") pod \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.660286 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-ovsdbserver-nb\") pod \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 
12:17:14.660370 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-ovsdbserver-sb\") pod \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\" (UID: \"4edbdb1c-e527-4f04-a2ec-34b3e3699c68\") " Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.666975 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-kube-api-access-hwg9j" (OuterVolumeSpecName: "kube-api-access-hwg9j") pod "4edbdb1c-e527-4f04-a2ec-34b3e3699c68" (UID: "4edbdb1c-e527-4f04-a2ec-34b3e3699c68"). InnerVolumeSpecName "kube-api-access-hwg9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.704667 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4edbdb1c-e527-4f04-a2ec-34b3e3699c68" (UID: "4edbdb1c-e527-4f04-a2ec-34b3e3699c68"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.704789 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-config" (OuterVolumeSpecName: "config") pod "4edbdb1c-e527-4f04-a2ec-34b3e3699c68" (UID: "4edbdb1c-e527-4f04-a2ec-34b3e3699c68"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.713516 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4edbdb1c-e527-4f04-a2ec-34b3e3699c68" (UID: "4edbdb1c-e527-4f04-a2ec-34b3e3699c68"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.718488 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4edbdb1c-e527-4f04-a2ec-34b3e3699c68" (UID: "4edbdb1c-e527-4f04-a2ec-34b3e3699c68"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.762438 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.762718 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:14 crc kubenswrapper[4790]: E0406 12:17:14.762877 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Apr 06 12:17:14 crc kubenswrapper[4790]: E0406 12:17:14.762909 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.762883 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:14 crc kubenswrapper[4790]: E0406 12:17:14.762965 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift podName:3a39c210-842a-4286-8770-a84bbfec54a0 nodeName:}" 
failed. No retries permitted until 2026-04-06 12:17:15.762943791 +0000 UTC m=+1214.750686757 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift") pod "swift-storage-0" (UID: "3a39c210-842a-4286-8770-a84bbfec54a0") : configmap "swift-ring-files" not found Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.763096 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwg9j\" (UniqueName: \"kubernetes.io/projected/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-kube-api-access-hwg9j\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.763118 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:14 crc kubenswrapper[4790]: I0406 12:17:14.763129 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4edbdb1c-e527-4f04-a2ec-34b3e3699c68-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.205653 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" event={"ID":"5c17f148-f3a8-4e44-b9a0-ea95774a05f1","Type":"ContainerStarted","Data":"4dab267bc571528b186b4b0804889da2f9db40186527f7a43ba51afcd2709a0f"} Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.205794 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.207847 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98b8f585c-j8r48" event={"ID":"4edbdb1c-e527-4f04-a2ec-34b3e3699c68","Type":"ContainerDied","Data":"8f154e12283fcf88b04fdf4ad36fb89d25dc8f1b547b24ab10b32bff3e5bd1a5"} Apr 06 
12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.207875 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98b8f585c-j8r48" Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.207897 4790 scope.go:117] "RemoveContainer" containerID="8942ea26ec40c2b285bc6cc20e4183c7021396210b7d807d9bf67043117765c3" Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.223817 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" podStartSLOduration=3.22380123 podStartE2EDuration="3.22380123s" podCreationTimestamp="2026-04-06 12:17:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:17:15.222457785 +0000 UTC m=+1214.210200651" watchObservedRunningTime="2026-04-06 12:17:15.22380123 +0000 UTC m=+1214.211544096" Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.317164 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98b8f585c-j8r48"] Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.325884 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98b8f585c-j8r48"] Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.689128 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4edbdb1c-e527-4f04-a2ec-34b3e3699c68" path="/var/lib/kubelet/pods/4edbdb1c-e527-4f04-a2ec-34b3e3699c68/volumes" Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.778283 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:15 crc kubenswrapper[4790]: E0406 12:17:15.778396 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: 
configmap "swift-ring-files" not found Apr 06 12:17:15 crc kubenswrapper[4790]: E0406 12:17:15.778410 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Apr 06 12:17:15 crc kubenswrapper[4790]: E0406 12:17:15.778472 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift podName:3a39c210-842a-4286-8770-a84bbfec54a0 nodeName:}" failed. No retries permitted until 2026-04-06 12:17:17.778459506 +0000 UTC m=+1216.766202372 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift") pod "swift-storage-0" (UID: "3a39c210-842a-4286-8770-a84bbfec54a0") : configmap "swift-ring-files" not found Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.933487 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a74a-account-create-update-ldtlg"] Apr 06 12:17:15 crc kubenswrapper[4790]: E0406 12:17:15.934180 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edbdb1c-e527-4f04-a2ec-34b3e3699c68" containerName="init" Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.934200 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edbdb1c-e527-4f04-a2ec-34b3e3699c68" containerName="init" Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.934415 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4edbdb1c-e527-4f04-a2ec-34b3e3699c68" containerName="init" Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.935615 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a74a-account-create-update-ldtlg" Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.937955 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.953011 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a74a-account-create-update-ldtlg"] Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.985103 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-4rwbs"] Apr 06 12:17:15 crc kubenswrapper[4790]: I0406 12:17:15.987534 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4rwbs" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.008457 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4rwbs"] Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.082424 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdx9w\" (UniqueName: \"kubernetes.io/projected/e76ef053-b579-43cb-a526-549fca65c4ba-kube-api-access-mdx9w\") pod \"glance-a74a-account-create-update-ldtlg\" (UID: \"e76ef053-b579-43cb-a526-549fca65c4ba\") " pod="openstack/glance-a74a-account-create-update-ldtlg" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.082858 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e76ef053-b579-43cb-a526-549fca65c4ba-operator-scripts\") pod \"glance-a74a-account-create-update-ldtlg\" (UID: \"e76ef053-b579-43cb-a526-549fca65c4ba\") " pod="openstack/glance-a74a-account-create-update-ldtlg" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.185033 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e76ef053-b579-43cb-a526-549fca65c4ba-operator-scripts\") pod \"glance-a74a-account-create-update-ldtlg\" (UID: \"e76ef053-b579-43cb-a526-549fca65c4ba\") " pod="openstack/glance-a74a-account-create-update-ldtlg" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.185127 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ptc8\" (UniqueName: \"kubernetes.io/projected/052ec8fa-ee10-4d35-8dee-a61dc66d2352-kube-api-access-9ptc8\") pod \"glance-db-create-4rwbs\" (UID: \"052ec8fa-ee10-4d35-8dee-a61dc66d2352\") " pod="openstack/glance-db-create-4rwbs" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.185169 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/052ec8fa-ee10-4d35-8dee-a61dc66d2352-operator-scripts\") pod \"glance-db-create-4rwbs\" (UID: \"052ec8fa-ee10-4d35-8dee-a61dc66d2352\") " pod="openstack/glance-db-create-4rwbs" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.185263 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdx9w\" (UniqueName: \"kubernetes.io/projected/e76ef053-b579-43cb-a526-549fca65c4ba-kube-api-access-mdx9w\") pod \"glance-a74a-account-create-update-ldtlg\" (UID: \"e76ef053-b579-43cb-a526-549fca65c4ba\") " pod="openstack/glance-a74a-account-create-update-ldtlg" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.186266 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e76ef053-b579-43cb-a526-549fca65c4ba-operator-scripts\") pod \"glance-a74a-account-create-update-ldtlg\" (UID: \"e76ef053-b579-43cb-a526-549fca65c4ba\") " pod="openstack/glance-a74a-account-create-update-ldtlg" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.212137 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mdx9w\" (UniqueName: \"kubernetes.io/projected/e76ef053-b579-43cb-a526-549fca65c4ba-kube-api-access-mdx9w\") pod \"glance-a74a-account-create-update-ldtlg\" (UID: \"e76ef053-b579-43cb-a526-549fca65c4ba\") " pod="openstack/glance-a74a-account-create-update-ldtlg" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.257344 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a74a-account-create-update-ldtlg" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.286105 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ptc8\" (UniqueName: \"kubernetes.io/projected/052ec8fa-ee10-4d35-8dee-a61dc66d2352-kube-api-access-9ptc8\") pod \"glance-db-create-4rwbs\" (UID: \"052ec8fa-ee10-4d35-8dee-a61dc66d2352\") " pod="openstack/glance-db-create-4rwbs" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.286162 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/052ec8fa-ee10-4d35-8dee-a61dc66d2352-operator-scripts\") pod \"glance-db-create-4rwbs\" (UID: \"052ec8fa-ee10-4d35-8dee-a61dc66d2352\") " pod="openstack/glance-db-create-4rwbs" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.287204 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/052ec8fa-ee10-4d35-8dee-a61dc66d2352-operator-scripts\") pod \"glance-db-create-4rwbs\" (UID: \"052ec8fa-ee10-4d35-8dee-a61dc66d2352\") " pod="openstack/glance-db-create-4rwbs" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.304384 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ptc8\" (UniqueName: \"kubernetes.io/projected/052ec8fa-ee10-4d35-8dee-a61dc66d2352-kube-api-access-9ptc8\") pod \"glance-db-create-4rwbs\" (UID: \"052ec8fa-ee10-4d35-8dee-a61dc66d2352\") " 
pod="openstack/glance-db-create-4rwbs" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.309999 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4rwbs" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.341441 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.476719 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.478366 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.482908 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.482971 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.483176 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-jkg7l" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.483241 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.488660 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.541405 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.590571 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.590745 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26wkk\" (UniqueName: \"kubernetes.io/projected/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-kube-api-access-26wkk\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.590880 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.590961 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.591011 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-config\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.591093 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-scripts\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc 
kubenswrapper[4790]: I0406 12:17:16.591117 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.682396 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.692679 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26wkk\" (UniqueName: \"kubernetes.io/projected/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-kube-api-access-26wkk\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.692794 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.692877 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.692923 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-config\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 
12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.692982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-scripts\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.693008 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.693045 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.693599 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.695060 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-scripts\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.695318 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-config\") pod \"ovn-northd-0\" (UID: 
\"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.698003 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.702130 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.712485 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.717839 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26wkk\" (UniqueName: \"kubernetes.io/projected/c66b653e-0e7d-44cb-82e1-1e2ee6a04b15-kube-api-access-26wkk\") pod \"ovn-northd-0\" (UID: \"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15\") " pod="openstack/ovn-northd-0" Apr 06 12:17:16 crc kubenswrapper[4790]: I0406 12:17:16.804491 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Apr 06 12:17:17 crc kubenswrapper[4790]: I0406 12:17:17.560462 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-m5pl7"] Apr 06 12:17:17 crc kubenswrapper[4790]: I0406 12:17:17.562056 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-m5pl7" Apr 06 12:17:17 crc kubenswrapper[4790]: I0406 12:17:17.567780 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-m5pl7"] Apr 06 12:17:17 crc kubenswrapper[4790]: I0406 12:17:17.601409 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Apr 06 12:17:17 crc kubenswrapper[4790]: I0406 12:17:17.710186 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c704f80e-3430-4e2d-af8b-4900da5743bd-operator-scripts\") pod \"root-account-create-update-m5pl7\" (UID: \"c704f80e-3430-4e2d-af8b-4900da5743bd\") " pod="openstack/root-account-create-update-m5pl7" Apr 06 12:17:17 crc kubenswrapper[4790]: I0406 12:17:17.710239 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xsh9\" (UniqueName: \"kubernetes.io/projected/c704f80e-3430-4e2d-af8b-4900da5743bd-kube-api-access-6xsh9\") pod \"root-account-create-update-m5pl7\" (UID: \"c704f80e-3430-4e2d-af8b-4900da5743bd\") " pod="openstack/root-account-create-update-m5pl7" Apr 06 12:17:17 crc kubenswrapper[4790]: I0406 12:17:17.812198 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c704f80e-3430-4e2d-af8b-4900da5743bd-operator-scripts\") pod \"root-account-create-update-m5pl7\" (UID: \"c704f80e-3430-4e2d-af8b-4900da5743bd\") " pod="openstack/root-account-create-update-m5pl7" Apr 06 12:17:17 crc kubenswrapper[4790]: I0406 12:17:17.812248 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xsh9\" (UniqueName: \"kubernetes.io/projected/c704f80e-3430-4e2d-af8b-4900da5743bd-kube-api-access-6xsh9\") pod \"root-account-create-update-m5pl7\" (UID: 
\"c704f80e-3430-4e2d-af8b-4900da5743bd\") " pod="openstack/root-account-create-update-m5pl7" Apr 06 12:17:17 crc kubenswrapper[4790]: I0406 12:17:17.812286 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:17 crc kubenswrapper[4790]: E0406 12:17:17.812516 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Apr 06 12:17:17 crc kubenswrapper[4790]: E0406 12:17:17.812548 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Apr 06 12:17:17 crc kubenswrapper[4790]: E0406 12:17:17.812610 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift podName:3a39c210-842a-4286-8770-a84bbfec54a0 nodeName:}" failed. No retries permitted until 2026-04-06 12:17:21.812592466 +0000 UTC m=+1220.800335332 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift") pod "swift-storage-0" (UID: "3a39c210-842a-4286-8770-a84bbfec54a0") : configmap "swift-ring-files" not found Apr 06 12:17:17 crc kubenswrapper[4790]: I0406 12:17:17.813570 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c704f80e-3430-4e2d-af8b-4900da5743bd-operator-scripts\") pod \"root-account-create-update-m5pl7\" (UID: \"c704f80e-3430-4e2d-af8b-4900da5743bd\") " pod="openstack/root-account-create-update-m5pl7" Apr 06 12:17:17 crc kubenswrapper[4790]: I0406 12:17:17.828665 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xsh9\" (UniqueName: \"kubernetes.io/projected/c704f80e-3430-4e2d-af8b-4900da5743bd-kube-api-access-6xsh9\") pod \"root-account-create-update-m5pl7\" (UID: \"c704f80e-3430-4e2d-af8b-4900da5743bd\") " pod="openstack/root-account-create-update-m5pl7" Apr 06 12:17:17 crc kubenswrapper[4790]: I0406 12:17:17.917218 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-m5pl7" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.000635 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-x6pwr"] Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.002148 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.004055 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.004270 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.004427 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.008281 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-x6pwr"] Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.119623 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-combined-ca-bundle\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.119901 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8288b902-f791-4dce-b1c0-2afa8796712b-scripts\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.119997 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8288b902-f791-4dce-b1c0-2afa8796712b-etc-swift\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.120164 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-dispersionconf\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.120224 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-swiftconf\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.120301 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8288b902-f791-4dce-b1c0-2afa8796712b-ring-data-devices\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.120397 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8vvc\" (UniqueName: \"kubernetes.io/projected/8288b902-f791-4dce-b1c0-2afa8796712b-kube-api-access-w8vvc\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.222876 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8288b902-f791-4dce-b1c0-2afa8796712b-etc-swift\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.222969 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-dispersionconf\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.223001 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-swiftconf\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.223043 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8288b902-f791-4dce-b1c0-2afa8796712b-ring-data-devices\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.223071 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8vvc\" (UniqueName: \"kubernetes.io/projected/8288b902-f791-4dce-b1c0-2afa8796712b-kube-api-access-w8vvc\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.223130 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-combined-ca-bundle\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.223165 4790 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8288b902-f791-4dce-b1c0-2afa8796712b-scripts\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.224019 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8288b902-f791-4dce-b1c0-2afa8796712b-scripts\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.224501 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8288b902-f791-4dce-b1c0-2afa8796712b-ring-data-devices\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.225473 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8288b902-f791-4dce-b1c0-2afa8796712b-etc-swift\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.228335 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-combined-ca-bundle\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.228533 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-swiftconf\") pod 
\"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.240917 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-dispersionconf\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.245816 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8vvc\" (UniqueName: \"kubernetes.io/projected/8288b902-f791-4dce-b1c0-2afa8796712b-kube-api-access-w8vvc\") pod \"swift-ring-rebalance-x6pwr\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:18 crc kubenswrapper[4790]: I0406 12:17:18.318658 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:19 crc kubenswrapper[4790]: I0406 12:17:19.605753 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-x6pwr"] Apr 06 12:17:19 crc kubenswrapper[4790]: I0406 12:17:19.689811 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-m5pl7"] Apr 06 12:17:19 crc kubenswrapper[4790]: W0406 12:17:19.697588 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod052ec8fa_ee10_4d35_8dee_a61dc66d2352.slice/crio-6be83f73734b02f7cae87cc97be46c58571ae1509250a7774093359f95016bfa WatchSource:0}: Error finding container 6be83f73734b02f7cae87cc97be46c58571ae1509250a7774093359f95016bfa: Status 404 returned error can't find the container with id 6be83f73734b02f7cae87cc97be46c58571ae1509250a7774093359f95016bfa Apr 06 12:17:19 crc kubenswrapper[4790]: I0406 12:17:19.733559 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4rwbs"] Apr 06 12:17:19 crc kubenswrapper[4790]: I0406 12:17:19.740053 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Apr 06 12:17:19 crc kubenswrapper[4790]: W0406 12:17:19.742587 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode76ef053_b579_43cb_a526_549fca65c4ba.slice/crio-a90d8baede8d3723fcdaec8b7a5428858dc916d6c2d6a220b26a3f3db1f46f13 WatchSource:0}: Error finding container a90d8baede8d3723fcdaec8b7a5428858dc916d6c2d6a220b26a3f3db1f46f13: Status 404 returned error can't find the container with id a90d8baede8d3723fcdaec8b7a5428858dc916d6c2d6a220b26a3f3db1f46f13 Apr 06 12:17:19 crc kubenswrapper[4790]: I0406 12:17:19.752205 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a74a-account-create-update-ldtlg"] Apr 06 12:17:20 crc 
kubenswrapper[4790]: I0406 12:17:20.278779 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4080ebfd-9998-4f7d-803a-bf407d0adef0","Type":"ContainerStarted","Data":"908c302d97542aaf5486ee3f5060ce8e5037b5ff8223ddd5fa333db147c36488"} Apr 06 12:17:20 crc kubenswrapper[4790]: I0406 12:17:20.280942 4790 generic.go:334] "Generic (PLEG): container finished" podID="c704f80e-3430-4e2d-af8b-4900da5743bd" containerID="3b31fe5a4c7ccb0ce7387369edd8c2a5c32c202903afc461cc1a675c556ddf89" exitCode=0 Apr 06 12:17:20 crc kubenswrapper[4790]: I0406 12:17:20.281013 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m5pl7" event={"ID":"c704f80e-3430-4e2d-af8b-4900da5743bd","Type":"ContainerDied","Data":"3b31fe5a4c7ccb0ce7387369edd8c2a5c32c202903afc461cc1a675c556ddf89"} Apr 06 12:17:20 crc kubenswrapper[4790]: I0406 12:17:20.281042 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m5pl7" event={"ID":"c704f80e-3430-4e2d-af8b-4900da5743bd","Type":"ContainerStarted","Data":"ccc7f8ea7261825a0554f869f866e15f85382478cdd0963c4576e2861ce7833f"} Apr 06 12:17:20 crc kubenswrapper[4790]: I0406 12:17:20.282887 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-x6pwr" event={"ID":"8288b902-f791-4dce-b1c0-2afa8796712b","Type":"ContainerStarted","Data":"6200b3d07be36a19d48691c7988c4ecb333973d8e9867a25bdf8ffae5cb25539"} Apr 06 12:17:20 crc kubenswrapper[4790]: I0406 12:17:20.284570 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15","Type":"ContainerStarted","Data":"d18b9e24bc7ab86e06b665c8edb6fd9f319feba5e6a194c46a0f0964e2442bb4"} Apr 06 12:17:20 crc kubenswrapper[4790]: I0406 12:17:20.288173 4790 generic.go:334] "Generic (PLEG): container finished" podID="e76ef053-b579-43cb-a526-549fca65c4ba" 
containerID="3e116411d90b29e20115218f9adcb2ca8c89d186dd31869fba2958425700d995" exitCode=0 Apr 06 12:17:20 crc kubenswrapper[4790]: I0406 12:17:20.288227 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a74a-account-create-update-ldtlg" event={"ID":"e76ef053-b579-43cb-a526-549fca65c4ba","Type":"ContainerDied","Data":"3e116411d90b29e20115218f9adcb2ca8c89d186dd31869fba2958425700d995"} Apr 06 12:17:20 crc kubenswrapper[4790]: I0406 12:17:20.288244 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a74a-account-create-update-ldtlg" event={"ID":"e76ef053-b579-43cb-a526-549fca65c4ba","Type":"ContainerStarted","Data":"a90d8baede8d3723fcdaec8b7a5428858dc916d6c2d6a220b26a3f3db1f46f13"} Apr 06 12:17:20 crc kubenswrapper[4790]: I0406 12:17:20.290521 4790 generic.go:334] "Generic (PLEG): container finished" podID="052ec8fa-ee10-4d35-8dee-a61dc66d2352" containerID="1b0ee3303c81cc98414b8eada8756466caa1eca161596257d2b7f4690bc602ee" exitCode=0 Apr 06 12:17:20 crc kubenswrapper[4790]: I0406 12:17:20.290743 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4rwbs" event={"ID":"052ec8fa-ee10-4d35-8dee-a61dc66d2352","Type":"ContainerDied","Data":"1b0ee3303c81cc98414b8eada8756466caa1eca161596257d2b7f4690bc602ee"} Apr 06 12:17:20 crc kubenswrapper[4790]: I0406 12:17:20.290874 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4rwbs" event={"ID":"052ec8fa-ee10-4d35-8dee-a61dc66d2352","Type":"ContainerStarted","Data":"6be83f73734b02f7cae87cc97be46c58571ae1509250a7774093359f95016bfa"} Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.673696 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-fkm99"] Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.675042 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-fkm99" Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.691258 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fkm99"] Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.784934 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-aea6-account-create-update-vpgm4"] Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.786040 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-aea6-account-create-update-vpgm4" Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.787987 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.793862 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clcwz\" (UniqueName: \"kubernetes.io/projected/e25bee7c-6b29-4d5d-856a-0174919c831c-kube-api-access-clcwz\") pod \"keystone-db-create-fkm99\" (UID: \"e25bee7c-6b29-4d5d-856a-0174919c831c\") " pod="openstack/keystone-db-create-fkm99" Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.793978 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e25bee7c-6b29-4d5d-856a-0174919c831c-operator-scripts\") pod \"keystone-db-create-fkm99\" (UID: \"e25bee7c-6b29-4d5d-856a-0174919c831c\") " pod="openstack/keystone-db-create-fkm99" Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.794035 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-aea6-account-create-update-vpgm4"] Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.895934 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e25bee7c-6b29-4d5d-856a-0174919c831c-operator-scripts\") pod \"keystone-db-create-fkm99\" (UID: \"e25bee7c-6b29-4d5d-856a-0174919c831c\") " pod="openstack/keystone-db-create-fkm99" Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.896047 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/328ef813-cbdd-4e37-901c-2c998a9d7edd-operator-scripts\") pod \"keystone-aea6-account-create-update-vpgm4\" (UID: \"328ef813-cbdd-4e37-901c-2c998a9d7edd\") " pod="openstack/keystone-aea6-account-create-update-vpgm4" Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.896098 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clcwz\" (UniqueName: \"kubernetes.io/projected/e25bee7c-6b29-4d5d-856a-0174919c831c-kube-api-access-clcwz\") pod \"keystone-db-create-fkm99\" (UID: \"e25bee7c-6b29-4d5d-856a-0174919c831c\") " pod="openstack/keystone-db-create-fkm99" Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.896180 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds9xv\" (UniqueName: \"kubernetes.io/projected/328ef813-cbdd-4e37-901c-2c998a9d7edd-kube-api-access-ds9xv\") pod \"keystone-aea6-account-create-update-vpgm4\" (UID: \"328ef813-cbdd-4e37-901c-2c998a9d7edd\") " pod="openstack/keystone-aea6-account-create-update-vpgm4" Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.896220 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:21 crc kubenswrapper[4790]: E0406 12:17:21.896416 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap 
"swift-ring-files" not found Apr 06 12:17:21 crc kubenswrapper[4790]: E0406 12:17:21.896439 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Apr 06 12:17:21 crc kubenswrapper[4790]: E0406 12:17:21.896491 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift podName:3a39c210-842a-4286-8770-a84bbfec54a0 nodeName:}" failed. No retries permitted until 2026-04-06 12:17:29.896474666 +0000 UTC m=+1228.884217532 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift") pod "swift-storage-0" (UID: "3a39c210-842a-4286-8770-a84bbfec54a0") : configmap "swift-ring-files" not found Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.897170 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e25bee7c-6b29-4d5d-856a-0174919c831c-operator-scripts\") pod \"keystone-db-create-fkm99\" (UID: \"e25bee7c-6b29-4d5d-856a-0174919c831c\") " pod="openstack/keystone-db-create-fkm99" Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.915696 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clcwz\" (UniqueName: \"kubernetes.io/projected/e25bee7c-6b29-4d5d-856a-0174919c831c-kube-api-access-clcwz\") pod \"keystone-db-create-fkm99\" (UID: \"e25bee7c-6b29-4d5d-856a-0174919c831c\") " pod="openstack/keystone-db-create-fkm99" Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.956312 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-l7zf9"] Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.957503 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-l7zf9" Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.979266 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-l7zf9"] Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.997973 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/328ef813-cbdd-4e37-901c-2c998a9d7edd-operator-scripts\") pod \"keystone-aea6-account-create-update-vpgm4\" (UID: \"328ef813-cbdd-4e37-901c-2c998a9d7edd\") " pod="openstack/keystone-aea6-account-create-update-vpgm4" Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.998098 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds9xv\" (UniqueName: \"kubernetes.io/projected/328ef813-cbdd-4e37-901c-2c998a9d7edd-kube-api-access-ds9xv\") pod \"keystone-aea6-account-create-update-vpgm4\" (UID: \"328ef813-cbdd-4e37-901c-2c998a9d7edd\") " pod="openstack/keystone-aea6-account-create-update-vpgm4" Apr 06 12:17:21 crc kubenswrapper[4790]: I0406 12:17:21.998780 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/328ef813-cbdd-4e37-901c-2c998a9d7edd-operator-scripts\") pod \"keystone-aea6-account-create-update-vpgm4\" (UID: \"328ef813-cbdd-4e37-901c-2c998a9d7edd\") " pod="openstack/keystone-aea6-account-create-update-vpgm4" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.013408 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds9xv\" (UniqueName: \"kubernetes.io/projected/328ef813-cbdd-4e37-901c-2c998a9d7edd-kube-api-access-ds9xv\") pod \"keystone-aea6-account-create-update-vpgm4\" (UID: \"328ef813-cbdd-4e37-901c-2c998a9d7edd\") " pod="openstack/keystone-aea6-account-create-update-vpgm4" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.023327 4790 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fkm99" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.075123 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-ae49-account-create-update-nwkqx"] Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.076224 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ae49-account-create-update-nwkqx" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.080074 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.094379 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ae49-account-create-update-nwkqx"] Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.103901 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-aea6-account-create-update-vpgm4" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.105426 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3604131a-98c3-46f0-9add-fd3e919f18c1-operator-scripts\") pod \"placement-db-create-l7zf9\" (UID: \"3604131a-98c3-46f0-9add-fd3e919f18c1\") " pod="openstack/placement-db-create-l7zf9" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.105571 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vk7q\" (UniqueName: \"kubernetes.io/projected/3604131a-98c3-46f0-9add-fd3e919f18c1-kube-api-access-4vk7q\") pod \"placement-db-create-l7zf9\" (UID: \"3604131a-98c3-46f0-9add-fd3e919f18c1\") " pod="openstack/placement-db-create-l7zf9" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.207021 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3604131a-98c3-46f0-9add-fd3e919f18c1-operator-scripts\") pod \"placement-db-create-l7zf9\" (UID: \"3604131a-98c3-46f0-9add-fd3e919f18c1\") " pod="openstack/placement-db-create-l7zf9" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.207209 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vk7q\" (UniqueName: \"kubernetes.io/projected/3604131a-98c3-46f0-9add-fd3e919f18c1-kube-api-access-4vk7q\") pod \"placement-db-create-l7zf9\" (UID: \"3604131a-98c3-46f0-9add-fd3e919f18c1\") " pod="openstack/placement-db-create-l7zf9" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.207266 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxb65\" (UniqueName: \"kubernetes.io/projected/ef90943c-ee17-4ba6-8047-93614a407a86-kube-api-access-qxb65\") pod \"placement-ae49-account-create-update-nwkqx\" (UID: \"ef90943c-ee17-4ba6-8047-93614a407a86\") " pod="openstack/placement-ae49-account-create-update-nwkqx" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.207312 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef90943c-ee17-4ba6-8047-93614a407a86-operator-scripts\") pod \"placement-ae49-account-create-update-nwkqx\" (UID: \"ef90943c-ee17-4ba6-8047-93614a407a86\") " pod="openstack/placement-ae49-account-create-update-nwkqx" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.207871 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3604131a-98c3-46f0-9add-fd3e919f18c1-operator-scripts\") pod \"placement-db-create-l7zf9\" (UID: \"3604131a-98c3-46f0-9add-fd3e919f18c1\") " pod="openstack/placement-db-create-l7zf9" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.225453 4790 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-4vk7q\" (UniqueName: \"kubernetes.io/projected/3604131a-98c3-46f0-9add-fd3e919f18c1-kube-api-access-4vk7q\") pod \"placement-db-create-l7zf9\" (UID: \"3604131a-98c3-46f0-9add-fd3e919f18c1\") " pod="openstack/placement-db-create-l7zf9" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.286347 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-l7zf9" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.310135 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-m5pl7" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.311331 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxb65\" (UniqueName: \"kubernetes.io/projected/ef90943c-ee17-4ba6-8047-93614a407a86-kube-api-access-qxb65\") pod \"placement-ae49-account-create-update-nwkqx\" (UID: \"ef90943c-ee17-4ba6-8047-93614a407a86\") " pod="openstack/placement-ae49-account-create-update-nwkqx" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.311386 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef90943c-ee17-4ba6-8047-93614a407a86-operator-scripts\") pod \"placement-ae49-account-create-update-nwkqx\" (UID: \"ef90943c-ee17-4ba6-8047-93614a407a86\") " pod="openstack/placement-ae49-account-create-update-nwkqx" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.312704 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef90943c-ee17-4ba6-8047-93614a407a86-operator-scripts\") pod \"placement-ae49-account-create-update-nwkqx\" (UID: \"ef90943c-ee17-4ba6-8047-93614a407a86\") " pod="openstack/placement-ae49-account-create-update-nwkqx" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.315415 4790 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4rwbs" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.317242 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a74a-account-create-update-ldtlg" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.343398 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxb65\" (UniqueName: \"kubernetes.io/projected/ef90943c-ee17-4ba6-8047-93614a407a86-kube-api-access-qxb65\") pod \"placement-ae49-account-create-update-nwkqx\" (UID: \"ef90943c-ee17-4ba6-8047-93614a407a86\") " pod="openstack/placement-ae49-account-create-update-nwkqx" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.345898 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a74a-account-create-update-ldtlg" event={"ID":"e76ef053-b579-43cb-a526-549fca65c4ba","Type":"ContainerDied","Data":"a90d8baede8d3723fcdaec8b7a5428858dc916d6c2d6a220b26a3f3db1f46f13"} Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.345937 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a90d8baede8d3723fcdaec8b7a5428858dc916d6c2d6a220b26a3f3db1f46f13" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.345997 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a74a-account-create-update-ldtlg" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.385427 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4rwbs" event={"ID":"052ec8fa-ee10-4d35-8dee-a61dc66d2352","Type":"ContainerDied","Data":"6be83f73734b02f7cae87cc97be46c58571ae1509250a7774093359f95016bfa"} Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.385471 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6be83f73734b02f7cae87cc97be46c58571ae1509250a7774093359f95016bfa" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.385549 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4rwbs" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.398077 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4080ebfd-9998-4f7d-803a-bf407d0adef0","Type":"ContainerStarted","Data":"faf8af00f6013bd18261e1a18cc7b32f0417e5d789683599d2481fe91e885ced"} Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.402932 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m5pl7" event={"ID":"c704f80e-3430-4e2d-af8b-4900da5743bd","Type":"ContainerDied","Data":"ccc7f8ea7261825a0554f869f866e15f85382478cdd0963c4576e2861ce7833f"} Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.402968 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccc7f8ea7261825a0554f869f866e15f85382478cdd0963c4576e2861ce7833f" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.403023 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-m5pl7" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.403473 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ae49-account-create-update-nwkqx" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.414334 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xsh9\" (UniqueName: \"kubernetes.io/projected/c704f80e-3430-4e2d-af8b-4900da5743bd-kube-api-access-6xsh9\") pod \"c704f80e-3430-4e2d-af8b-4900da5743bd\" (UID: \"c704f80e-3430-4e2d-af8b-4900da5743bd\") " Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.414475 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e76ef053-b579-43cb-a526-549fca65c4ba-operator-scripts\") pod \"e76ef053-b579-43cb-a526-549fca65c4ba\" (UID: \"e76ef053-b579-43cb-a526-549fca65c4ba\") " Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.414544 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdx9w\" (UniqueName: \"kubernetes.io/projected/e76ef053-b579-43cb-a526-549fca65c4ba-kube-api-access-mdx9w\") pod \"e76ef053-b579-43cb-a526-549fca65c4ba\" (UID: \"e76ef053-b579-43cb-a526-549fca65c4ba\") " Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.414593 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c704f80e-3430-4e2d-af8b-4900da5743bd-operator-scripts\") pod \"c704f80e-3430-4e2d-af8b-4900da5743bd\" (UID: \"c704f80e-3430-4e2d-af8b-4900da5743bd\") " Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.414641 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ptc8\" (UniqueName: \"kubernetes.io/projected/052ec8fa-ee10-4d35-8dee-a61dc66d2352-kube-api-access-9ptc8\") pod \"052ec8fa-ee10-4d35-8dee-a61dc66d2352\" (UID: \"052ec8fa-ee10-4d35-8dee-a61dc66d2352\") " Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.414709 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/052ec8fa-ee10-4d35-8dee-a61dc66d2352-operator-scripts\") pod \"052ec8fa-ee10-4d35-8dee-a61dc66d2352\" (UID: \"052ec8fa-ee10-4d35-8dee-a61dc66d2352\") " Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.415982 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/052ec8fa-ee10-4d35-8dee-a61dc66d2352-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "052ec8fa-ee10-4d35-8dee-a61dc66d2352" (UID: "052ec8fa-ee10-4d35-8dee-a61dc66d2352"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.416309 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c704f80e-3430-4e2d-af8b-4900da5743bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c704f80e-3430-4e2d-af8b-4900da5743bd" (UID: "c704f80e-3430-4e2d-af8b-4900da5743bd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.416635 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e76ef053-b579-43cb-a526-549fca65c4ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e76ef053-b579-43cb-a526-549fca65c4ba" (UID: "e76ef053-b579-43cb-a526-549fca65c4ba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.422244 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e76ef053-b579-43cb-a526-549fca65c4ba-kube-api-access-mdx9w" (OuterVolumeSpecName: "kube-api-access-mdx9w") pod "e76ef053-b579-43cb-a526-549fca65c4ba" (UID: "e76ef053-b579-43cb-a526-549fca65c4ba"). 
InnerVolumeSpecName "kube-api-access-mdx9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.422497 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/052ec8fa-ee10-4d35-8dee-a61dc66d2352-kube-api-access-9ptc8" (OuterVolumeSpecName: "kube-api-access-9ptc8") pod "052ec8fa-ee10-4d35-8dee-a61dc66d2352" (UID: "052ec8fa-ee10-4d35-8dee-a61dc66d2352"). InnerVolumeSpecName "kube-api-access-9ptc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.423619 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c704f80e-3430-4e2d-af8b-4900da5743bd-kube-api-access-6xsh9" (OuterVolumeSpecName: "kube-api-access-6xsh9") pod "c704f80e-3430-4e2d-af8b-4900da5743bd" (UID: "c704f80e-3430-4e2d-af8b-4900da5743bd"). InnerVolumeSpecName "kube-api-access-6xsh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.516692 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/052ec8fa-ee10-4d35-8dee-a61dc66d2352-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.516720 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xsh9\" (UniqueName: \"kubernetes.io/projected/c704f80e-3430-4e2d-af8b-4900da5743bd-kube-api-access-6xsh9\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.516733 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e76ef053-b579-43cb-a526-549fca65c4ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.516743 4790 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-mdx9w\" (UniqueName: \"kubernetes.io/projected/e76ef053-b579-43cb-a526-549fca65c4ba-kube-api-access-mdx9w\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.516753 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c704f80e-3430-4e2d-af8b-4900da5743bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.516762 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ptc8\" (UniqueName: \"kubernetes.io/projected/052ec8fa-ee10-4d35-8dee-a61dc66d2352-kube-api-access-9ptc8\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.685888 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fkm99"] Apr 06 12:17:22 crc kubenswrapper[4790]: W0406 12:17:22.694102 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode25bee7c_6b29_4d5d_856a_0174919c831c.slice/crio-87765e981dc9e5c3901a0ab980969badefb2afd17c978792d20b1fa1a53498cd WatchSource:0}: Error finding container 87765e981dc9e5c3901a0ab980969badefb2afd17c978792d20b1fa1a53498cd: Status 404 returned error can't find the container with id 87765e981dc9e5c3901a0ab980969badefb2afd17c978792d20b1fa1a53498cd Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.771212 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-aea6-account-create-update-vpgm4"] Apr 06 12:17:22 crc kubenswrapper[4790]: W0406 12:17:22.790291 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod328ef813_cbdd_4e37_901c_2c998a9d7edd.slice/crio-27e8306bb8e74bdfbb60e8f8eb6b187bc0150962f37e44b3feee1f0f2c6907c5 WatchSource:0}: Error finding container 
27e8306bb8e74bdfbb60e8f8eb6b187bc0150962f37e44b3feee1f0f2c6907c5: Status 404 returned error can't find the container with id 27e8306bb8e74bdfbb60e8f8eb6b187bc0150962f37e44b3feee1f0f2c6907c5 Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.872990 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-l7zf9"] Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.885192 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-zlwz2"] Apr 06 12:17:22 crc kubenswrapper[4790]: E0406 12:17:22.885688 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c704f80e-3430-4e2d-af8b-4900da5743bd" containerName="mariadb-account-create-update" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.885713 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c704f80e-3430-4e2d-af8b-4900da5743bd" containerName="mariadb-account-create-update" Apr 06 12:17:22 crc kubenswrapper[4790]: E0406 12:17:22.885743 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e76ef053-b579-43cb-a526-549fca65c4ba" containerName="mariadb-account-create-update" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.885752 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76ef053-b579-43cb-a526-549fca65c4ba" containerName="mariadb-account-create-update" Apr 06 12:17:22 crc kubenswrapper[4790]: E0406 12:17:22.885767 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="052ec8fa-ee10-4d35-8dee-a61dc66d2352" containerName="mariadb-database-create" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.885776 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="052ec8fa-ee10-4d35-8dee-a61dc66d2352" containerName="mariadb-database-create" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.886209 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c704f80e-3430-4e2d-af8b-4900da5743bd" containerName="mariadb-account-create-update" Apr 06 
12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.886245 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e76ef053-b579-43cb-a526-549fca65c4ba" containerName="mariadb-account-create-update" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.886259 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="052ec8fa-ee10-4d35-8dee-a61dc66d2352" containerName="mariadb-database-create" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.886993 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-zlwz2" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.896205 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-zlwz2"] Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.940021 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ae49-account-create-update-nwkqx"] Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.986409 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-42cd-account-create-update-9r9j2"] Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.987540 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-42cd-account-create-update-9r9j2" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.989708 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Apr 06 12:17:22 crc kubenswrapper[4790]: I0406 12:17:22.993470 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-42cd-account-create-update-9r9j2"] Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.027770 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99e3e70d-1789-45a0-84f8-1423e049abf1-operator-scripts\") pod \"watcher-db-create-zlwz2\" (UID: \"99e3e70d-1789-45a0-84f8-1423e049abf1\") " pod="openstack/watcher-db-create-zlwz2" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.027856 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6xc5\" (UniqueName: \"kubernetes.io/projected/99e3e70d-1789-45a0-84f8-1423e049abf1-kube-api-access-h6xc5\") pod \"watcher-db-create-zlwz2\" (UID: \"99e3e70d-1789-45a0-84f8-1423e049abf1\") " pod="openstack/watcher-db-create-zlwz2" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.129170 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99e3e70d-1789-45a0-84f8-1423e049abf1-operator-scripts\") pod \"watcher-db-create-zlwz2\" (UID: \"99e3e70d-1789-45a0-84f8-1423e049abf1\") " pod="openstack/watcher-db-create-zlwz2" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.129234 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6xc5\" (UniqueName: \"kubernetes.io/projected/99e3e70d-1789-45a0-84f8-1423e049abf1-kube-api-access-h6xc5\") pod \"watcher-db-create-zlwz2\" (UID: \"99e3e70d-1789-45a0-84f8-1423e049abf1\") " 
pod="openstack/watcher-db-create-zlwz2" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.129298 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b89r\" (UniqueName: \"kubernetes.io/projected/f4c11a7e-145d-4f36-bb18-5ba87172fb2a-kube-api-access-2b89r\") pod \"watcher-42cd-account-create-update-9r9j2\" (UID: \"f4c11a7e-145d-4f36-bb18-5ba87172fb2a\") " pod="openstack/watcher-42cd-account-create-update-9r9j2" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.129325 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4c11a7e-145d-4f36-bb18-5ba87172fb2a-operator-scripts\") pod \"watcher-42cd-account-create-update-9r9j2\" (UID: \"f4c11a7e-145d-4f36-bb18-5ba87172fb2a\") " pod="openstack/watcher-42cd-account-create-update-9r9j2" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.129871 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99e3e70d-1789-45a0-84f8-1423e049abf1-operator-scripts\") pod \"watcher-db-create-zlwz2\" (UID: \"99e3e70d-1789-45a0-84f8-1423e049abf1\") " pod="openstack/watcher-db-create-zlwz2" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.150596 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6xc5\" (UniqueName: \"kubernetes.io/projected/99e3e70d-1789-45a0-84f8-1423e049abf1-kube-api-access-h6xc5\") pod \"watcher-db-create-zlwz2\" (UID: \"99e3e70d-1789-45a0-84f8-1423e049abf1\") " pod="openstack/watcher-db-create-zlwz2" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.212639 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-zlwz2" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.231169 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b89r\" (UniqueName: \"kubernetes.io/projected/f4c11a7e-145d-4f36-bb18-5ba87172fb2a-kube-api-access-2b89r\") pod \"watcher-42cd-account-create-update-9r9j2\" (UID: \"f4c11a7e-145d-4f36-bb18-5ba87172fb2a\") " pod="openstack/watcher-42cd-account-create-update-9r9j2" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.231217 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4c11a7e-145d-4f36-bb18-5ba87172fb2a-operator-scripts\") pod \"watcher-42cd-account-create-update-9r9j2\" (UID: \"f4c11a7e-145d-4f36-bb18-5ba87172fb2a\") " pod="openstack/watcher-42cd-account-create-update-9r9j2" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.231911 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4c11a7e-145d-4f36-bb18-5ba87172fb2a-operator-scripts\") pod \"watcher-42cd-account-create-update-9r9j2\" (UID: \"f4c11a7e-145d-4f36-bb18-5ba87172fb2a\") " pod="openstack/watcher-42cd-account-create-update-9r9j2" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.249669 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b89r\" (UniqueName: \"kubernetes.io/projected/f4c11a7e-145d-4f36-bb18-5ba87172fb2a-kube-api-access-2b89r\") pod \"watcher-42cd-account-create-update-9r9j2\" (UID: \"f4c11a7e-145d-4f36-bb18-5ba87172fb2a\") " pod="openstack/watcher-42cd-account-create-update-9r9j2" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.262148 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.321329 4790 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fccb695c7-r6qkc"] Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.321964 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" podUID="290daa37-5ba2-47fc-841b-d89aed9745d2" containerName="dnsmasq-dns" containerID="cri-o://97ce254a16fadb96f09f2e5f8a11b38851cf0a23193367b1682e4b4f9eaa3d07" gracePeriod=10 Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.394600 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-42cd-account-create-update-9r9j2" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.435552 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ae49-account-create-update-nwkqx" event={"ID":"ef90943c-ee17-4ba6-8047-93614a407a86","Type":"ContainerStarted","Data":"0ac3e2f64247693905cc500679c7bb654011e539138df8eea905e2487a90383e"} Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.435627 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ae49-account-create-update-nwkqx" event={"ID":"ef90943c-ee17-4ba6-8047-93614a407a86","Type":"ContainerStarted","Data":"a7c5686dfdf298b5ae915f4e35e384386bd42ecd7bdd095e43abc0195ceb496d"} Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.444781 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-l7zf9" event={"ID":"3604131a-98c3-46f0-9add-fd3e919f18c1","Type":"ContainerStarted","Data":"a414b99455b1f8d7772f38f8647dd46f0eda75c50716a8e6288769a3b661ef8a"} Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.444823 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-l7zf9" event={"ID":"3604131a-98c3-46f0-9add-fd3e919f18c1","Type":"ContainerStarted","Data":"da7998786f46c682051439227b924d74cd88c95213b82d134c0ad0df5781eead"} Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.461667 
4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-ae49-account-create-update-nwkqx" podStartSLOduration=1.461458424 podStartE2EDuration="1.461458424s" podCreationTimestamp="2026-04-06 12:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:17:23.454069063 +0000 UTC m=+1222.441811939" watchObservedRunningTime="2026-04-06 12:17:23.461458424 +0000 UTC m=+1222.449201290" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.470965 4790 generic.go:334] "Generic (PLEG): container finished" podID="e25bee7c-6b29-4d5d-856a-0174919c831c" containerID="2e1e340d49ce6e2f30c4ef4429e90a42952595608a29426837d51f7ea7512c06" exitCode=0 Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.471063 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fkm99" event={"ID":"e25bee7c-6b29-4d5d-856a-0174919c831c","Type":"ContainerDied","Data":"2e1e340d49ce6e2f30c4ef4429e90a42952595608a29426837d51f7ea7512c06"} Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.471123 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fkm99" event={"ID":"e25bee7c-6b29-4d5d-856a-0174919c831c","Type":"ContainerStarted","Data":"87765e981dc9e5c3901a0ab980969badefb2afd17c978792d20b1fa1a53498cd"} Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.481708 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-l7zf9" podStartSLOduration=2.481681335 podStartE2EDuration="2.481681335s" podCreationTimestamp="2026-04-06 12:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:17:23.474168691 +0000 UTC m=+1222.461911547" watchObservedRunningTime="2026-04-06 12:17:23.481681335 +0000 UTC m=+1222.469424201" Apr 06 12:17:23 crc 
kubenswrapper[4790]: I0406 12:17:23.494606 4790 generic.go:334] "Generic (PLEG): container finished" podID="328ef813-cbdd-4e37-901c-2c998a9d7edd" containerID="e971f694ca3b2feff568fc417528c6ab6663185c98517301e6b3f35116226764" exitCode=0 Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.494708 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-aea6-account-create-update-vpgm4" event={"ID":"328ef813-cbdd-4e37-901c-2c998a9d7edd","Type":"ContainerDied","Data":"e971f694ca3b2feff568fc417528c6ab6663185c98517301e6b3f35116226764"} Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.494738 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-aea6-account-create-update-vpgm4" event={"ID":"328ef813-cbdd-4e37-901c-2c998a9d7edd","Type":"ContainerStarted","Data":"27e8306bb8e74bdfbb60e8f8eb6b187bc0150962f37e44b3feee1f0f2c6907c5"} Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.503606 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-x6pwr" event={"ID":"8288b902-f791-4dce-b1c0-2afa8796712b","Type":"ContainerStarted","Data":"8bb43e3b032cf92c6ea6a72dd9fc8b59adb2c5a29f2567fe153e85044ab65a6c"} Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.522488 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15","Type":"ContainerStarted","Data":"05fab0db448eb996adfa1cce56e26e9c08f6bea7db964ef129b781d28194dd61"} Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.522527 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c66b653e-0e7d-44cb-82e1-1e2ee6a04b15","Type":"ContainerStarted","Data":"17fbc09d105444caa52c7792514e1447acd94f9801cb3f1451135cff5944e03a"} Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.523188 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Apr 06 12:17:23 crc 
kubenswrapper[4790]: I0406 12:17:23.557830 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-x6pwr" podStartSLOduration=3.942811226 podStartE2EDuration="6.557810727s" podCreationTimestamp="2026-04-06 12:17:17 +0000 UTC" firstStartedPulling="2026-04-06 12:17:19.616183772 +0000 UTC m=+1218.603926638" lastFinishedPulling="2026-04-06 12:17:22.231183273 +0000 UTC m=+1221.218926139" observedRunningTime="2026-04-06 12:17:23.546040904 +0000 UTC m=+1222.533783770" watchObservedRunningTime="2026-04-06 12:17:23.557810727 +0000 UTC m=+1222.545553593" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.603063 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=5.100336457 podStartE2EDuration="7.603008732s" podCreationTimestamp="2026-04-06 12:17:16 +0000 UTC" firstStartedPulling="2026-04-06 12:17:19.727733578 +0000 UTC m=+1218.715476444" lastFinishedPulling="2026-04-06 12:17:22.230405853 +0000 UTC m=+1221.218148719" observedRunningTime="2026-04-06 12:17:23.574800775 +0000 UTC m=+1222.562543661" watchObservedRunningTime="2026-04-06 12:17:23.603008732 +0000 UTC m=+1222.590751598" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.779104 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-zlwz2"] Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.811919 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.950809 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-42cd-account-create-update-9r9j2"] Apr 06 12:17:23 crc kubenswrapper[4790]: W0406 12:17:23.958447 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4c11a7e_145d_4f36_bb18_5ba87172fb2a.slice/crio-9b5ceda2d4de7b99fdee60ebd5dac732302324f1823569e68587b9e6de1bc730 WatchSource:0}: Error finding container 9b5ceda2d4de7b99fdee60ebd5dac732302324f1823569e68587b9e6de1bc730: Status 404 returned error can't find the container with id 9b5ceda2d4de7b99fdee60ebd5dac732302324f1823569e68587b9e6de1bc730 Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.959936 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/290daa37-5ba2-47fc-841b-d89aed9745d2-dns-svc\") pod \"290daa37-5ba2-47fc-841b-d89aed9745d2\" (UID: \"290daa37-5ba2-47fc-841b-d89aed9745d2\") " Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.960077 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjfmh\" (UniqueName: \"kubernetes.io/projected/290daa37-5ba2-47fc-841b-d89aed9745d2-kube-api-access-tjfmh\") pod \"290daa37-5ba2-47fc-841b-d89aed9745d2\" (UID: \"290daa37-5ba2-47fc-841b-d89aed9745d2\") " Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.960163 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/290daa37-5ba2-47fc-841b-d89aed9745d2-config\") pod \"290daa37-5ba2-47fc-841b-d89aed9745d2\" (UID: \"290daa37-5ba2-47fc-841b-d89aed9745d2\") " Apr 06 12:17:23 crc kubenswrapper[4790]: I0406 12:17:23.967615 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/290daa37-5ba2-47fc-841b-d89aed9745d2-kube-api-access-tjfmh" (OuterVolumeSpecName: "kube-api-access-tjfmh") pod "290daa37-5ba2-47fc-841b-d89aed9745d2" (UID: "290daa37-5ba2-47fc-841b-d89aed9745d2"). InnerVolumeSpecName "kube-api-access-tjfmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.027917 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/290daa37-5ba2-47fc-841b-d89aed9745d2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "290daa37-5ba2-47fc-841b-d89aed9745d2" (UID: "290daa37-5ba2-47fc-841b-d89aed9745d2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.028529 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/290daa37-5ba2-47fc-841b-d89aed9745d2-config" (OuterVolumeSpecName: "config") pod "290daa37-5ba2-47fc-841b-d89aed9745d2" (UID: "290daa37-5ba2-47fc-841b-d89aed9745d2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.063873 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjfmh\" (UniqueName: \"kubernetes.io/projected/290daa37-5ba2-47fc-841b-d89aed9745d2-kube-api-access-tjfmh\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.063920 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/290daa37-5ba2-47fc-841b-d89aed9745d2-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.063932 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/290daa37-5ba2-47fc-841b-d89aed9745d2-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.162174 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-m5pl7"] Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.168388 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-m5pl7"] Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.542481 4790 generic.go:334] "Generic (PLEG): container finished" podID="ef90943c-ee17-4ba6-8047-93614a407a86" containerID="0ac3e2f64247693905cc500679c7bb654011e539138df8eea905e2487a90383e" exitCode=0 Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.542551 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ae49-account-create-update-nwkqx" event={"ID":"ef90943c-ee17-4ba6-8047-93614a407a86","Type":"ContainerDied","Data":"0ac3e2f64247693905cc500679c7bb654011e539138df8eea905e2487a90383e"} Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.547179 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4c11a7e-145d-4f36-bb18-5ba87172fb2a" 
containerID="978fbdeddd74a97d8aa9eea5a0461790857816b474e9043e247635dcefda85fc" exitCode=0 Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.547241 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-42cd-account-create-update-9r9j2" event={"ID":"f4c11a7e-145d-4f36-bb18-5ba87172fb2a","Type":"ContainerDied","Data":"978fbdeddd74a97d8aa9eea5a0461790857816b474e9043e247635dcefda85fc"} Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.547272 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-42cd-account-create-update-9r9j2" event={"ID":"f4c11a7e-145d-4f36-bb18-5ba87172fb2a","Type":"ContainerStarted","Data":"9b5ceda2d4de7b99fdee60ebd5dac732302324f1823569e68587b9e6de1bc730"} Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.553283 4790 generic.go:334] "Generic (PLEG): container finished" podID="290daa37-5ba2-47fc-841b-d89aed9745d2" containerID="97ce254a16fadb96f09f2e5f8a11b38851cf0a23193367b1682e4b4f9eaa3d07" exitCode=0 Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.553348 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" event={"ID":"290daa37-5ba2-47fc-841b-d89aed9745d2","Type":"ContainerDied","Data":"97ce254a16fadb96f09f2e5f8a11b38851cf0a23193367b1682e4b4f9eaa3d07"} Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.553377 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" event={"ID":"290daa37-5ba2-47fc-841b-d89aed9745d2","Type":"ContainerDied","Data":"ab94057a987b8ee442f7e97faec709d1fa2fd2d911503e7488583c3f539feb8e"} Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.553393 4790 scope.go:117] "RemoveContainer" containerID="97ce254a16fadb96f09f2e5f8a11b38851cf0a23193367b1682e4b4f9eaa3d07" Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.553504 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fccb695c7-r6qkc" Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.569098 4790 generic.go:334] "Generic (PLEG): container finished" podID="3604131a-98c3-46f0-9add-fd3e919f18c1" containerID="a414b99455b1f8d7772f38f8647dd46f0eda75c50716a8e6288769a3b661ef8a" exitCode=0 Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.569226 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-l7zf9" event={"ID":"3604131a-98c3-46f0-9add-fd3e919f18c1","Type":"ContainerDied","Data":"a414b99455b1f8d7772f38f8647dd46f0eda75c50716a8e6288769a3b661ef8a"} Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.572052 4790 generic.go:334] "Generic (PLEG): container finished" podID="99e3e70d-1789-45a0-84f8-1423e049abf1" containerID="bfb799b45231431ff9a81c2a29edf9f651b6745af01f5934392bb91a557ec943" exitCode=0 Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.572124 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-zlwz2" event={"ID":"99e3e70d-1789-45a0-84f8-1423e049abf1","Type":"ContainerDied","Data":"bfb799b45231431ff9a81c2a29edf9f651b6745af01f5934392bb91a557ec943"} Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.572157 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-zlwz2" event={"ID":"99e3e70d-1789-45a0-84f8-1423e049abf1","Type":"ContainerStarted","Data":"b0cd2c6fe5bc4915ea187cc4d9a8c14aa5aa3449c0b6dd136147b979421e1d95"} Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.592285 4790 scope.go:117] "RemoveContainer" containerID="d48711771439ae9afd30448567fe5dda4ba844f7fe8e28979ea64c831b48f211" Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.608004 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fccb695c7-r6qkc"] Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.620571 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-5fccb695c7-r6qkc"] Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.620644 4790 scope.go:117] "RemoveContainer" containerID="97ce254a16fadb96f09f2e5f8a11b38851cf0a23193367b1682e4b4f9eaa3d07" Apr 06 12:17:24 crc kubenswrapper[4790]: E0406 12:17:24.625545 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97ce254a16fadb96f09f2e5f8a11b38851cf0a23193367b1682e4b4f9eaa3d07\": container with ID starting with 97ce254a16fadb96f09f2e5f8a11b38851cf0a23193367b1682e4b4f9eaa3d07 not found: ID does not exist" containerID="97ce254a16fadb96f09f2e5f8a11b38851cf0a23193367b1682e4b4f9eaa3d07" Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.625591 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97ce254a16fadb96f09f2e5f8a11b38851cf0a23193367b1682e4b4f9eaa3d07"} err="failed to get container status \"97ce254a16fadb96f09f2e5f8a11b38851cf0a23193367b1682e4b4f9eaa3d07\": rpc error: code = NotFound desc = could not find container \"97ce254a16fadb96f09f2e5f8a11b38851cf0a23193367b1682e4b4f9eaa3d07\": container with ID starting with 97ce254a16fadb96f09f2e5f8a11b38851cf0a23193367b1682e4b4f9eaa3d07 not found: ID does not exist" Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.625614 4790 scope.go:117] "RemoveContainer" containerID="d48711771439ae9afd30448567fe5dda4ba844f7fe8e28979ea64c831b48f211" Apr 06 12:17:24 crc kubenswrapper[4790]: E0406 12:17:24.625929 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48711771439ae9afd30448567fe5dda4ba844f7fe8e28979ea64c831b48f211\": container with ID starting with d48711771439ae9afd30448567fe5dda4ba844f7fe8e28979ea64c831b48f211 not found: ID does not exist" containerID="d48711771439ae9afd30448567fe5dda4ba844f7fe8e28979ea64c831b48f211" Apr 06 12:17:24 crc kubenswrapper[4790]: I0406 12:17:24.625963 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d48711771439ae9afd30448567fe5dda4ba844f7fe8e28979ea64c831b48f211"} err="failed to get container status \"d48711771439ae9afd30448567fe5dda4ba844f7fe8e28979ea64c831b48f211\": rpc error: code = NotFound desc = could not find container \"d48711771439ae9afd30448567fe5dda4ba844f7fe8e28979ea64c831b48f211\": container with ID starting with d48711771439ae9afd30448567fe5dda4ba844f7fe8e28979ea64c831b48f211 not found: ID does not exist" Apr 06 12:17:25 crc kubenswrapper[4790]: I0406 12:17:25.699894 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="290daa37-5ba2-47fc-841b-d89aed9745d2" path="/var/lib/kubelet/pods/290daa37-5ba2-47fc-841b-d89aed9745d2/volumes" Apr 06 12:17:25 crc kubenswrapper[4790]: I0406 12:17:25.700672 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c704f80e-3430-4e2d-af8b-4900da5743bd" path="/var/lib/kubelet/pods/c704f80e-3430-4e2d-af8b-4900da5743bd/volumes" Apr 06 12:17:25 crc kubenswrapper[4790]: I0406 12:17:25.762549 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-aea6-account-create-update-vpgm4" Apr 06 12:17:25 crc kubenswrapper[4790]: I0406 12:17:25.763265 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-fkm99" Apr 06 12:17:25 crc kubenswrapper[4790]: I0406 12:17:25.899087 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/328ef813-cbdd-4e37-901c-2c998a9d7edd-operator-scripts\") pod \"328ef813-cbdd-4e37-901c-2c998a9d7edd\" (UID: \"328ef813-cbdd-4e37-901c-2c998a9d7edd\") " Apr 06 12:17:25 crc kubenswrapper[4790]: I0406 12:17:25.899533 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clcwz\" (UniqueName: \"kubernetes.io/projected/e25bee7c-6b29-4d5d-856a-0174919c831c-kube-api-access-clcwz\") pod \"e25bee7c-6b29-4d5d-856a-0174919c831c\" (UID: \"e25bee7c-6b29-4d5d-856a-0174919c831c\") " Apr 06 12:17:25 crc kubenswrapper[4790]: I0406 12:17:25.899565 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e25bee7c-6b29-4d5d-856a-0174919c831c-operator-scripts\") pod \"e25bee7c-6b29-4d5d-856a-0174919c831c\" (UID: \"e25bee7c-6b29-4d5d-856a-0174919c831c\") " Apr 06 12:17:25 crc kubenswrapper[4790]: I0406 12:17:25.899647 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds9xv\" (UniqueName: \"kubernetes.io/projected/328ef813-cbdd-4e37-901c-2c998a9d7edd-kube-api-access-ds9xv\") pod \"328ef813-cbdd-4e37-901c-2c998a9d7edd\" (UID: \"328ef813-cbdd-4e37-901c-2c998a9d7edd\") " Apr 06 12:17:25 crc kubenswrapper[4790]: I0406 12:17:25.900371 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/328ef813-cbdd-4e37-901c-2c998a9d7edd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "328ef813-cbdd-4e37-901c-2c998a9d7edd" (UID: "328ef813-cbdd-4e37-901c-2c998a9d7edd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:25 crc kubenswrapper[4790]: I0406 12:17:25.900614 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e25bee7c-6b29-4d5d-856a-0174919c831c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e25bee7c-6b29-4d5d-856a-0174919c831c" (UID: "e25bee7c-6b29-4d5d-856a-0174919c831c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:25 crc kubenswrapper[4790]: I0406 12:17:25.905333 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/328ef813-cbdd-4e37-901c-2c998a9d7edd-kube-api-access-ds9xv" (OuterVolumeSpecName: "kube-api-access-ds9xv") pod "328ef813-cbdd-4e37-901c-2c998a9d7edd" (UID: "328ef813-cbdd-4e37-901c-2c998a9d7edd"). InnerVolumeSpecName "kube-api-access-ds9xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:25 crc kubenswrapper[4790]: I0406 12:17:25.908764 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e25bee7c-6b29-4d5d-856a-0174919c831c-kube-api-access-clcwz" (OuterVolumeSpecName: "kube-api-access-clcwz") pod "e25bee7c-6b29-4d5d-856a-0174919c831c" (UID: "e25bee7c-6b29-4d5d-856a-0174919c831c"). InnerVolumeSpecName "kube-api-access-clcwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.009005 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clcwz\" (UniqueName: \"kubernetes.io/projected/e25bee7c-6b29-4d5d-856a-0174919c831c-kube-api-access-clcwz\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.009030 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e25bee7c-6b29-4d5d-856a-0174919c831c-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.009041 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds9xv\" (UniqueName: \"kubernetes.io/projected/328ef813-cbdd-4e37-901c-2c998a9d7edd-kube-api-access-ds9xv\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.009050 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/328ef813-cbdd-4e37-901c-2c998a9d7edd-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.173137 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-fk556"] Apr 06 12:17:26 crc kubenswrapper[4790]: E0406 12:17:26.174018 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290daa37-5ba2-47fc-841b-d89aed9745d2" containerName="dnsmasq-dns" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.174041 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="290daa37-5ba2-47fc-841b-d89aed9745d2" containerName="dnsmasq-dns" Apr 06 12:17:26 crc kubenswrapper[4790]: E0406 12:17:26.174058 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290daa37-5ba2-47fc-841b-d89aed9745d2" containerName="init" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.174065 4790 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="290daa37-5ba2-47fc-841b-d89aed9745d2" containerName="init" Apr 06 12:17:26 crc kubenswrapper[4790]: E0406 12:17:26.174079 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="328ef813-cbdd-4e37-901c-2c998a9d7edd" containerName="mariadb-account-create-update" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.174085 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="328ef813-cbdd-4e37-901c-2c998a9d7edd" containerName="mariadb-account-create-update" Apr 06 12:17:26 crc kubenswrapper[4790]: E0406 12:17:26.174103 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e25bee7c-6b29-4d5d-856a-0174919c831c" containerName="mariadb-database-create" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.174110 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e25bee7c-6b29-4d5d-856a-0174919c831c" containerName="mariadb-database-create" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.174276 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e25bee7c-6b29-4d5d-856a-0174919c831c" containerName="mariadb-database-create" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.174305 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="328ef813-cbdd-4e37-901c-2c998a9d7edd" containerName="mariadb-account-create-update" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.174327 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="290daa37-5ba2-47fc-841b-d89aed9745d2" containerName="dnsmasq-dns" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.174944 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fk556" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.178398 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4xzjn" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.181524 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-l7zf9" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.181806 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.187540 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-fk556"] Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.211037 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ae49-account-create-update-nwkqx" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.216410 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-zlwz2" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.227169 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-42cd-account-create-update-9r9j2" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.266585 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3604131a-98c3-46f0-9add-fd3e919f18c1-operator-scripts\") pod \"3604131a-98c3-46f0-9add-fd3e919f18c1\" (UID: \"3604131a-98c3-46f0-9add-fd3e919f18c1\") " Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.267154 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3604131a-98c3-46f0-9add-fd3e919f18c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3604131a-98c3-46f0-9add-fd3e919f18c1" (UID: "3604131a-98c3-46f0-9add-fd3e919f18c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.267829 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef90943c-ee17-4ba6-8047-93614a407a86-operator-scripts\") pod \"ef90943c-ee17-4ba6-8047-93614a407a86\" (UID: \"ef90943c-ee17-4ba6-8047-93614a407a86\") " Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.267974 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxb65\" (UniqueName: \"kubernetes.io/projected/ef90943c-ee17-4ba6-8047-93614a407a86-kube-api-access-qxb65\") pod \"ef90943c-ee17-4ba6-8047-93614a407a86\" (UID: \"ef90943c-ee17-4ba6-8047-93614a407a86\") " Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.268881 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef90943c-ee17-4ba6-8047-93614a407a86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef90943c-ee17-4ba6-8047-93614a407a86" (UID: "ef90943c-ee17-4ba6-8047-93614a407a86"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.269023 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vk7q\" (UniqueName: \"kubernetes.io/projected/3604131a-98c3-46f0-9add-fd3e919f18c1-kube-api-access-4vk7q\") pod \"3604131a-98c3-46f0-9add-fd3e919f18c1\" (UID: \"3604131a-98c3-46f0-9add-fd3e919f18c1\") " Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.269230 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b89r\" (UniqueName: \"kubernetes.io/projected/f4c11a7e-145d-4f36-bb18-5ba87172fb2a-kube-api-access-2b89r\") pod \"f4c11a7e-145d-4f36-bb18-5ba87172fb2a\" (UID: \"f4c11a7e-145d-4f36-bb18-5ba87172fb2a\") " Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.269332 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99e3e70d-1789-45a0-84f8-1423e049abf1-operator-scripts\") pod \"99e3e70d-1789-45a0-84f8-1423e049abf1\" (UID: \"99e3e70d-1789-45a0-84f8-1423e049abf1\") " Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.269494 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4c11a7e-145d-4f36-bb18-5ba87172fb2a-operator-scripts\") pod \"f4c11a7e-145d-4f36-bb18-5ba87172fb2a\" (UID: \"f4c11a7e-145d-4f36-bb18-5ba87172fb2a\") " Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.269800 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6xc5\" (UniqueName: \"kubernetes.io/projected/99e3e70d-1789-45a0-84f8-1423e049abf1-kube-api-access-h6xc5\") pod \"99e3e70d-1789-45a0-84f8-1423e049abf1\" (UID: \"99e3e70d-1789-45a0-84f8-1423e049abf1\") " Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.270195 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-config-data\") pod \"glance-db-sync-fk556\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " pod="openstack/glance-db-sync-fk556" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.270490 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-combined-ca-bundle\") pod \"glance-db-sync-fk556\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " pod="openstack/glance-db-sync-fk556" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.270655 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsx84\" (UniqueName: \"kubernetes.io/projected/060385bd-e2c7-44f6-b575-3d353e949d85-kube-api-access-qsx84\") pod \"glance-db-sync-fk556\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " pod="openstack/glance-db-sync-fk556" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.270763 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-db-sync-config-data\") pod \"glance-db-sync-fk556\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " pod="openstack/glance-db-sync-fk556" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.271008 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3604131a-98c3-46f0-9add-fd3e919f18c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.271106 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ef90943c-ee17-4ba6-8047-93614a407a86-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.271665 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e3e70d-1789-45a0-84f8-1423e049abf1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "99e3e70d-1789-45a0-84f8-1423e049abf1" (UID: "99e3e70d-1789-45a0-84f8-1423e049abf1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.273148 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4c11a7e-145d-4f36-bb18-5ba87172fb2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4c11a7e-145d-4f36-bb18-5ba87172fb2a" (UID: "f4c11a7e-145d-4f36-bb18-5ba87172fb2a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.274045 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4c11a7e-145d-4f36-bb18-5ba87172fb2a-kube-api-access-2b89r" (OuterVolumeSpecName: "kube-api-access-2b89r") pod "f4c11a7e-145d-4f36-bb18-5ba87172fb2a" (UID: "f4c11a7e-145d-4f36-bb18-5ba87172fb2a"). InnerVolumeSpecName "kube-api-access-2b89r". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.275499 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef90943c-ee17-4ba6-8047-93614a407a86-kube-api-access-qxb65" (OuterVolumeSpecName: "kube-api-access-qxb65") pod "ef90943c-ee17-4ba6-8047-93614a407a86" (UID: "ef90943c-ee17-4ba6-8047-93614a407a86"). InnerVolumeSpecName "kube-api-access-qxb65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.275798 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e3e70d-1789-45a0-84f8-1423e049abf1-kube-api-access-h6xc5" (OuterVolumeSpecName: "kube-api-access-h6xc5") pod "99e3e70d-1789-45a0-84f8-1423e049abf1" (UID: "99e3e70d-1789-45a0-84f8-1423e049abf1"). InnerVolumeSpecName "kube-api-access-h6xc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.276552 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3604131a-98c3-46f0-9add-fd3e919f18c1-kube-api-access-4vk7q" (OuterVolumeSpecName: "kube-api-access-4vk7q") pod "3604131a-98c3-46f0-9add-fd3e919f18c1" (UID: "3604131a-98c3-46f0-9add-fd3e919f18c1"). InnerVolumeSpecName "kube-api-access-4vk7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.372946 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsx84\" (UniqueName: \"kubernetes.io/projected/060385bd-e2c7-44f6-b575-3d353e949d85-kube-api-access-qsx84\") pod \"glance-db-sync-fk556\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " pod="openstack/glance-db-sync-fk556" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.373003 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-db-sync-config-data\") pod \"glance-db-sync-fk556\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " pod="openstack/glance-db-sync-fk556" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.373077 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-config-data\") 
pod \"glance-db-sync-fk556\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " pod="openstack/glance-db-sync-fk556" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.373135 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-combined-ca-bundle\") pod \"glance-db-sync-fk556\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " pod="openstack/glance-db-sync-fk556" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.373192 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vk7q\" (UniqueName: \"kubernetes.io/projected/3604131a-98c3-46f0-9add-fd3e919f18c1-kube-api-access-4vk7q\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.373202 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b89r\" (UniqueName: \"kubernetes.io/projected/f4c11a7e-145d-4f36-bb18-5ba87172fb2a-kube-api-access-2b89r\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.373211 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99e3e70d-1789-45a0-84f8-1423e049abf1-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.373220 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4c11a7e-145d-4f36-bb18-5ba87172fb2a-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.373229 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6xc5\" (UniqueName: \"kubernetes.io/projected/99e3e70d-1789-45a0-84f8-1423e049abf1-kube-api-access-h6xc5\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.373238 4790 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-qxb65\" (UniqueName: \"kubernetes.io/projected/ef90943c-ee17-4ba6-8047-93614a407a86-kube-api-access-qxb65\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.377053 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-combined-ca-bundle\") pod \"glance-db-sync-fk556\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " pod="openstack/glance-db-sync-fk556" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.377179 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-config-data\") pod \"glance-db-sync-fk556\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " pod="openstack/glance-db-sync-fk556" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.377357 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-db-sync-config-data\") pod \"glance-db-sync-fk556\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " pod="openstack/glance-db-sync-fk556" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.390024 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsx84\" (UniqueName: \"kubernetes.io/projected/060385bd-e2c7-44f6-b575-3d353e949d85-kube-api-access-qsx84\") pod \"glance-db-sync-fk556\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " pod="openstack/glance-db-sync-fk556" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.538066 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-fk556" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.592406 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-l7zf9" event={"ID":"3604131a-98c3-46f0-9add-fd3e919f18c1","Type":"ContainerDied","Data":"da7998786f46c682051439227b924d74cd88c95213b82d134c0ad0df5781eead"} Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.592452 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da7998786f46c682051439227b924d74cd88c95213b82d134c0ad0df5781eead" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.592475 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-l7zf9" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.596022 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fkm99" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.596342 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fkm99" event={"ID":"e25bee7c-6b29-4d5d-856a-0174919c831c","Type":"ContainerDied","Data":"87765e981dc9e5c3901a0ab980969badefb2afd17c978792d20b1fa1a53498cd"} Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.596382 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87765e981dc9e5c3901a0ab980969badefb2afd17c978792d20b1fa1a53498cd" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.599035 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-aea6-account-create-update-vpgm4" event={"ID":"328ef813-cbdd-4e37-901c-2c998a9d7edd","Type":"ContainerDied","Data":"27e8306bb8e74bdfbb60e8f8eb6b187bc0150962f37e44b3feee1f0f2c6907c5"} Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.599074 4790 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="27e8306bb8e74bdfbb60e8f8eb6b187bc0150962f37e44b3feee1f0f2c6907c5" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.599156 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-aea6-account-create-update-vpgm4" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.601624 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-zlwz2" event={"ID":"99e3e70d-1789-45a0-84f8-1423e049abf1","Type":"ContainerDied","Data":"b0cd2c6fe5bc4915ea187cc4d9a8c14aa5aa3449c0b6dd136147b979421e1d95"} Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.601673 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0cd2c6fe5bc4915ea187cc4d9a8c14aa5aa3449c0b6dd136147b979421e1d95" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.601747 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-zlwz2" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.607065 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4080ebfd-9998-4f7d-803a-bf407d0adef0","Type":"ContainerStarted","Data":"ec964921cdcb25527821621a3fea4ed90f35b3b18f0045efa3bff8a770378493"} Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.615150 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ae49-account-create-update-nwkqx" event={"ID":"ef90943c-ee17-4ba6-8047-93614a407a86","Type":"ContainerDied","Data":"a7c5686dfdf298b5ae915f4e35e384386bd42ecd7bdd095e43abc0195ceb496d"} Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.615182 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7c5686dfdf298b5ae915f4e35e384386bd42ecd7bdd095e43abc0195ceb496d" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.615227 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ae49-account-create-update-nwkqx" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.623115 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-42cd-account-create-update-9r9j2" event={"ID":"f4c11a7e-145d-4f36-bb18-5ba87172fb2a","Type":"ContainerDied","Data":"9b5ceda2d4de7b99fdee60ebd5dac732302324f1823569e68587b9e6de1bc730"} Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.623170 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b5ceda2d4de7b99fdee60ebd5dac732302324f1823569e68587b9e6de1bc730" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.623149 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-42cd-account-create-update-9r9j2" Apr 06 12:17:26 crc kubenswrapper[4790]: I0406 12:17:26.663934 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=9.844745016 podStartE2EDuration="41.663916648s" podCreationTimestamp="2026-04-06 12:16:45 +0000 UTC" firstStartedPulling="2026-04-06 12:16:53.932636966 +0000 UTC m=+1192.920379832" lastFinishedPulling="2026-04-06 12:17:25.751808598 +0000 UTC m=+1224.739551464" observedRunningTime="2026-04-06 12:17:26.650256176 +0000 UTC m=+1225.637999042" watchObservedRunningTime="2026-04-06 12:17:26.663916648 +0000 UTC m=+1225.651659514" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.111873 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-fk556"] Apr 06 12:17:27 crc kubenswrapper[4790]: W0406 12:17:27.115958 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod060385bd_e2c7_44f6_b575_3d353e949d85.slice/crio-f9f35f338dda61bd2b042e4891199af9785c023efba313f89635788726665285 WatchSource:0}: Error finding container 
f9f35f338dda61bd2b042e4891199af9785c023efba313f89635788726665285: Status 404 returned error can't find the container with id f9f35f338dda61bd2b042e4891199af9785c023efba313f89635788726665285 Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.571004 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-tpddh"] Apr 06 12:17:27 crc kubenswrapper[4790]: E0406 12:17:27.571368 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef90943c-ee17-4ba6-8047-93614a407a86" containerName="mariadb-account-create-update" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.571385 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef90943c-ee17-4ba6-8047-93614a407a86" containerName="mariadb-account-create-update" Apr 06 12:17:27 crc kubenswrapper[4790]: E0406 12:17:27.571400 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3604131a-98c3-46f0-9add-fd3e919f18c1" containerName="mariadb-database-create" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.571407 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3604131a-98c3-46f0-9add-fd3e919f18c1" containerName="mariadb-database-create" Apr 06 12:17:27 crc kubenswrapper[4790]: E0406 12:17:27.571416 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e3e70d-1789-45a0-84f8-1423e049abf1" containerName="mariadb-database-create" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.571424 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e3e70d-1789-45a0-84f8-1423e049abf1" containerName="mariadb-database-create" Apr 06 12:17:27 crc kubenswrapper[4790]: E0406 12:17:27.571453 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c11a7e-145d-4f36-bb18-5ba87172fb2a" containerName="mariadb-account-create-update" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.571460 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c11a7e-145d-4f36-bb18-5ba87172fb2a" 
containerName="mariadb-account-create-update" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.571630 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3604131a-98c3-46f0-9add-fd3e919f18c1" containerName="mariadb-database-create" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.571655 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef90943c-ee17-4ba6-8047-93614a407a86" containerName="mariadb-account-create-update" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.571665 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e3e70d-1789-45a0-84f8-1423e049abf1" containerName="mariadb-database-create" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.571675 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4c11a7e-145d-4f36-bb18-5ba87172fb2a" containerName="mariadb-account-create-update" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.572245 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tpddh" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.576715 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.579386 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tpddh"] Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.596468 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pjsg\" (UniqueName: \"kubernetes.io/projected/ca973115-c1c5-4474-8de1-4650ec26f8a5-kube-api-access-8pjsg\") pod \"root-account-create-update-tpddh\" (UID: \"ca973115-c1c5-4474-8de1-4650ec26f8a5\") " pod="openstack/root-account-create-update-tpddh" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.596513 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca973115-c1c5-4474-8de1-4650ec26f8a5-operator-scripts\") pod \"root-account-create-update-tpddh\" (UID: \"ca973115-c1c5-4474-8de1-4650ec26f8a5\") " pod="openstack/root-account-create-update-tpddh" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.631134 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fk556" event={"ID":"060385bd-e2c7-44f6-b575-3d353e949d85","Type":"ContainerStarted","Data":"f9f35f338dda61bd2b042e4891199af9785c023efba313f89635788726665285"} Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.698222 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pjsg\" (UniqueName: \"kubernetes.io/projected/ca973115-c1c5-4474-8de1-4650ec26f8a5-kube-api-access-8pjsg\") pod \"root-account-create-update-tpddh\" (UID: \"ca973115-c1c5-4474-8de1-4650ec26f8a5\") " pod="openstack/root-account-create-update-tpddh" Apr 06 
12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.698517 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca973115-c1c5-4474-8de1-4650ec26f8a5-operator-scripts\") pod \"root-account-create-update-tpddh\" (UID: \"ca973115-c1c5-4474-8de1-4650ec26f8a5\") " pod="openstack/root-account-create-update-tpddh" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.699203 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca973115-c1c5-4474-8de1-4650ec26f8a5-operator-scripts\") pod \"root-account-create-update-tpddh\" (UID: \"ca973115-c1c5-4474-8de1-4650ec26f8a5\") " pod="openstack/root-account-create-update-tpddh" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.720612 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pjsg\" (UniqueName: \"kubernetes.io/projected/ca973115-c1c5-4474-8de1-4650ec26f8a5-kube-api-access-8pjsg\") pod \"root-account-create-update-tpddh\" (UID: \"ca973115-c1c5-4474-8de1-4650ec26f8a5\") " pod="openstack/root-account-create-update-tpddh" Apr 06 12:17:27 crc kubenswrapper[4790]: I0406 12:17:27.890498 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tpddh" Apr 06 12:17:28 crc kubenswrapper[4790]: I0406 12:17:28.341873 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tpddh"] Apr 06 12:17:28 crc kubenswrapper[4790]: I0406 12:17:28.641543 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tpddh" event={"ID":"ca973115-c1c5-4474-8de1-4650ec26f8a5","Type":"ContainerStarted","Data":"5169965d83e273e3915dad78fa6fb59167540e1e30650a2047213c2ac2a15aff"} Apr 06 12:17:28 crc kubenswrapper[4790]: I0406 12:17:28.641589 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tpddh" event={"ID":"ca973115-c1c5-4474-8de1-4650ec26f8a5","Type":"ContainerStarted","Data":"bde45dce67c4444591a61f50242bc23b8d4e3b31bfd6666328c2d0d7e5c91ad5"} Apr 06 12:17:28 crc kubenswrapper[4790]: I0406 12:17:28.656244 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-tpddh" podStartSLOduration=1.65622312 podStartE2EDuration="1.65622312s" podCreationTimestamp="2026-04-06 12:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:17:28.655352447 +0000 UTC m=+1227.643095323" watchObservedRunningTime="2026-04-06 12:17:28.65622312 +0000 UTC m=+1227.643965986" Apr 06 12:17:29 crc kubenswrapper[4790]: I0406 12:17:29.651576 4790 generic.go:334] "Generic (PLEG): container finished" podID="ca973115-c1c5-4474-8de1-4650ec26f8a5" containerID="5169965d83e273e3915dad78fa6fb59167540e1e30650a2047213c2ac2a15aff" exitCode=0 Apr 06 12:17:29 crc kubenswrapper[4790]: I0406 12:17:29.651620 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tpddh" 
event={"ID":"ca973115-c1c5-4474-8de1-4650ec26f8a5","Type":"ContainerDied","Data":"5169965d83e273e3915dad78fa6fb59167540e1e30650a2047213c2ac2a15aff"} Apr 06 12:17:29 crc kubenswrapper[4790]: I0406 12:17:29.932206 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:29 crc kubenswrapper[4790]: E0406 12:17:29.932516 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Apr 06 12:17:29 crc kubenswrapper[4790]: E0406 12:17:29.932534 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Apr 06 12:17:29 crc kubenswrapper[4790]: E0406 12:17:29.932581 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift podName:3a39c210-842a-4286-8770-a84bbfec54a0 nodeName:}" failed. No retries permitted until 2026-04-06 12:17:45.932564206 +0000 UTC m=+1244.920307072 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift") pod "swift-storage-0" (UID: "3a39c210-842a-4286-8770-a84bbfec54a0") : configmap "swift-ring-files" not found Apr 06 12:17:30 crc kubenswrapper[4790]: I0406 12:17:30.595205 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:30 crc kubenswrapper[4790]: I0406 12:17:30.595545 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:30 crc kubenswrapper[4790]: I0406 12:17:30.597736 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:30 crc kubenswrapper[4790]: I0406 12:17:30.659338 4790 generic.go:334] "Generic (PLEG): container finished" podID="8288b902-f791-4dce-b1c0-2afa8796712b" containerID="8bb43e3b032cf92c6ea6a72dd9fc8b59adb2c5a29f2567fe153e85044ab65a6c" exitCode=0 Apr 06 12:17:30 crc kubenswrapper[4790]: I0406 12:17:30.660977 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-x6pwr" event={"ID":"8288b902-f791-4dce-b1c0-2afa8796712b","Type":"ContainerDied","Data":"8bb43e3b032cf92c6ea6a72dd9fc8b59adb2c5a29f2567fe153e85044ab65a6c"} Apr 06 12:17:30 crc kubenswrapper[4790]: I0406 12:17:30.662055 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:30 crc kubenswrapper[4790]: I0406 12:17:30.982596 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tpddh" Apr 06 12:17:31 crc kubenswrapper[4790]: I0406 12:17:31.082981 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-pr4b9" podUID="51949d72-301c-4426-8397-273f6b2ecabd" containerName="ovn-controller" probeResult="failure" output=< Apr 06 12:17:31 crc kubenswrapper[4790]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Apr 06 12:17:31 crc kubenswrapper[4790]: > Apr 06 12:17:31 crc kubenswrapper[4790]: I0406 12:17:31.150661 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca973115-c1c5-4474-8de1-4650ec26f8a5-operator-scripts\") pod \"ca973115-c1c5-4474-8de1-4650ec26f8a5\" (UID: \"ca973115-c1c5-4474-8de1-4650ec26f8a5\") " Apr 06 12:17:31 crc kubenswrapper[4790]: I0406 12:17:31.150798 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pjsg\" (UniqueName: \"kubernetes.io/projected/ca973115-c1c5-4474-8de1-4650ec26f8a5-kube-api-access-8pjsg\") pod \"ca973115-c1c5-4474-8de1-4650ec26f8a5\" (UID: \"ca973115-c1c5-4474-8de1-4650ec26f8a5\") " Apr 06 12:17:31 crc kubenswrapper[4790]: I0406 12:17:31.152312 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca973115-c1c5-4474-8de1-4650ec26f8a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca973115-c1c5-4474-8de1-4650ec26f8a5" (UID: "ca973115-c1c5-4474-8de1-4650ec26f8a5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:31 crc kubenswrapper[4790]: I0406 12:17:31.165871 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca973115-c1c5-4474-8de1-4650ec26f8a5-kube-api-access-8pjsg" (OuterVolumeSpecName: "kube-api-access-8pjsg") pod "ca973115-c1c5-4474-8de1-4650ec26f8a5" (UID: "ca973115-c1c5-4474-8de1-4650ec26f8a5"). InnerVolumeSpecName "kube-api-access-8pjsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:31 crc kubenswrapper[4790]: I0406 12:17:31.252448 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pjsg\" (UniqueName: \"kubernetes.io/projected/ca973115-c1c5-4474-8de1-4650ec26f8a5-kube-api-access-8pjsg\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:31 crc kubenswrapper[4790]: I0406 12:17:31.252487 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca973115-c1c5-4474-8de1-4650ec26f8a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:31 crc kubenswrapper[4790]: I0406 12:17:31.670010 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tpddh" event={"ID":"ca973115-c1c5-4474-8de1-4650ec26f8a5","Type":"ContainerDied","Data":"bde45dce67c4444591a61f50242bc23b8d4e3b31bfd6666328c2d0d7e5c91ad5"} Apr 06 12:17:31 crc kubenswrapper[4790]: I0406 12:17:31.670064 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bde45dce67c4444591a61f50242bc23b8d4e3b31bfd6666328c2d0d7e5c91ad5" Apr 06 12:17:31 crc kubenswrapper[4790]: I0406 12:17:31.670309 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tpddh" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.023364 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.172499 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-dispersionconf\") pod \"8288b902-f791-4dce-b1c0-2afa8796712b\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.172556 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-combined-ca-bundle\") pod \"8288b902-f791-4dce-b1c0-2afa8796712b\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.172635 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8288b902-f791-4dce-b1c0-2afa8796712b-ring-data-devices\") pod \"8288b902-f791-4dce-b1c0-2afa8796712b\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.172657 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-swiftconf\") pod \"8288b902-f791-4dce-b1c0-2afa8796712b\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.172721 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8288b902-f791-4dce-b1c0-2afa8796712b-scripts\") pod \"8288b902-f791-4dce-b1c0-2afa8796712b\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.172784 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8vvc\" 
(UniqueName: \"kubernetes.io/projected/8288b902-f791-4dce-b1c0-2afa8796712b-kube-api-access-w8vvc\") pod \"8288b902-f791-4dce-b1c0-2afa8796712b\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.172913 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8288b902-f791-4dce-b1c0-2afa8796712b-etc-swift\") pod \"8288b902-f791-4dce-b1c0-2afa8796712b\" (UID: \"8288b902-f791-4dce-b1c0-2afa8796712b\") " Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.173200 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8288b902-f791-4dce-b1c0-2afa8796712b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8288b902-f791-4dce-b1c0-2afa8796712b" (UID: "8288b902-f791-4dce-b1c0-2afa8796712b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.173503 4790 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8288b902-f791-4dce-b1c0-2afa8796712b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.173899 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8288b902-f791-4dce-b1c0-2afa8796712b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8288b902-f791-4dce-b1c0-2afa8796712b" (UID: "8288b902-f791-4dce-b1c0-2afa8796712b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.176816 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8288b902-f791-4dce-b1c0-2afa8796712b-kube-api-access-w8vvc" (OuterVolumeSpecName: "kube-api-access-w8vvc") pod "8288b902-f791-4dce-b1c0-2afa8796712b" (UID: "8288b902-f791-4dce-b1c0-2afa8796712b"). InnerVolumeSpecName "kube-api-access-w8vvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.188510 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8288b902-f791-4dce-b1c0-2afa8796712b" (UID: "8288b902-f791-4dce-b1c0-2afa8796712b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.194217 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8288b902-f791-4dce-b1c0-2afa8796712b-scripts" (OuterVolumeSpecName: "scripts") pod "8288b902-f791-4dce-b1c0-2afa8796712b" (UID: "8288b902-f791-4dce-b1c0-2afa8796712b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.197491 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8288b902-f791-4dce-b1c0-2afa8796712b" (UID: "8288b902-f791-4dce-b1c0-2afa8796712b"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.211391 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8288b902-f791-4dce-b1c0-2afa8796712b" (UID: "8288b902-f791-4dce-b1c0-2afa8796712b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.274657 4790 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-swiftconf\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.274680 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8288b902-f791-4dce-b1c0-2afa8796712b-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.274689 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8vvc\" (UniqueName: \"kubernetes.io/projected/8288b902-f791-4dce-b1c0-2afa8796712b-kube-api-access-w8vvc\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.274700 4790 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8288b902-f791-4dce-b1c0-2afa8796712b-etc-swift\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.274708 4790 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-dispersionconf\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.274717 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8288b902-f791-4dce-b1c0-2afa8796712b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.679080 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-x6pwr" event={"ID":"8288b902-f791-4dce-b1c0-2afa8796712b","Type":"ContainerDied","Data":"6200b3d07be36a19d48691c7988c4ecb333973d8e9867a25bdf8ffae5cb25539"} Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.679115 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6200b3d07be36a19d48691c7988c4ecb333973d8e9867a25bdf8ffae5cb25539" Apr 06 12:17:32 crc kubenswrapper[4790]: I0406 12:17:32.679134 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-x6pwr" Apr 06 12:17:33 crc kubenswrapper[4790]: I0406 12:17:33.298085 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:17:33 crc kubenswrapper[4790]: I0406 12:17:33.298421 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="prometheus" containerID="cri-o://908c302d97542aaf5486ee3f5060ce8e5037b5ff8223ddd5fa333db147c36488" gracePeriod=600 Apr 06 12:17:33 crc kubenswrapper[4790]: I0406 12:17:33.298558 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="config-reloader" containerID="cri-o://faf8af00f6013bd18261e1a18cc7b32f0417e5d789683599d2481fe91e885ced" gracePeriod=600 Apr 06 12:17:33 crc kubenswrapper[4790]: I0406 12:17:33.298603 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="thanos-sidecar" 
containerID="cri-o://ec964921cdcb25527821621a3fea4ed90f35b3b18f0045efa3bff8a770378493" gracePeriod=600 Apr 06 12:17:33 crc kubenswrapper[4790]: I0406 12:17:33.691594 4790 generic.go:334] "Generic (PLEG): container finished" podID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerID="ec964921cdcb25527821621a3fea4ed90f35b3b18f0045efa3bff8a770378493" exitCode=0 Apr 06 12:17:33 crc kubenswrapper[4790]: I0406 12:17:33.691995 4790 generic.go:334] "Generic (PLEG): container finished" podID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerID="faf8af00f6013bd18261e1a18cc7b32f0417e5d789683599d2481fe91e885ced" exitCode=0 Apr 06 12:17:33 crc kubenswrapper[4790]: I0406 12:17:33.692003 4790 generic.go:334] "Generic (PLEG): container finished" podID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerID="908c302d97542aaf5486ee3f5060ce8e5037b5ff8223ddd5fa333db147c36488" exitCode=0 Apr 06 12:17:33 crc kubenswrapper[4790]: I0406 12:17:33.691698 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4080ebfd-9998-4f7d-803a-bf407d0adef0","Type":"ContainerDied","Data":"ec964921cdcb25527821621a3fea4ed90f35b3b18f0045efa3bff8a770378493"} Apr 06 12:17:33 crc kubenswrapper[4790]: I0406 12:17:33.692054 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4080ebfd-9998-4f7d-803a-bf407d0adef0","Type":"ContainerDied","Data":"faf8af00f6013bd18261e1a18cc7b32f0417e5d789683599d2481fe91e885ced"} Apr 06 12:17:33 crc kubenswrapper[4790]: I0406 12:17:33.692071 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4080ebfd-9998-4f7d-803a-bf407d0adef0","Type":"ContainerDied","Data":"908c302d97542aaf5486ee3f5060ce8e5037b5ff8223ddd5fa333db147c36488"} Apr 06 12:17:34 crc kubenswrapper[4790]: I0406 12:17:34.258169 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-tpddh"] Apr 06 12:17:34 crc 
kubenswrapper[4790]: I0406 12:17:34.268289 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-tpddh"] Apr 06 12:17:34 crc kubenswrapper[4790]: I0406 12:17:34.706290 4790 generic.go:334] "Generic (PLEG): container finished" podID="75bcb75a-939b-4fda-b4ed-a66707bb16d7" containerID="89e7c2013a112850fef605fc76ff072ff42360b8046fdd873c4a3e7e29e22b9d" exitCode=0 Apr 06 12:17:34 crc kubenswrapper[4790]: I0406 12:17:34.706401 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75bcb75a-939b-4fda-b4ed-a66707bb16d7","Type":"ContainerDied","Data":"89e7c2013a112850fef605fc76ff072ff42360b8046fdd873c4a3e7e29e22b9d"} Apr 06 12:17:34 crc kubenswrapper[4790]: I0406 12:17:34.709113 4790 generic.go:334] "Generic (PLEG): container finished" podID="586bc227-b2c5-4ead-88f4-fe18c5c28d41" containerID="d3886c30796f5336ed1952d510714c84938def6712de8559f323ea3f1672c8ab" exitCode=0 Apr 06 12:17:34 crc kubenswrapper[4790]: I0406 12:17:34.709238 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"586bc227-b2c5-4ead-88f4-fe18c5c28d41","Type":"ContainerDied","Data":"d3886c30796f5336ed1952d510714c84938def6712de8559f323ea3f1672c8ab"} Apr 06 12:17:34 crc kubenswrapper[4790]: I0406 12:17:34.711821 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8b8a855-8cd2-4a9a-b804-c78641506883","Type":"ContainerDied","Data":"78b7293a7efb77c91c086543685b34cfc61985d3c200c40f264a9855e2d510f4"} Apr 06 12:17:34 crc kubenswrapper[4790]: I0406 12:17:34.712748 4790 generic.go:334] "Generic (PLEG): container finished" podID="c8b8a855-8cd2-4a9a-b804-c78641506883" containerID="78b7293a7efb77c91c086543685b34cfc61985d3c200c40f264a9855e2d510f4" exitCode=0 Apr 06 12:17:35 crc kubenswrapper[4790]: I0406 12:17:35.596222 4790 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/prometheus-metric-storage-0" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.131:9090/-/ready\": dial tcp 10.217.0.131:9090: connect: connection refused" Apr 06 12:17:35 crc kubenswrapper[4790]: I0406 12:17:35.686531 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca973115-c1c5-4474-8de1-4650ec26f8a5" path="/var/lib/kubelet/pods/ca973115-c1c5-4474-8de1-4650ec26f8a5/volumes" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.075270 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-pr4b9" podUID="51949d72-301c-4426-8397-273f6b2ecabd" containerName="ovn-controller" probeResult="failure" output=< Apr 06 12:17:36 crc kubenswrapper[4790]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Apr 06 12:17:36 crc kubenswrapper[4790]: > Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.103414 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.111402 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4xbzr" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.325842 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-pr4b9-config-fm5nc"] Apr 06 12:17:36 crc kubenswrapper[4790]: E0406 12:17:36.326251 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8288b902-f791-4dce-b1c0-2afa8796712b" containerName="swift-ring-rebalance" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.326268 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8288b902-f791-4dce-b1c0-2afa8796712b" containerName="swift-ring-rebalance" Apr 06 12:17:36 crc kubenswrapper[4790]: E0406 12:17:36.326286 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ca973115-c1c5-4474-8de1-4650ec26f8a5" containerName="mariadb-account-create-update" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.326293 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca973115-c1c5-4474-8de1-4650ec26f8a5" containerName="mariadb-account-create-update" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.326474 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca973115-c1c5-4474-8de1-4650ec26f8a5" containerName="mariadb-account-create-update" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.326498 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8288b902-f791-4dce-b1c0-2afa8796712b" containerName="swift-ring-rebalance" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.327093 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.329287 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.342051 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pr4b9-config-fm5nc"] Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.453955 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-scripts\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.454016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-log-ovn\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: 
\"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.454052 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-run\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.454117 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-additional-scripts\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.454147 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqnd5\" (UniqueName: \"kubernetes.io/projected/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-kube-api-access-gqnd5\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.454165 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-run-ovn\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.555754 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-additional-scripts\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.555846 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqnd5\" (UniqueName: \"kubernetes.io/projected/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-kube-api-access-gqnd5\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.555903 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-run-ovn\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.556201 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-scripts\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.556247 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-log-ovn\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.556276 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-run\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.556958 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-run\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.557756 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-additional-scripts\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.558298 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-run-ovn\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.561585 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-scripts\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.561668 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-log-ovn\") 
pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.578188 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqnd5\" (UniqueName: \"kubernetes.io/projected/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-kube-api-access-gqnd5\") pod \"ovn-controller-pr4b9-config-fm5nc\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.654409 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:36 crc kubenswrapper[4790]: I0406 12:17:36.860301 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Apr 06 12:17:39 crc kubenswrapper[4790]: I0406 12:17:39.262158 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zddlm"] Apr 06 12:17:39 crc kubenswrapper[4790]: I0406 12:17:39.263690 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zddlm" Apr 06 12:17:39 crc kubenswrapper[4790]: I0406 12:17:39.265645 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Apr 06 12:17:39 crc kubenswrapper[4790]: I0406 12:17:39.273825 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zddlm"] Apr 06 12:17:39 crc kubenswrapper[4790]: I0406 12:17:39.414020 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3-operator-scripts\") pod \"root-account-create-update-zddlm\" (UID: \"0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3\") " pod="openstack/root-account-create-update-zddlm" Apr 06 12:17:39 crc kubenswrapper[4790]: I0406 12:17:39.414111 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn9nn\" (UniqueName: \"kubernetes.io/projected/0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3-kube-api-access-dn9nn\") pod \"root-account-create-update-zddlm\" (UID: \"0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3\") " pod="openstack/root-account-create-update-zddlm" Apr 06 12:17:39 crc kubenswrapper[4790]: I0406 12:17:39.515183 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn9nn\" (UniqueName: \"kubernetes.io/projected/0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3-kube-api-access-dn9nn\") pod \"root-account-create-update-zddlm\" (UID: \"0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3\") " pod="openstack/root-account-create-update-zddlm" Apr 06 12:17:39 crc kubenswrapper[4790]: I0406 12:17:39.515725 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3-operator-scripts\") pod \"root-account-create-update-zddlm\" (UID: 
\"0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3\") " pod="openstack/root-account-create-update-zddlm" Apr 06 12:17:39 crc kubenswrapper[4790]: I0406 12:17:39.516685 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3-operator-scripts\") pod \"root-account-create-update-zddlm\" (UID: \"0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3\") " pod="openstack/root-account-create-update-zddlm" Apr 06 12:17:39 crc kubenswrapper[4790]: I0406 12:17:39.544011 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn9nn\" (UniqueName: \"kubernetes.io/projected/0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3-kube-api-access-dn9nn\") pod \"root-account-create-update-zddlm\" (UID: \"0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3\") " pod="openstack/root-account-create-update-zddlm" Apr 06 12:17:39 crc kubenswrapper[4790]: I0406 12:17:39.595603 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zddlm" Apr 06 12:17:39 crc kubenswrapper[4790]: I0406 12:17:39.752973 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:17:39 crc kubenswrapper[4790]: I0406 12:17:39.753038 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.447352 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.531353 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-2\") pod \"4080ebfd-9998-4f7d-803a-bf407d0adef0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.531404 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4080ebfd-9998-4f7d-803a-bf407d0adef0-tls-assets\") pod \"4080ebfd-9998-4f7d-803a-bf407d0adef0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.532147 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-thanos-prometheus-http-client-file\") pod \"4080ebfd-9998-4f7d-803a-bf407d0adef0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.532226 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-web-config\") pod \"4080ebfd-9998-4f7d-803a-bf407d0adef0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.532278 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-config\") pod \"4080ebfd-9998-4f7d-803a-bf407d0adef0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.532438 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"4080ebfd-9998-4f7d-803a-bf407d0adef0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.532485 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4080ebfd-9998-4f7d-803a-bf407d0adef0-config-out\") pod \"4080ebfd-9998-4f7d-803a-bf407d0adef0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.532576 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrkwn\" (UniqueName: \"kubernetes.io/projected/4080ebfd-9998-4f7d-803a-bf407d0adef0-kube-api-access-rrkwn\") pod \"4080ebfd-9998-4f7d-803a-bf407d0adef0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.532609 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-0\") pod \"4080ebfd-9998-4f7d-803a-bf407d0adef0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.532641 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-1\") pod \"4080ebfd-9998-4f7d-803a-bf407d0adef0\" (UID: \"4080ebfd-9998-4f7d-803a-bf407d0adef0\") " Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.533662 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "4080ebfd-9998-4f7d-803a-bf407d0adef0" (UID: "4080ebfd-9998-4f7d-803a-bf407d0adef0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.533760 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "4080ebfd-9998-4f7d-803a-bf407d0adef0" (UID: "4080ebfd-9998-4f7d-803a-bf407d0adef0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.536435 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "4080ebfd-9998-4f7d-803a-bf407d0adef0" (UID: "4080ebfd-9998-4f7d-803a-bf407d0adef0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.537047 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4080ebfd-9998-4f7d-803a-bf407d0adef0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4080ebfd-9998-4f7d-803a-bf407d0adef0" (UID: "4080ebfd-9998-4f7d-803a-bf407d0adef0"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.537673 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-config" (OuterVolumeSpecName: "config") pod "4080ebfd-9998-4f7d-803a-bf407d0adef0" (UID: "4080ebfd-9998-4f7d-803a-bf407d0adef0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.539030 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "4080ebfd-9998-4f7d-803a-bf407d0adef0" (UID: "4080ebfd-9998-4f7d-803a-bf407d0adef0"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.544403 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4080ebfd-9998-4f7d-803a-bf407d0adef0-config-out" (OuterVolumeSpecName: "config-out") pod "4080ebfd-9998-4f7d-803a-bf407d0adef0" (UID: "4080ebfd-9998-4f7d-803a-bf407d0adef0"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.546328 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4080ebfd-9998-4f7d-803a-bf407d0adef0-kube-api-access-rrkwn" (OuterVolumeSpecName: "kube-api-access-rrkwn") pod "4080ebfd-9998-4f7d-803a-bf407d0adef0" (UID: "4080ebfd-9998-4f7d-803a-bf407d0adef0"). InnerVolumeSpecName "kube-api-access-rrkwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.575237 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "4080ebfd-9998-4f7d-803a-bf407d0adef0" (UID: "4080ebfd-9998-4f7d-803a-bf407d0adef0"). InnerVolumeSpecName "pvc-428e608e-3b0f-419c-8722-244ca6b44799". PluginName "kubernetes.io/csi", VolumeGidValue "" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.583994 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-web-config" (OuterVolumeSpecName: "web-config") pod "4080ebfd-9998-4f7d-803a-bf407d0adef0" (UID: "4080ebfd-9998-4f7d-803a-bf407d0adef0"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.634428 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrkwn\" (UniqueName: \"kubernetes.io/projected/4080ebfd-9998-4f7d-803a-bf407d0adef0-kube-api-access-rrkwn\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.634459 4790 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.634470 4790 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.634482 4790 reconciler_common.go:293] "Volume detached for 
volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4080ebfd-9998-4f7d-803a-bf407d0adef0-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.634490 4790 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4080ebfd-9998-4f7d-803a-bf407d0adef0-tls-assets\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.634500 4790 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.634510 4790 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-web-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.634518 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4080ebfd-9998-4f7d-803a-bf407d0adef0-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.634542 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") on node \"crc\" " Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.634552 4790 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4080ebfd-9998-4f7d-803a-bf407d0adef0-config-out\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.650537 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/root-account-create-update-zddlm"] Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.661099 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pr4b9-config-fm5nc"] Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.666974 4790 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.667147 4790 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-428e608e-3b0f-419c-8722-244ca6b44799" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799") on node "crc" Apr 06 12:17:40 crc kubenswrapper[4790]: W0406 12:17:40.672944 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f93ab4b_192e_49b1_a4fa_8da9a0ec95e3.slice/crio-b02f1cc2056fb8e2968ff4cf88f535df275e5676c26374c4ea2f5e7b807320a3 WatchSource:0}: Error finding container b02f1cc2056fb8e2968ff4cf88f535df275e5676c26374c4ea2f5e7b807320a3: Status 404 returned error can't find the container with id b02f1cc2056fb8e2968ff4cf88f535df275e5676c26374c4ea2f5e7b807320a3 Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.736653 4790 reconciler_common.go:293] "Volume detached for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.776594 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pr4b9-config-fm5nc" event={"ID":"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c","Type":"ContainerStarted","Data":"f65d347472b3e2fd865aa74acaa294f5b9b48b4220a41995f4896f0b6ed2f2aa"} Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.780845 4790 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/rabbitmq-server-0" event={"ID":"75bcb75a-939b-4fda-b4ed-a66707bb16d7","Type":"ContainerStarted","Data":"8013629b1001f9420c6bf4ffc54cd7f90424750be6e4cdd08476384803c1f248"} Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.781106 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.784288 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zddlm" event={"ID":"0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3","Type":"ContainerStarted","Data":"b02f1cc2056fb8e2968ff4cf88f535df275e5676c26374c4ea2f5e7b807320a3"} Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.786787 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/notifications-rabbitmq-server-0" event={"ID":"586bc227-b2c5-4ead-88f4-fe18c5c28d41","Type":"ContainerStarted","Data":"f46a33f957ed55bbc31ba3654a4136db21fac9ecdba34f3462df84342eb63066"} Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.787159 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.794168 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4080ebfd-9998-4f7d-803a-bf407d0adef0","Type":"ContainerDied","Data":"cdf63db51a6bd98baad2baa2d27d5429f9e4d2ae6c3b4ec281f826c9d5cb462a"} Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.794229 4790 scope.go:117] "RemoveContainer" containerID="ec964921cdcb25527821621a3fea4ed90f35b3b18f0045efa3bff8a770378493" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.794415 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.802546 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8b8a855-8cd2-4a9a-b804-c78641506883","Type":"ContainerStarted","Data":"4bf49d8619032bb410034cb6212688a8dd2963d6cf0bf5adb9c5af6e7e03e9eb"} Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.803554 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.806668 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=58.19058668 podStartE2EDuration="1m5.806651362s" podCreationTimestamp="2026-04-06 12:16:35 +0000 UTC" firstStartedPulling="2026-04-06 12:16:51.900381025 +0000 UTC m=+1190.888123891" lastFinishedPulling="2026-04-06 12:16:59.516445697 +0000 UTC m=+1198.504188573" observedRunningTime="2026-04-06 12:17:40.803696425 +0000 UTC m=+1239.791439291" watchObservedRunningTime="2026-04-06 12:17:40.806651362 +0000 UTC m=+1239.794394228" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.836410 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=57.672907709 podStartE2EDuration="1m4.836390755s" podCreationTimestamp="2026-04-06 12:16:36 +0000 UTC" firstStartedPulling="2026-04-06 12:16:52.352221662 +0000 UTC m=+1191.339964538" lastFinishedPulling="2026-04-06 12:16:59.515704708 +0000 UTC m=+1198.503447584" observedRunningTime="2026-04-06 12:17:40.82592339 +0000 UTC m=+1239.813666246" watchObservedRunningTime="2026-04-06 12:17:40.836390755 +0000 UTC m=+1239.824133621" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.841911 4790 scope.go:117] "RemoveContainer" containerID="faf8af00f6013bd18261e1a18cc7b32f0417e5d789683599d2481fe91e885ced" Apr 06 12:17:40 crc 
kubenswrapper[4790]: I0406 12:17:40.859235 4790 scope.go:117] "RemoveContainer" containerID="908c302d97542aaf5486ee3f5060ce8e5037b5ff8223ddd5fa333db147c36488" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.862414 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/notifications-rabbitmq-server-0" podStartSLOduration=57.961329716 podStartE2EDuration="1m4.862392999s" podCreationTimestamp="2026-04-06 12:16:36 +0000 UTC" firstStartedPulling="2026-04-06 12:16:52.42663854 +0000 UTC m=+1191.414381416" lastFinishedPulling="2026-04-06 12:16:59.327701833 +0000 UTC m=+1198.315444699" observedRunningTime="2026-04-06 12:17:40.856018652 +0000 UTC m=+1239.843761538" watchObservedRunningTime="2026-04-06 12:17:40.862392999 +0000 UTC m=+1239.850135865" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.882334 4790 scope.go:117] "RemoveContainer" containerID="b76c9ae59236ea315c6cccadf1328f37ec5447b747f22a2a7693f56a1f29a416" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.893738 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.913129 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.929334 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:17:40 crc kubenswrapper[4790]: E0406 12:17:40.929815 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="thanos-sidecar" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.929906 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="thanos-sidecar" Apr 06 12:17:40 crc kubenswrapper[4790]: E0406 12:17:40.929970 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="init-config-reloader" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.930032 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="init-config-reloader" Apr 06 12:17:40 crc kubenswrapper[4790]: E0406 12:17:40.930094 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="prometheus" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.930146 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="prometheus" Apr 06 12:17:40 crc kubenswrapper[4790]: E0406 12:17:40.930200 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="config-reloader" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.930248 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="config-reloader" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.930471 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="config-reloader" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.930536 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="thanos-sidecar" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.930707 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" containerName="prometheus" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.932218 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.937668 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.937701 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.937981 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.938028 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.938127 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.938246 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.938328 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-gtk7n" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.938387 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.953720 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Apr 06 12:17:40 crc kubenswrapper[4790]: I0406 12:17:40.963358 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.044015 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.044071 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.044096 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfvsp\" (UniqueName: \"kubernetes.io/projected/e348afd2-35e1-4cf9-b84e-a2fbd085648f-kube-api-access-dfvsp\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.044120 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.044140 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.044164 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e348afd2-35e1-4cf9-b84e-a2fbd085648f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.044179 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.044197 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.044224 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 
12:17:41.044255 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-config\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.044274 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.044301 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e348afd2-35e1-4cf9-b84e-a2fbd085648f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.044341 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.079326 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-pr4b9" podUID="51949d72-301c-4426-8397-273f6b2ecabd" containerName="ovn-controller" probeResult="failure" output=< Apr 06 12:17:41 crc kubenswrapper[4790]: ERROR - ovn-controller connection status is 'not 
connected', expecting 'connected' status Apr 06 12:17:41 crc kubenswrapper[4790]: > Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.145770 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e348afd2-35e1-4cf9-b84e-a2fbd085648f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.145810 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.145846 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.145881 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.145917 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-config\") pod \"prometheus-metric-storage-0\" 
(UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.145941 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.145970 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e348afd2-35e1-4cf9-b84e-a2fbd085648f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.146010 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.146057 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.146093 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.146113 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfvsp\" (UniqueName: \"kubernetes.io/projected/e348afd2-35e1-4cf9-b84e-a2fbd085648f-kube-api-access-dfvsp\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.146132 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.146151 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.146790 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc 
kubenswrapper[4790]: I0406 12:17:41.146853 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.147389 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.150597 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e348afd2-35e1-4cf9-b84e-a2fbd085648f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.151155 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.151187 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e348afd2-35e1-4cf9-b84e-a2fbd085648f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 
12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.151218 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.151821 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.152812 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.153657 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.153776 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2657789c2cd73a331030693f07b47cf0a3bc578270043bd3878090c8357096a6/globalmount\"" pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.153899 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-config\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.155321 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.165535 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfvsp\" (UniqueName: \"kubernetes.io/projected/e348afd2-35e1-4cf9-b84e-a2fbd085648f-kube-api-access-dfvsp\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.192595 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.250970 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.716374 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4080ebfd-9998-4f7d-803a-bf407d0adef0" path="/var/lib/kubelet/pods/4080ebfd-9998-4f7d-803a-bf407d0adef0/volumes" Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.811693 4790 generic.go:334] "Generic (PLEG): container finished" podID="82fae3c2-35ca-4f2d-ba5d-77eae4c7060c" containerID="81621dd202759600beb132e9f91a12681c557b57a6805803182b32c7d7ce37a6" exitCode=0 Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.811993 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pr4b9-config-fm5nc" event={"ID":"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c","Type":"ContainerDied","Data":"81621dd202759600beb132e9f91a12681c557b57a6805803182b32c7d7ce37a6"} Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.813537 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fk556" event={"ID":"060385bd-e2c7-44f6-b575-3d353e949d85","Type":"ContainerStarted","Data":"0b97254baba42cea65cd23e9e21110d1eb6b67d0e186093af5fbe57e0de17c2b"} Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.816243 4790 generic.go:334] "Generic (PLEG): container finished" podID="0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3" containerID="ef9b6ef49dafe6fcec6a49f5ad2034f91b59c5620dce412a00768b3463a443fd" exitCode=0 Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.816319 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zddlm" 
event={"ID":"0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3","Type":"ContainerDied","Data":"ef9b6ef49dafe6fcec6a49f5ad2034f91b59c5620dce412a00768b3463a443fd"} Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.829857 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:17:41 crc kubenswrapper[4790]: I0406 12:17:41.908058 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-fk556" podStartSLOduration=2.834281744 podStartE2EDuration="15.90804042s" podCreationTimestamp="2026-04-06 12:17:26 +0000 UTC" firstStartedPulling="2026-04-06 12:17:27.117632433 +0000 UTC m=+1226.105375299" lastFinishedPulling="2026-04-06 12:17:40.191391119 +0000 UTC m=+1239.179133975" observedRunningTime="2026-04-06 12:17:41.895680845 +0000 UTC m=+1240.883423711" watchObservedRunningTime="2026-04-06 12:17:41.90804042 +0000 UTC m=+1240.895783276" Apr 06 12:17:42 crc kubenswrapper[4790]: I0406 12:17:42.841705 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e348afd2-35e1-4cf9-b84e-a2fbd085648f","Type":"ContainerStarted","Data":"69abd3c017a97ef49f302010d66b1dd8da09d3099bc6508437c7eea9d01b8c3f"} Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.378959 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zddlm" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.387527 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.491237 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-log-ovn\") pod \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.491298 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn9nn\" (UniqueName: \"kubernetes.io/projected/0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3-kube-api-access-dn9nn\") pod \"0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3\" (UID: \"0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3\") " Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.491384 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-run\") pod \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.491421 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-run-ovn\") pod \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.491466 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-additional-scripts\") pod \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.491498 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-scripts\") pod \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.491515 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqnd5\" (UniqueName: \"kubernetes.io/projected/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-kube-api-access-gqnd5\") pod \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\" (UID: \"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c\") " Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.491562 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3-operator-scripts\") pod \"0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3\" (UID: \"0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3\") " Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.491743 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "82fae3c2-35ca-4f2d-ba5d-77eae4c7060c" (UID: "82fae3c2-35ca-4f2d-ba5d-77eae4c7060c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.491811 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "82fae3c2-35ca-4f2d-ba5d-77eae4c7060c" (UID: "82fae3c2-35ca-4f2d-ba5d-77eae4c7060c"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.492148 4790 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.492167 4790 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.492191 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-run" (OuterVolumeSpecName: "var-run") pod "82fae3c2-35ca-4f2d-ba5d-77eae4c7060c" (UID: "82fae3c2-35ca-4f2d-ba5d-77eae4c7060c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.492622 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3" (UID: "0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.492663 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "82fae3c2-35ca-4f2d-ba5d-77eae4c7060c" (UID: "82fae3c2-35ca-4f2d-ba5d-77eae4c7060c"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.492903 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-scripts" (OuterVolumeSpecName: "scripts") pod "82fae3c2-35ca-4f2d-ba5d-77eae4c7060c" (UID: "82fae3c2-35ca-4f2d-ba5d-77eae4c7060c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.498899 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3-kube-api-access-dn9nn" (OuterVolumeSpecName: "kube-api-access-dn9nn") pod "0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3" (UID: "0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3"). InnerVolumeSpecName "kube-api-access-dn9nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.498972 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-kube-api-access-gqnd5" (OuterVolumeSpecName: "kube-api-access-gqnd5") pod "82fae3c2-35ca-4f2d-ba5d-77eae4c7060c" (UID: "82fae3c2-35ca-4f2d-ba5d-77eae4c7060c"). InnerVolumeSpecName "kube-api-access-gqnd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.594234 4790 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-var-run\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.594272 4790 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-additional-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.594284 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.594294 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqnd5\" (UniqueName: \"kubernetes.io/projected/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c-kube-api-access-gqnd5\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.594306 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.594315 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn9nn\" (UniqueName: \"kubernetes.io/projected/0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3-kube-api-access-dn9nn\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.863219 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zddlm" event={"ID":"0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3","Type":"ContainerDied","Data":"b02f1cc2056fb8e2968ff4cf88f535df275e5676c26374c4ea2f5e7b807320a3"} Apr 06 
12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.863575 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b02f1cc2056fb8e2968ff4cf88f535df275e5676c26374c4ea2f5e7b807320a3" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.863244 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zddlm" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.865089 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pr4b9-config-fm5nc" event={"ID":"82fae3c2-35ca-4f2d-ba5d-77eae4c7060c","Type":"ContainerDied","Data":"f65d347472b3e2fd865aa74acaa294f5b9b48b4220a41995f4896f0b6ed2f2aa"} Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.865109 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f65d347472b3e2fd865aa74acaa294f5b9b48b4220a41995f4896f0b6ed2f2aa" Apr 06 12:17:43 crc kubenswrapper[4790]: I0406 12:17:43.865147 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-pr4b9-config-fm5nc" Apr 06 12:17:44 crc kubenswrapper[4790]: I0406 12:17:44.511648 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-pr4b9-config-fm5nc"] Apr 06 12:17:44 crc kubenswrapper[4790]: I0406 12:17:44.520031 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-pr4b9-config-fm5nc"] Apr 06 12:17:44 crc kubenswrapper[4790]: I0406 12:17:44.874690 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e348afd2-35e1-4cf9-b84e-a2fbd085648f","Type":"ContainerStarted","Data":"d3976d1ea3e3586a9482bb8bab900be1d8cdb3b9d4cf97300daf3edc6189a512"} Apr 06 12:17:45 crc kubenswrapper[4790]: I0406 12:17:45.691401 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82fae3c2-35ca-4f2d-ba5d-77eae4c7060c" path="/var/lib/kubelet/pods/82fae3c2-35ca-4f2d-ba5d-77eae4c7060c/volumes" Apr 06 12:17:45 crc kubenswrapper[4790]: I0406 12:17:45.934690 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:45 crc kubenswrapper[4790]: I0406 12:17:45.957586 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3a39c210-842a-4286-8770-a84bbfec54a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3a39c210-842a-4286-8770-a84bbfec54a0\") " pod="openstack/swift-storage-0" Apr 06 12:17:45 crc kubenswrapper[4790]: I0406 12:17:45.961191 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Apr 06 12:17:46 crc kubenswrapper[4790]: I0406 12:17:46.111254 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-pr4b9" Apr 06 12:17:46 crc kubenswrapper[4790]: I0406 12:17:46.581281 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Apr 06 12:17:46 crc kubenswrapper[4790]: I0406 12:17:46.894068 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"0dea3e15dc28e55cb72ae7e91e95c6903b39fbf7e9161be3dd0eee664dd64e4e"} Apr 06 12:17:47 crc kubenswrapper[4790]: I0406 12:17:47.905897 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"73915888bfc8fc764f45db5df6c89fec4c5c5bacd41dfcc3bb434ebf13048026"} Apr 06 12:17:47 crc kubenswrapper[4790]: I0406 12:17:47.906913 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"12294d7abea6ded76ad563305d27f3ef46fec3c0ff23db9a3d9242b126ed93a0"} Apr 06 12:17:47 crc kubenswrapper[4790]: I0406 12:17:47.906931 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"325ad410cae4e459567d0e5798a426178f31323a470748bb5fe679a3d14031b3"} Apr 06 12:17:48 crc kubenswrapper[4790]: I0406 12:17:48.928838 4790 generic.go:334] "Generic (PLEG): container finished" podID="060385bd-e2c7-44f6-b575-3d353e949d85" containerID="0b97254baba42cea65cd23e9e21110d1eb6b67d0e186093af5fbe57e0de17c2b" exitCode=0 Apr 06 12:17:48 crc kubenswrapper[4790]: I0406 12:17:48.929044 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-sync-fk556" event={"ID":"060385bd-e2c7-44f6-b575-3d353e949d85","Type":"ContainerDied","Data":"0b97254baba42cea65cd23e9e21110d1eb6b67d0e186093af5fbe57e0de17c2b"} Apr 06 12:17:48 crc kubenswrapper[4790]: I0406 12:17:48.939887 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"2a33fd02c2b6f0fc28332cec4a65d243649a738cbc291fa3bdc6e35885984c7b"} Apr 06 12:17:48 crc kubenswrapper[4790]: I0406 12:17:48.939930 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"e2631a992fba1c6ba7c28b3a4c0d91c438a48450dc8b1ddca28f9b7cbaea0be1"} Apr 06 12:17:48 crc kubenswrapper[4790]: I0406 12:17:48.939941 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"0f7f3cbc011977817bc45eeb6fae906f3c48abae48ce92347bb77b49f47c5065"} Apr 06 12:17:49 crc kubenswrapper[4790]: I0406 12:17:49.967161 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"284c8396730c889f0859e58311871dbfaa8b1823ca5059106493160853a77326"} Apr 06 12:17:49 crc kubenswrapper[4790]: I0406 12:17:49.967819 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"3e92c31dd20672d2269e09a16ecc03448cf685a297e9ab2f54fe451c1c9123d8"} Apr 06 12:17:49 crc kubenswrapper[4790]: I0406 12:17:49.967873 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"5f3af2c58abda1ab4bf7999931d6d5056ebf2fff03c6258687306231e06ffc42"} 
Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.510998 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-fk556" Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.628296 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-config-data\") pod \"060385bd-e2c7-44f6-b575-3d353e949d85\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.628685 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-combined-ca-bundle\") pod \"060385bd-e2c7-44f6-b575-3d353e949d85\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.628798 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsx84\" (UniqueName: \"kubernetes.io/projected/060385bd-e2c7-44f6-b575-3d353e949d85-kube-api-access-qsx84\") pod \"060385bd-e2c7-44f6-b575-3d353e949d85\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.628864 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-db-sync-config-data\") pod \"060385bd-e2c7-44f6-b575-3d353e949d85\" (UID: \"060385bd-e2c7-44f6-b575-3d353e949d85\") " Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.632765 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "060385bd-e2c7-44f6-b575-3d353e949d85" (UID: "060385bd-e2c7-44f6-b575-3d353e949d85"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.633289 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/060385bd-e2c7-44f6-b575-3d353e949d85-kube-api-access-qsx84" (OuterVolumeSpecName: "kube-api-access-qsx84") pod "060385bd-e2c7-44f6-b575-3d353e949d85" (UID: "060385bd-e2c7-44f6-b575-3d353e949d85"). InnerVolumeSpecName "kube-api-access-qsx84". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.653136 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "060385bd-e2c7-44f6-b575-3d353e949d85" (UID: "060385bd-e2c7-44f6-b575-3d353e949d85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.675545 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-config-data" (OuterVolumeSpecName: "config-data") pod "060385bd-e2c7-44f6-b575-3d353e949d85" (UID: "060385bd-e2c7-44f6-b575-3d353e949d85"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.731070 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsx84\" (UniqueName: \"kubernetes.io/projected/060385bd-e2c7-44f6-b575-3d353e949d85-kube-api-access-qsx84\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.731103 4790 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.731113 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.731121 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/060385bd-e2c7-44f6-b575-3d353e949d85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.990415 4790 generic.go:334] "Generic (PLEG): container finished" podID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerID="d3976d1ea3e3586a9482bb8bab900be1d8cdb3b9d4cf97300daf3edc6189a512" exitCode=0 Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.990459 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e348afd2-35e1-4cf9-b84e-a2fbd085648f","Type":"ContainerDied","Data":"d3976d1ea3e3586a9482bb8bab900be1d8cdb3b9d4cf97300daf3edc6189a512"} Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.994362 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-fk556" 
event={"ID":"060385bd-e2c7-44f6-b575-3d353e949d85","Type":"ContainerDied","Data":"f9f35f338dda61bd2b042e4891199af9785c023efba313f89635788726665285"} Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.994399 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9f35f338dda61bd2b042e4891199af9785c023efba313f89635788726665285" Apr 06 12:17:50 crc kubenswrapper[4790]: I0406 12:17:50.994461 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-fk556" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.010571 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"63278d7e0770879f5048fa0dd0a6b7533e02bcdaeb991f7457f1b5b46ddf87b7"} Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.010629 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"64c3481d87c153b8fdc8f077e0271818e477bc12def5a77b1b9e7ae5fd8597c1"} Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.010640 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"2bfb1f286fdf890cd616b6f26133be04b58f85fe0c95eea6ddd11b3d68697d74"} Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.010648 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"eec0d38b25444de81bc2718373ed17aeec637d334b6518d69f7c84cdeaede637"} Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.010657 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"07d303a8d5366874911a11dcd901820e47212c8aedfc90efd49d4ea8d853a58d"} Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.400457 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84684f697-vp2sp"] Apr 06 12:17:51 crc kubenswrapper[4790]: E0406 12:17:51.402664 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3" containerName="mariadb-account-create-update" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.402699 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3" containerName="mariadb-account-create-update" Apr 06 12:17:51 crc kubenswrapper[4790]: E0406 12:17:51.402720 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="060385bd-e2c7-44f6-b575-3d353e949d85" containerName="glance-db-sync" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.402728 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="060385bd-e2c7-44f6-b575-3d353e949d85" containerName="glance-db-sync" Apr 06 12:17:51 crc kubenswrapper[4790]: E0406 12:17:51.402743 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82fae3c2-35ca-4f2d-ba5d-77eae4c7060c" containerName="ovn-config" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.402752 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="82fae3c2-35ca-4f2d-ba5d-77eae4c7060c" containerName="ovn-config" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.403008 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="82fae3c2-35ca-4f2d-ba5d-77eae4c7060c" containerName="ovn-config" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.403035 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="060385bd-e2c7-44f6-b575-3d353e949d85" containerName="glance-db-sync" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.403050 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3" containerName="mariadb-account-create-update" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.408100 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.479068 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84684f697-vp2sp"] Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.545240 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-ovsdbserver-sb\") pod \"dnsmasq-dns-84684f697-vp2sp\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.545312 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-dns-svc\") pod \"dnsmasq-dns-84684f697-vp2sp\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.545353 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-config\") pod \"dnsmasq-dns-84684f697-vp2sp\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.545442 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-ovsdbserver-nb\") pod \"dnsmasq-dns-84684f697-vp2sp\" 
(UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.545478 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwft6\" (UniqueName: \"kubernetes.io/projected/aea80221-d3ba-4f43-8a23-532584214384-kube-api-access-cwft6\") pod \"dnsmasq-dns-84684f697-vp2sp\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.647241 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-ovsdbserver-nb\") pod \"dnsmasq-dns-84684f697-vp2sp\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.647311 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwft6\" (UniqueName: \"kubernetes.io/projected/aea80221-d3ba-4f43-8a23-532584214384-kube-api-access-cwft6\") pod \"dnsmasq-dns-84684f697-vp2sp\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.647372 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-ovsdbserver-sb\") pod \"dnsmasq-dns-84684f697-vp2sp\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.647420 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-dns-svc\") pod \"dnsmasq-dns-84684f697-vp2sp\" (UID: 
\"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.647466 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-config\") pod \"dnsmasq-dns-84684f697-vp2sp\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.648547 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-ovsdbserver-nb\") pod \"dnsmasq-dns-84684f697-vp2sp\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.648561 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-config\") pod \"dnsmasq-dns-84684f697-vp2sp\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.648563 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-ovsdbserver-sb\") pod \"dnsmasq-dns-84684f697-vp2sp\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.648736 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-dns-svc\") pod \"dnsmasq-dns-84684f697-vp2sp\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 
12:17:51.674340 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwft6\" (UniqueName: \"kubernetes.io/projected/aea80221-d3ba-4f43-8a23-532584214384-kube-api-access-cwft6\") pod \"dnsmasq-dns-84684f697-vp2sp\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:51 crc kubenswrapper[4790]: I0406 12:17:51.731225 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:52 crc kubenswrapper[4790]: W0406 12:17:52.015404 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaea80221_d3ba_4f43_8a23_532584214384.slice/crio-43d7bd65e0e369dfcd500fe8caa747e2aa6f4c85bc74aab341891cfbb881c9bc WatchSource:0}: Error finding container 43d7bd65e0e369dfcd500fe8caa747e2aa6f4c85bc74aab341891cfbb881c9bc: Status 404 returned error can't find the container with id 43d7bd65e0e369dfcd500fe8caa747e2aa6f4c85bc74aab341891cfbb881c9bc Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.021119 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84684f697-vp2sp"] Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.023962 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3a39c210-842a-4286-8770-a84bbfec54a0","Type":"ContainerStarted","Data":"b82bdbb1afdf282b7e3828bb75d3eaeb47ef687231a1058c26e360aaca5989c2"} Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.026454 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e348afd2-35e1-4cf9-b84e-a2fbd085648f","Type":"ContainerStarted","Data":"3e5e58ce5e3203387018c09f9b79ac8e12267ee31fbaad6906077de32e228147"} Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.065914 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/swift-storage-0" podStartSLOduration=36.948382863 podStartE2EDuration="40.065872361s" podCreationTimestamp="2026-04-06 12:17:12 +0000 UTC" firstStartedPulling="2026-04-06 12:17:46.594603835 +0000 UTC m=+1245.582346701" lastFinishedPulling="2026-04-06 12:17:49.712093333 +0000 UTC m=+1248.699836199" observedRunningTime="2026-04-06 12:17:52.057927072 +0000 UTC m=+1251.045669938" watchObservedRunningTime="2026-04-06 12:17:52.065872361 +0000 UTC m=+1251.053615227" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.419040 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84684f697-vp2sp"] Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.475616 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ccfc598b5-7ftkm"] Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.477068 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.480065 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.486762 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccfc598b5-7ftkm"] Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.564521 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-ovsdbserver-sb\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.564598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-dns-swift-storage-0\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.564674 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-ovsdbserver-nb\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.564746 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv5x2\" (UniqueName: \"kubernetes.io/projected/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-kube-api-access-lv5x2\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.564790 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-config\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.564932 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-dns-svc\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.666880 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-ovsdbserver-nb\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.666957 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv5x2\" (UniqueName: \"kubernetes.io/projected/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-kube-api-access-lv5x2\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.666991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-config\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.667036 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-dns-svc\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.667073 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-ovsdbserver-sb\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.667112 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-dns-swift-storage-0\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.668031 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-config\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.668126 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-ovsdbserver-nb\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.668171 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-dns-svc\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.668221 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-dns-swift-storage-0\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.668306 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-ovsdbserver-sb\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" 
(UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.738629 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv5x2\" (UniqueName: \"kubernetes.io/projected/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-kube-api-access-lv5x2\") pod \"dnsmasq-dns-ccfc598b5-7ftkm\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:52 crc kubenswrapper[4790]: I0406 12:17:52.809980 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:53 crc kubenswrapper[4790]: I0406 12:17:53.039476 4790 generic.go:334] "Generic (PLEG): container finished" podID="aea80221-d3ba-4f43-8a23-532584214384" containerID="d650e52e1dfb740b2e9740519aeac06879f86b4db4a80dda854e7ce4e17cb4e4" exitCode=0 Apr 06 12:17:53 crc kubenswrapper[4790]: I0406 12:17:53.041519 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84684f697-vp2sp" event={"ID":"aea80221-d3ba-4f43-8a23-532584214384","Type":"ContainerDied","Data":"d650e52e1dfb740b2e9740519aeac06879f86b4db4a80dda854e7ce4e17cb4e4"} Apr 06 12:17:53 crc kubenswrapper[4790]: I0406 12:17:53.041546 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84684f697-vp2sp" event={"ID":"aea80221-d3ba-4f43-8a23-532584214384","Type":"ContainerStarted","Data":"43d7bd65e0e369dfcd500fe8caa747e2aa6f4c85bc74aab341891cfbb881c9bc"} Apr 06 12:17:53 crc kubenswrapper[4790]: I0406 12:17:53.404168 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccfc598b5-7ftkm"] Apr 06 12:17:53 crc kubenswrapper[4790]: W0406 12:17:53.551265 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa1d242d_8c20_4abe_a611_a6ab2e4b1021.slice/crio-dda4b6d404d91ae43f62b568466cef8a4bef351a089855786209f4ca7aedd13a WatchSource:0}: Error finding container dda4b6d404d91ae43f62b568466cef8a4bef351a089855786209f4ca7aedd13a: Status 404 returned error can't find the container with id dda4b6d404d91ae43f62b568466cef8a4bef351a089855786209f4ca7aedd13a Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.051209 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e348afd2-35e1-4cf9-b84e-a2fbd085648f","Type":"ContainerStarted","Data":"c66b56b84acb60a6da9863a60da3988f8114a8165524347a4ba737ef83e719ae"} Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.051671 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e348afd2-35e1-4cf9-b84e-a2fbd085648f","Type":"ContainerStarted","Data":"b4fba8f67b8ccdd9b6ae3878be27df75fbb234e3d6ed541ce85c459f383b3e40"} Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.053902 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84684f697-vp2sp" event={"ID":"aea80221-d3ba-4f43-8a23-532584214384","Type":"ContainerStarted","Data":"27975928fd18d434c52bce41e5ae6dbd6a4f4e4c96a76cdd84312027d9b42519"} Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.053976 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.053961 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84684f697-vp2sp" podUID="aea80221-d3ba-4f43-8a23-532584214384" containerName="dnsmasq-dns" containerID="cri-o://27975928fd18d434c52bce41e5ae6dbd6a4f4e4c96a76cdd84312027d9b42519" gracePeriod=10 Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.056118 4790 generic.go:334] "Generic (PLEG): container 
finished" podID="aa1d242d-8c20-4abe-a611-a6ab2e4b1021" containerID="c6344137b881df201d9237ec232e7142aecdf0bffea3a8ecdfaa3d07d35b4620" exitCode=0 Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.056146 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" event={"ID":"aa1d242d-8c20-4abe-a611-a6ab2e4b1021","Type":"ContainerDied","Data":"c6344137b881df201d9237ec232e7142aecdf0bffea3a8ecdfaa3d07d35b4620"} Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.056165 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" event={"ID":"aa1d242d-8c20-4abe-a611-a6ab2e4b1021","Type":"ContainerStarted","Data":"dda4b6d404d91ae43f62b568466cef8a4bef351a089855786209f4ca7aedd13a"} Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.088335 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.088319649 podStartE2EDuration="14.088319649s" podCreationTimestamp="2026-04-06 12:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:17:54.087239001 +0000 UTC m=+1253.074981867" watchObservedRunningTime="2026-04-06 12:17:54.088319649 +0000 UTC m=+1253.076062515" Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.127468 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84684f697-vp2sp" podStartSLOduration=3.127446309 podStartE2EDuration="3.127446309s" podCreationTimestamp="2026-04-06 12:17:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:17:54.119100729 +0000 UTC m=+1253.106843595" watchObservedRunningTime="2026-04-06 12:17:54.127446309 +0000 UTC m=+1253.115189175" Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.557184 4790 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.618666 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-ovsdbserver-sb\") pod \"aea80221-d3ba-4f43-8a23-532584214384\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.618721 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-dns-svc\") pod \"aea80221-d3ba-4f43-8a23-532584214384\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.618817 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-ovsdbserver-nb\") pod \"aea80221-d3ba-4f43-8a23-532584214384\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.618902 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwft6\" (UniqueName: \"kubernetes.io/projected/aea80221-d3ba-4f43-8a23-532584214384-kube-api-access-cwft6\") pod \"aea80221-d3ba-4f43-8a23-532584214384\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.618957 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-config\") pod \"aea80221-d3ba-4f43-8a23-532584214384\" (UID: \"aea80221-d3ba-4f43-8a23-532584214384\") " Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.637083 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/aea80221-d3ba-4f43-8a23-532584214384-kube-api-access-cwft6" (OuterVolumeSpecName: "kube-api-access-cwft6") pod "aea80221-d3ba-4f43-8a23-532584214384" (UID: "aea80221-d3ba-4f43-8a23-532584214384"). InnerVolumeSpecName "kube-api-access-cwft6". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.682904 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-config" (OuterVolumeSpecName: "config") pod "aea80221-d3ba-4f43-8a23-532584214384" (UID: "aea80221-d3ba-4f43-8a23-532584214384"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.700614 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aea80221-d3ba-4f43-8a23-532584214384" (UID: "aea80221-d3ba-4f43-8a23-532584214384"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.703758 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aea80221-d3ba-4f43-8a23-532584214384" (UID: "aea80221-d3ba-4f43-8a23-532584214384"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.707507 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aea80221-d3ba-4f43-8a23-532584214384" (UID: "aea80221-d3ba-4f43-8a23-532584214384"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.720584 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.720624 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.720638 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwft6\" (UniqueName: \"kubernetes.io/projected/aea80221-d3ba-4f43-8a23-532584214384-kube-api-access-cwft6\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.720652 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:54 crc kubenswrapper[4790]: I0406 12:17:54.720662 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aea80221-d3ba-4f43-8a23-532584214384-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.066229 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" event={"ID":"aa1d242d-8c20-4abe-a611-a6ab2e4b1021","Type":"ContainerStarted","Data":"636c57740cffa70333e10685c6600e863f7313a4fa22aaa7ffe80356cccfe54c"} Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.066392 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.067959 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="aea80221-d3ba-4f43-8a23-532584214384" containerID="27975928fd18d434c52bce41e5ae6dbd6a4f4e4c96a76cdd84312027d9b42519" exitCode=0 Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.068018 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84684f697-vp2sp" Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.068072 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84684f697-vp2sp" event={"ID":"aea80221-d3ba-4f43-8a23-532584214384","Type":"ContainerDied","Data":"27975928fd18d434c52bce41e5ae6dbd6a4f4e4c96a76cdd84312027d9b42519"} Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.068089 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84684f697-vp2sp" event={"ID":"aea80221-d3ba-4f43-8a23-532584214384","Type":"ContainerDied","Data":"43d7bd65e0e369dfcd500fe8caa747e2aa6f4c85bc74aab341891cfbb881c9bc"} Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.068106 4790 scope.go:117] "RemoveContainer" containerID="27975928fd18d434c52bce41e5ae6dbd6a4f4e4c96a76cdd84312027d9b42519" Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.096063 4790 scope.go:117] "RemoveContainer" containerID="d650e52e1dfb740b2e9740519aeac06879f86b4db4a80dda854e7ce4e17cb4e4" Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.124428 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" podStartSLOduration=3.124411538 podStartE2EDuration="3.124411538s" podCreationTimestamp="2026-04-06 12:17:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:17:55.119339715 +0000 UTC m=+1254.107082581" watchObservedRunningTime="2026-04-06 12:17:55.124411538 +0000 UTC m=+1254.112154404" Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.135130 4790 scope.go:117] "RemoveContainer" 
containerID="27975928fd18d434c52bce41e5ae6dbd6a4f4e4c96a76cdd84312027d9b42519" Apr 06 12:17:55 crc kubenswrapper[4790]: E0406 12:17:55.136578 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27975928fd18d434c52bce41e5ae6dbd6a4f4e4c96a76cdd84312027d9b42519\": container with ID starting with 27975928fd18d434c52bce41e5ae6dbd6a4f4e4c96a76cdd84312027d9b42519 not found: ID does not exist" containerID="27975928fd18d434c52bce41e5ae6dbd6a4f4e4c96a76cdd84312027d9b42519" Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.136611 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27975928fd18d434c52bce41e5ae6dbd6a4f4e4c96a76cdd84312027d9b42519"} err="failed to get container status \"27975928fd18d434c52bce41e5ae6dbd6a4f4e4c96a76cdd84312027d9b42519\": rpc error: code = NotFound desc = could not find container \"27975928fd18d434c52bce41e5ae6dbd6a4f4e4c96a76cdd84312027d9b42519\": container with ID starting with 27975928fd18d434c52bce41e5ae6dbd6a4f4e4c96a76cdd84312027d9b42519 not found: ID does not exist" Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.136633 4790 scope.go:117] "RemoveContainer" containerID="d650e52e1dfb740b2e9740519aeac06879f86b4db4a80dda854e7ce4e17cb4e4" Apr 06 12:17:55 crc kubenswrapper[4790]: E0406 12:17:55.140023 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d650e52e1dfb740b2e9740519aeac06879f86b4db4a80dda854e7ce4e17cb4e4\": container with ID starting with d650e52e1dfb740b2e9740519aeac06879f86b4db4a80dda854e7ce4e17cb4e4 not found: ID does not exist" containerID="d650e52e1dfb740b2e9740519aeac06879f86b4db4a80dda854e7ce4e17cb4e4" Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.140085 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d650e52e1dfb740b2e9740519aeac06879f86b4db4a80dda854e7ce4e17cb4e4"} err="failed to get container status \"d650e52e1dfb740b2e9740519aeac06879f86b4db4a80dda854e7ce4e17cb4e4\": rpc error: code = NotFound desc = could not find container \"d650e52e1dfb740b2e9740519aeac06879f86b4db4a80dda854e7ce4e17cb4e4\": container with ID starting with d650e52e1dfb740b2e9740519aeac06879f86b4db4a80dda854e7ce4e17cb4e4 not found: ID does not exist" Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.156524 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84684f697-vp2sp"] Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.178534 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84684f697-vp2sp"] Apr 06 12:17:55 crc kubenswrapper[4790]: I0406 12:17:55.685212 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea80221-d3ba-4f43-8a23-532584214384" path="/var/lib/kubelet/pods/aea80221-d3ba-4f43-8a23-532584214384/volumes" Apr 06 12:17:56 crc kubenswrapper[4790]: I0406 12:17:56.251265 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:56 crc kubenswrapper[4790]: I0406 12:17:56.252933 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:56 crc kubenswrapper[4790]: I0406 12:17:56.257717 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:57 crc kubenswrapper[4790]: I0406 12:17:57.096732 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Apr 06 12:17:57 crc kubenswrapper[4790]: I0406 12:17:57.121759 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="75bcb75a-939b-4fda-b4ed-a66707bb16d7" containerName="rabbitmq" 
probeResult="failure" output="dial tcp 10.217.0.123:5671: connect: connection refused" Apr 06 12:17:57 crc kubenswrapper[4790]: I0406 12:17:57.439821 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="c8b8a855-8cd2-4a9a-b804-c78641506883" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.124:5671: connect: connection refused" Apr 06 12:17:57 crc kubenswrapper[4790]: I0406 12:17:57.796387 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/notifications-rabbitmq-server-0" podUID="586bc227-b2c5-4ead-88f4-fe18c5c28d41" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.125:5671: connect: connection refused" Apr 06 12:18:00 crc kubenswrapper[4790]: I0406 12:18:00.161709 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591298-z2tn8"] Apr 06 12:18:00 crc kubenswrapper[4790]: E0406 12:18:00.162406 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea80221-d3ba-4f43-8a23-532584214384" containerName="init" Apr 06 12:18:00 crc kubenswrapper[4790]: I0406 12:18:00.162423 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea80221-d3ba-4f43-8a23-532584214384" containerName="init" Apr 06 12:18:00 crc kubenswrapper[4790]: E0406 12:18:00.162463 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea80221-d3ba-4f43-8a23-532584214384" containerName="dnsmasq-dns" Apr 06 12:18:00 crc kubenswrapper[4790]: I0406 12:18:00.162474 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea80221-d3ba-4f43-8a23-532584214384" containerName="dnsmasq-dns" Apr 06 12:18:00 crc kubenswrapper[4790]: I0406 12:18:00.162723 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea80221-d3ba-4f43-8a23-532584214384" containerName="dnsmasq-dns" Apr 06 12:18:00 crc kubenswrapper[4790]: I0406 12:18:00.163519 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591298-z2tn8" Apr 06 12:18:00 crc kubenswrapper[4790]: I0406 12:18:00.166021 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:18:00 crc kubenswrapper[4790]: I0406 12:18:00.166340 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:18:00 crc kubenswrapper[4790]: I0406 12:18:00.166624 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:18:00 crc kubenswrapper[4790]: I0406 12:18:00.171230 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591298-z2tn8"] Apr 06 12:18:00 crc kubenswrapper[4790]: I0406 12:18:00.216692 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwmkr\" (UniqueName: \"kubernetes.io/projected/4acf1b39-821a-435f-9943-d6f12210cc0a-kube-api-access-jwmkr\") pod \"auto-csr-approver-29591298-z2tn8\" (UID: \"4acf1b39-821a-435f-9943-d6f12210cc0a\") " pod="openshift-infra/auto-csr-approver-29591298-z2tn8" Apr 06 12:18:00 crc kubenswrapper[4790]: I0406 12:18:00.318507 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwmkr\" (UniqueName: \"kubernetes.io/projected/4acf1b39-821a-435f-9943-d6f12210cc0a-kube-api-access-jwmkr\") pod \"auto-csr-approver-29591298-z2tn8\" (UID: \"4acf1b39-821a-435f-9943-d6f12210cc0a\") " pod="openshift-infra/auto-csr-approver-29591298-z2tn8" Apr 06 12:18:00 crc kubenswrapper[4790]: I0406 12:18:00.343736 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwmkr\" (UniqueName: \"kubernetes.io/projected/4acf1b39-821a-435f-9943-d6f12210cc0a-kube-api-access-jwmkr\") pod \"auto-csr-approver-29591298-z2tn8\" (UID: \"4acf1b39-821a-435f-9943-d6f12210cc0a\") " 
pod="openshift-infra/auto-csr-approver-29591298-z2tn8" Apr 06 12:18:00 crc kubenswrapper[4790]: I0406 12:18:00.485444 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591298-z2tn8" Apr 06 12:18:00 crc kubenswrapper[4790]: I0406 12:18:00.931065 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591298-z2tn8"] Apr 06 12:18:00 crc kubenswrapper[4790]: W0406 12:18:00.931198 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4acf1b39_821a_435f_9943_d6f12210cc0a.slice/crio-b067359ea45c813d6e808a68c33a83c17a4a77ccc28f119316de467956d20c0c WatchSource:0}: Error finding container b067359ea45c813d6e808a68c33a83c17a4a77ccc28f119316de467956d20c0c: Status 404 returned error can't find the container with id b067359ea45c813d6e808a68c33a83c17a4a77ccc28f119316de467956d20c0c Apr 06 12:18:01 crc kubenswrapper[4790]: I0406 12:18:01.125120 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591298-z2tn8" event={"ID":"4acf1b39-821a-435f-9943-d6f12210cc0a","Type":"ContainerStarted","Data":"b067359ea45c813d6e808a68c33a83c17a4a77ccc28f119316de467956d20c0c"} Apr 06 12:18:02 crc kubenswrapper[4790]: I0406 12:18:02.811733 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:18:02 crc kubenswrapper[4790]: I0406 12:18:02.927093 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-756fdd77c5-2qxts"] Apr 06 12:18:02 crc kubenswrapper[4790]: I0406 12:18:02.927315 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" podUID="5c17f148-f3a8-4e44-b9a0-ea95774a05f1" containerName="dnsmasq-dns" containerID="cri-o://4dab267bc571528b186b4b0804889da2f9db40186527f7a43ba51afcd2709a0f" gracePeriod=10 Apr 06 
12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.167442 4790 generic.go:334] "Generic (PLEG): container finished" podID="5c17f148-f3a8-4e44-b9a0-ea95774a05f1" containerID="4dab267bc571528b186b4b0804889da2f9db40186527f7a43ba51afcd2709a0f" exitCode=0 Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.167506 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" event={"ID":"5c17f148-f3a8-4e44-b9a0-ea95774a05f1","Type":"ContainerDied","Data":"4dab267bc571528b186b4b0804889da2f9db40186527f7a43ba51afcd2709a0f"} Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.177592 4790 generic.go:334] "Generic (PLEG): container finished" podID="4acf1b39-821a-435f-9943-d6f12210cc0a" containerID="86c7a01185401a5d1fb88cae44bb2c787903cba75db32d75487e4db5db97a2aa" exitCode=0 Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.177678 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591298-z2tn8" event={"ID":"4acf1b39-821a-435f-9943-d6f12210cc0a","Type":"ContainerDied","Data":"86c7a01185401a5d1fb88cae44bb2c787903cba75db32d75487e4db5db97a2aa"} Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.381117 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.477955 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-ovsdbserver-nb\") pod \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.478096 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqw8f\" (UniqueName: \"kubernetes.io/projected/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-kube-api-access-vqw8f\") pod \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.478137 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-dns-svc\") pod \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.478179 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-ovsdbserver-sb\") pod \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.478210 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-config\") pod \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\" (UID: \"5c17f148-f3a8-4e44-b9a0-ea95774a05f1\") " Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.485051 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-kube-api-access-vqw8f" (OuterVolumeSpecName: "kube-api-access-vqw8f") pod "5c17f148-f3a8-4e44-b9a0-ea95774a05f1" (UID: "5c17f148-f3a8-4e44-b9a0-ea95774a05f1"). InnerVolumeSpecName "kube-api-access-vqw8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.520155 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5c17f148-f3a8-4e44-b9a0-ea95774a05f1" (UID: "5c17f148-f3a8-4e44-b9a0-ea95774a05f1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.522720 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-config" (OuterVolumeSpecName: "config") pod "5c17f148-f3a8-4e44-b9a0-ea95774a05f1" (UID: "5c17f148-f3a8-4e44-b9a0-ea95774a05f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.522933 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5c17f148-f3a8-4e44-b9a0-ea95774a05f1" (UID: "5c17f148-f3a8-4e44-b9a0-ea95774a05f1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.524114 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5c17f148-f3a8-4e44-b9a0-ea95774a05f1" (UID: "5c17f148-f3a8-4e44-b9a0-ea95774a05f1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.579906 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.580168 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqw8f\" (UniqueName: \"kubernetes.io/projected/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-kube-api-access-vqw8f\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.580232 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.580321 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:03 crc kubenswrapper[4790]: I0406 12:18:03.580377 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c17f148-f3a8-4e44-b9a0-ea95774a05f1-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:04 crc kubenswrapper[4790]: I0406 12:18:04.187374 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" Apr 06 12:18:04 crc kubenswrapper[4790]: I0406 12:18:04.187398 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" event={"ID":"5c17f148-f3a8-4e44-b9a0-ea95774a05f1","Type":"ContainerDied","Data":"6f7aaa2be1d4d445ce70daf6a5be39057c593681e7b519b9a662a2cb03e18104"} Apr 06 12:18:04 crc kubenswrapper[4790]: I0406 12:18:04.187475 4790 scope.go:117] "RemoveContainer" containerID="4dab267bc571528b186b4b0804889da2f9db40186527f7a43ba51afcd2709a0f" Apr 06 12:18:04 crc kubenswrapper[4790]: I0406 12:18:04.215713 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-756fdd77c5-2qxts"] Apr 06 12:18:04 crc kubenswrapper[4790]: I0406 12:18:04.223872 4790 scope.go:117] "RemoveContainer" containerID="60391508f559b446e6a82980edd2d11946bf8c76e11164d97fb3336cf9cbb67a" Apr 06 12:18:04 crc kubenswrapper[4790]: I0406 12:18:04.228410 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-756fdd77c5-2qxts"] Apr 06 12:18:04 crc kubenswrapper[4790]: I0406 12:18:04.525571 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591298-z2tn8" Apr 06 12:18:04 crc kubenswrapper[4790]: I0406 12:18:04.595752 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwmkr\" (UniqueName: \"kubernetes.io/projected/4acf1b39-821a-435f-9943-d6f12210cc0a-kube-api-access-jwmkr\") pod \"4acf1b39-821a-435f-9943-d6f12210cc0a\" (UID: \"4acf1b39-821a-435f-9943-d6f12210cc0a\") " Apr 06 12:18:04 crc kubenswrapper[4790]: I0406 12:18:04.600433 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4acf1b39-821a-435f-9943-d6f12210cc0a-kube-api-access-jwmkr" (OuterVolumeSpecName: "kube-api-access-jwmkr") pod "4acf1b39-821a-435f-9943-d6f12210cc0a" (UID: "4acf1b39-821a-435f-9943-d6f12210cc0a"). 
InnerVolumeSpecName "kube-api-access-jwmkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:04 crc kubenswrapper[4790]: I0406 12:18:04.698390 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwmkr\" (UniqueName: \"kubernetes.io/projected/4acf1b39-821a-435f-9943-d6f12210cc0a-kube-api-access-jwmkr\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:05 crc kubenswrapper[4790]: I0406 12:18:05.208558 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591298-z2tn8" event={"ID":"4acf1b39-821a-435f-9943-d6f12210cc0a","Type":"ContainerDied","Data":"b067359ea45c813d6e808a68c33a83c17a4a77ccc28f119316de467956d20c0c"} Apr 06 12:18:05 crc kubenswrapper[4790]: I0406 12:18:05.208847 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b067359ea45c813d6e808a68c33a83c17a4a77ccc28f119316de467956d20c0c" Apr 06 12:18:05 crc kubenswrapper[4790]: I0406 12:18:05.208698 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591298-z2tn8" Apr 06 12:18:05 crc kubenswrapper[4790]: I0406 12:18:05.596316 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591292-h54f6"] Apr 06 12:18:05 crc kubenswrapper[4790]: I0406 12:18:05.604114 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591292-h54f6"] Apr 06 12:18:05 crc kubenswrapper[4790]: I0406 12:18:05.684619 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c17f148-f3a8-4e44-b9a0-ea95774a05f1" path="/var/lib/kubelet/pods/5c17f148-f3a8-4e44-b9a0-ea95774a05f1/volumes" Apr 06 12:18:05 crc kubenswrapper[4790]: I0406 12:18:05.685227 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a167fd67-1708-4c3b-a2f2-429b58ce961a" path="/var/lib/kubelet/pods/a167fd67-1708-4c3b-a2f2-429b58ce961a/volumes" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.122012 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.440036 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.725950 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-j4vzr"] Apr 06 12:18:07 crc kubenswrapper[4790]: E0406 12:18:07.726351 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c17f148-f3a8-4e44-b9a0-ea95774a05f1" containerName="init" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.726376 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c17f148-f3a8-4e44-b9a0-ea95774a05f1" containerName="init" Apr 06 12:18:07 crc kubenswrapper[4790]: E0406 12:18:07.726399 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c17f148-f3a8-4e44-b9a0-ea95774a05f1" 
containerName="dnsmasq-dns" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.726408 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c17f148-f3a8-4e44-b9a0-ea95774a05f1" containerName="dnsmasq-dns" Apr 06 12:18:07 crc kubenswrapper[4790]: E0406 12:18:07.726436 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4acf1b39-821a-435f-9943-d6f12210cc0a" containerName="oc" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.726444 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4acf1b39-821a-435f-9943-d6f12210cc0a" containerName="oc" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.726629 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c17f148-f3a8-4e44-b9a0-ea95774a05f1" containerName="dnsmasq-dns" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.726674 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4acf1b39-821a-435f-9943-d6f12210cc0a" containerName="oc" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.727397 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-j4vzr" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.772393 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.772525 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.772545 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.772786 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k5jkj" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.781293 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-j4vzr"] Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.796041 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/notifications-rabbitmq-server-0" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.819201 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-5xzzk"] Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.820667 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5xzzk" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.830055 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5xzzk"] Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.844818 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-f124-account-create-update-rflf8"] Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.846307 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-f124-account-create-update-rflf8" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.849109 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.870719 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f124-account-create-update-rflf8"] Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.875250 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-combined-ca-bundle\") pod \"keystone-db-sync-j4vzr\" (UID: \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\") " pod="openstack/keystone-db-sync-j4vzr" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.875319 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-config-data\") pod \"keystone-db-sync-j4vzr\" (UID: \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\") " pod="openstack/keystone-db-sync-j4vzr" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.875369 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbtn7\" (UniqueName: \"kubernetes.io/projected/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-kube-api-access-rbtn7\") pod \"keystone-db-sync-j4vzr\" (UID: \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\") " pod="openstack/keystone-db-sync-j4vzr" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.917495 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-bpckw"] Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.918710 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-bpckw" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.934852 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ffac-account-create-update-gbz2f"] Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.936286 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ffac-account-create-update-gbz2f" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.939501 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.944854 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bpckw"] Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.975203 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ffac-account-create-update-gbz2f"] Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.980110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-config-data\") pod \"keystone-db-sync-j4vzr\" (UID: \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\") " pod="openstack/keystone-db-sync-j4vzr" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.980214 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbtn7\" (UniqueName: \"kubernetes.io/projected/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-kube-api-access-rbtn7\") pod \"keystone-db-sync-j4vzr\" (UID: \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\") " pod="openstack/keystone-db-sync-j4vzr" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.980252 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hflrh\" (UniqueName: \"kubernetes.io/projected/1b0e7386-a123-4a3c-a175-f4a89ab43a27-kube-api-access-hflrh\") pod 
\"barbican-db-create-5xzzk\" (UID: \"1b0e7386-a123-4a3c-a175-f4a89ab43a27\") " pod="openstack/barbican-db-create-5xzzk" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.980342 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jg7r\" (UniqueName: \"kubernetes.io/projected/d59a84b5-d0bb-4d73-ada1-e7982fd86f50-kube-api-access-8jg7r\") pod \"barbican-f124-account-create-update-rflf8\" (UID: \"d59a84b5-d0bb-4d73-ada1-e7982fd86f50\") " pod="openstack/barbican-f124-account-create-update-rflf8" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.980423 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-combined-ca-bundle\") pod \"keystone-db-sync-j4vzr\" (UID: \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\") " pod="openstack/keystone-db-sync-j4vzr" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.980447 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b0e7386-a123-4a3c-a175-f4a89ab43a27-operator-scripts\") pod \"barbican-db-create-5xzzk\" (UID: \"1b0e7386-a123-4a3c-a175-f4a89ab43a27\") " pod="openstack/barbican-db-create-5xzzk" Apr 06 12:18:07 crc kubenswrapper[4790]: I0406 12:18:07.980476 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d59a84b5-d0bb-4d73-ada1-e7982fd86f50-operator-scripts\") pod \"barbican-f124-account-create-update-rflf8\" (UID: \"d59a84b5-d0bb-4d73-ada1-e7982fd86f50\") " pod="openstack/barbican-f124-account-create-update-rflf8" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.002917 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-config-data\") pod \"keystone-db-sync-j4vzr\" (UID: \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\") " pod="openstack/keystone-db-sync-j4vzr" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.003574 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-combined-ca-bundle\") pod \"keystone-db-sync-j4vzr\" (UID: \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\") " pod="openstack/keystone-db-sync-j4vzr" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.006458 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbtn7\" (UniqueName: \"kubernetes.io/projected/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-kube-api-access-rbtn7\") pod \"keystone-db-sync-j4vzr\" (UID: \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\") " pod="openstack/keystone-db-sync-j4vzr" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.081606 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hflrh\" (UniqueName: \"kubernetes.io/projected/1b0e7386-a123-4a3c-a175-f4a89ab43a27-kube-api-access-hflrh\") pod \"barbican-db-create-5xzzk\" (UID: \"1b0e7386-a123-4a3c-a175-f4a89ab43a27\") " pod="openstack/barbican-db-create-5xzzk" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.081714 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jg7r\" (UniqueName: \"kubernetes.io/projected/d59a84b5-d0bb-4d73-ada1-e7982fd86f50-kube-api-access-8jg7r\") pod \"barbican-f124-account-create-update-rflf8\" (UID: \"d59a84b5-d0bb-4d73-ada1-e7982fd86f50\") " pod="openstack/barbican-f124-account-create-update-rflf8" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.081757 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1ef1835f-4003-4fe1-9157-aac7e54d5f94-operator-scripts\") pod \"cinder-ffac-account-create-update-gbz2f\" (UID: \"1ef1835f-4003-4fe1-9157-aac7e54d5f94\") " pod="openstack/cinder-ffac-account-create-update-gbz2f" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.081807 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b0e7386-a123-4a3c-a175-f4a89ab43a27-operator-scripts\") pod \"barbican-db-create-5xzzk\" (UID: \"1b0e7386-a123-4a3c-a175-f4a89ab43a27\") " pod="openstack/barbican-db-create-5xzzk" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.081863 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkwxw\" (UniqueName: \"kubernetes.io/projected/e61172bd-20f5-49e9-886b-01c8cf8ce7cc-kube-api-access-mkwxw\") pod \"cinder-db-create-bpckw\" (UID: \"e61172bd-20f5-49e9-886b-01c8cf8ce7cc\") " pod="openstack/cinder-db-create-bpckw" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.081888 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d59a84b5-d0bb-4d73-ada1-e7982fd86f50-operator-scripts\") pod \"barbican-f124-account-create-update-rflf8\" (UID: \"d59a84b5-d0bb-4d73-ada1-e7982fd86f50\") " pod="openstack/barbican-f124-account-create-update-rflf8" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.081911 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e61172bd-20f5-49e9-886b-01c8cf8ce7cc-operator-scripts\") pod \"cinder-db-create-bpckw\" (UID: \"e61172bd-20f5-49e9-886b-01c8cf8ce7cc\") " pod="openstack/cinder-db-create-bpckw" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.081956 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-7kdx8\" (UniqueName: \"kubernetes.io/projected/1ef1835f-4003-4fe1-9157-aac7e54d5f94-kube-api-access-7kdx8\") pod \"cinder-ffac-account-create-update-gbz2f\" (UID: \"1ef1835f-4003-4fe1-9157-aac7e54d5f94\") " pod="openstack/cinder-ffac-account-create-update-gbz2f" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.082715 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b0e7386-a123-4a3c-a175-f4a89ab43a27-operator-scripts\") pod \"barbican-db-create-5xzzk\" (UID: \"1b0e7386-a123-4a3c-a175-f4a89ab43a27\") " pod="openstack/barbican-db-create-5xzzk" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.082981 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d59a84b5-d0bb-4d73-ada1-e7982fd86f50-operator-scripts\") pod \"barbican-f124-account-create-update-rflf8\" (UID: \"d59a84b5-d0bb-4d73-ada1-e7982fd86f50\") " pod="openstack/barbican-f124-account-create-update-rflf8" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.084412 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-j4vzr" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.100493 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hflrh\" (UniqueName: \"kubernetes.io/projected/1b0e7386-a123-4a3c-a175-f4a89ab43a27-kube-api-access-hflrh\") pod \"barbican-db-create-5xzzk\" (UID: \"1b0e7386-a123-4a3c-a175-f4a89ab43a27\") " pod="openstack/barbican-db-create-5xzzk" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.111879 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jg7r\" (UniqueName: \"kubernetes.io/projected/d59a84b5-d0bb-4d73-ada1-e7982fd86f50-kube-api-access-8jg7r\") pod \"barbican-f124-account-create-update-rflf8\" (UID: \"d59a84b5-d0bb-4d73-ada1-e7982fd86f50\") " pod="openstack/barbican-f124-account-create-update-rflf8" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.143234 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5xzzk" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.168208 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-f124-account-create-update-rflf8" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.185767 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ef1835f-4003-4fe1-9157-aac7e54d5f94-operator-scripts\") pod \"cinder-ffac-account-create-update-gbz2f\" (UID: \"1ef1835f-4003-4fe1-9157-aac7e54d5f94\") " pod="openstack/cinder-ffac-account-create-update-gbz2f" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.186046 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkwxw\" (UniqueName: \"kubernetes.io/projected/e61172bd-20f5-49e9-886b-01c8cf8ce7cc-kube-api-access-mkwxw\") pod \"cinder-db-create-bpckw\" (UID: \"e61172bd-20f5-49e9-886b-01c8cf8ce7cc\") " pod="openstack/cinder-db-create-bpckw" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.186123 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e61172bd-20f5-49e9-886b-01c8cf8ce7cc-operator-scripts\") pod \"cinder-db-create-bpckw\" (UID: \"e61172bd-20f5-49e9-886b-01c8cf8ce7cc\") " pod="openstack/cinder-db-create-bpckw" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.186209 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kdx8\" (UniqueName: \"kubernetes.io/projected/1ef1835f-4003-4fe1-9157-aac7e54d5f94-kube-api-access-7kdx8\") pod \"cinder-ffac-account-create-update-gbz2f\" (UID: \"1ef1835f-4003-4fe1-9157-aac7e54d5f94\") " pod="openstack/cinder-ffac-account-create-update-gbz2f" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.187079 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ef1835f-4003-4fe1-9157-aac7e54d5f94-operator-scripts\") pod \"cinder-ffac-account-create-update-gbz2f\" 
(UID: \"1ef1835f-4003-4fe1-9157-aac7e54d5f94\") " pod="openstack/cinder-ffac-account-create-update-gbz2f" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.187252 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e61172bd-20f5-49e9-886b-01c8cf8ce7cc-operator-scripts\") pod \"cinder-db-create-bpckw\" (UID: \"e61172bd-20f5-49e9-886b-01c8cf8ce7cc\") " pod="openstack/cinder-db-create-bpckw" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.215411 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kdx8\" (UniqueName: \"kubernetes.io/projected/1ef1835f-4003-4fe1-9157-aac7e54d5f94-kube-api-access-7kdx8\") pod \"cinder-ffac-account-create-update-gbz2f\" (UID: \"1ef1835f-4003-4fe1-9157-aac7e54d5f94\") " pod="openstack/cinder-ffac-account-create-update-gbz2f" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.229221 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkwxw\" (UniqueName: \"kubernetes.io/projected/e61172bd-20f5-49e9-886b-01c8cf8ce7cc-kube-api-access-mkwxw\") pod \"cinder-db-create-bpckw\" (UID: \"e61172bd-20f5-49e9-886b-01c8cf8ce7cc\") " pod="openstack/cinder-db-create-bpckw" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.247260 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bpckw" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.260013 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-756fdd77c5-2qxts" podUID="5c17f148-f3a8-4e44-b9a0-ea95774a05f1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: i/o timeout" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.303864 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ffac-account-create-update-gbz2f" Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.729030 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-j4vzr"] Apr 06 12:18:08 crc kubenswrapper[4790]: W0406 12:18:08.885417 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b0e7386_a123_4a3c_a175_f4a89ab43a27.slice/crio-d3b601a612d17253669cfa5459e120b0446fc55cb9e22ed6363a936087e22a32 WatchSource:0}: Error finding container d3b601a612d17253669cfa5459e120b0446fc55cb9e22ed6363a936087e22a32: Status 404 returned error can't find the container with id d3b601a612d17253669cfa5459e120b0446fc55cb9e22ed6363a936087e22a32 Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.893271 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5xzzk"] Apr 06 12:18:08 crc kubenswrapper[4790]: W0406 12:18:08.900478 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd59a84b5_d0bb_4d73_ada1_e7982fd86f50.slice/crio-b5fcc5197a20ad299342029c659c06eb53be5e8428708cc5c87539737f0a6042 WatchSource:0}: Error finding container b5fcc5197a20ad299342029c659c06eb53be5e8428708cc5c87539737f0a6042: Status 404 returned error can't find the container with id b5fcc5197a20ad299342029c659c06eb53be5e8428708cc5c87539737f0a6042 Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.906918 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f124-account-create-update-rflf8"] Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.953567 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bpckw"] Apr 06 12:18:08 crc kubenswrapper[4790]: W0406 12:18:08.959457 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode61172bd_20f5_49e9_886b_01c8cf8ce7cc.slice/crio-73063cbd9a4c3e91dc2a1043fe7775d671c08beaaaffb6fdc59c74dc50b69854 WatchSource:0}: Error finding container 73063cbd9a4c3e91dc2a1043fe7775d671c08beaaaffb6fdc59c74dc50b69854: Status 404 returned error can't find the container with id 73063cbd9a4c3e91dc2a1043fe7775d671c08beaaaffb6fdc59c74dc50b69854 Apr 06 12:18:08 crc kubenswrapper[4790]: I0406 12:18:08.995324 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ffac-account-create-update-gbz2f"] Apr 06 12:18:09 crc kubenswrapper[4790]: W0406 12:18:09.003566 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ef1835f_4003_4fe1_9157_aac7e54d5f94.slice/crio-1246c21ecb31381cca4ec22a30638e74e4e89432b868539a3c2aef54ef2f8ffc WatchSource:0}: Error finding container 1246c21ecb31381cca4ec22a30638e74e4e89432b868539a3c2aef54ef2f8ffc: Status 404 returned error can't find the container with id 1246c21ecb31381cca4ec22a30638e74e4e89432b868539a3c2aef54ef2f8ffc Apr 06 12:18:09 crc kubenswrapper[4790]: I0406 12:18:09.265639 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ffac-account-create-update-gbz2f" event={"ID":"1ef1835f-4003-4fe1-9157-aac7e54d5f94","Type":"ContainerStarted","Data":"3d10f0528a4c4d40d78c00caf97f968f7339063019b97f60497a9d2aae1a4e5f"} Apr 06 12:18:09 crc kubenswrapper[4790]: I0406 12:18:09.266004 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ffac-account-create-update-gbz2f" event={"ID":"1ef1835f-4003-4fe1-9157-aac7e54d5f94","Type":"ContainerStarted","Data":"1246c21ecb31381cca4ec22a30638e74e4e89432b868539a3c2aef54ef2f8ffc"} Apr 06 12:18:09 crc kubenswrapper[4790]: I0406 12:18:09.268020 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bpckw" 
event={"ID":"e61172bd-20f5-49e9-886b-01c8cf8ce7cc","Type":"ContainerStarted","Data":"8b7f10527fc797bce82e9bc8de68b45de4b7c4dce8cee9f414f59c8d0e5b3f6d"} Apr 06 12:18:09 crc kubenswrapper[4790]: I0406 12:18:09.268052 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bpckw" event={"ID":"e61172bd-20f5-49e9-886b-01c8cf8ce7cc","Type":"ContainerStarted","Data":"73063cbd9a4c3e91dc2a1043fe7775d671c08beaaaffb6fdc59c74dc50b69854"} Apr 06 12:18:09 crc kubenswrapper[4790]: I0406 12:18:09.272248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5xzzk" event={"ID":"1b0e7386-a123-4a3c-a175-f4a89ab43a27","Type":"ContainerStarted","Data":"903969f63c571ecb0cce7af8395dd4793943db9b210217883734bb9e5176f6d9"} Apr 06 12:18:09 crc kubenswrapper[4790]: I0406 12:18:09.272292 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5xzzk" event={"ID":"1b0e7386-a123-4a3c-a175-f4a89ab43a27","Type":"ContainerStarted","Data":"d3b601a612d17253669cfa5459e120b0446fc55cb9e22ed6363a936087e22a32"} Apr 06 12:18:09 crc kubenswrapper[4790]: I0406 12:18:09.280982 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f124-account-create-update-rflf8" event={"ID":"d59a84b5-d0bb-4d73-ada1-e7982fd86f50","Type":"ContainerStarted","Data":"9e2b79d59e01f7d6364ee63f9b1187bfcc162e73e96dcd12c19b27b2a6bdc0cd"} Apr 06 12:18:09 crc kubenswrapper[4790]: I0406 12:18:09.281040 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f124-account-create-update-rflf8" event={"ID":"d59a84b5-d0bb-4d73-ada1-e7982fd86f50","Type":"ContainerStarted","Data":"b5fcc5197a20ad299342029c659c06eb53be5e8428708cc5c87539737f0a6042"} Apr 06 12:18:09 crc kubenswrapper[4790]: I0406 12:18:09.285658 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-j4vzr" 
event={"ID":"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01","Type":"ContainerStarted","Data":"eaaca040692047ba3e6c455c47d8c449217f6bdef0b17d16fb59b1ab029c6e8d"} Apr 06 12:18:09 crc kubenswrapper[4790]: I0406 12:18:09.295115 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-bpckw" podStartSLOduration=2.295093962 podStartE2EDuration="2.295093962s" podCreationTimestamp="2026-04-06 12:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:09.281034712 +0000 UTC m=+1268.268777578" watchObservedRunningTime="2026-04-06 12:18:09.295093962 +0000 UTC m=+1268.282836828" Apr 06 12:18:09 crc kubenswrapper[4790]: I0406 12:18:09.299192 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-f124-account-create-update-rflf8" podStartSLOduration=2.299169619 podStartE2EDuration="2.299169619s" podCreationTimestamp="2026-04-06 12:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:09.296710104 +0000 UTC m=+1268.284452970" watchObservedRunningTime="2026-04-06 12:18:09.299169619 +0000 UTC m=+1268.286912485" Apr 06 12:18:09 crc kubenswrapper[4790]: I0406 12:18:09.327615 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-5xzzk" podStartSLOduration=2.327590277 podStartE2EDuration="2.327590277s" podCreationTimestamp="2026-04-06 12:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:09.319777901 +0000 UTC m=+1268.307520787" watchObservedRunningTime="2026-04-06 12:18:09.327590277 +0000 UTC m=+1268.315333143" Apr 06 12:18:09 crc kubenswrapper[4790]: I0406 12:18:09.753323 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:18:09 crc kubenswrapper[4790]: I0406 12:18:09.753393 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.304513 4790 generic.go:334] "Generic (PLEG): container finished" podID="d59a84b5-d0bb-4d73-ada1-e7982fd86f50" containerID="9e2b79d59e01f7d6364ee63f9b1187bfcc162e73e96dcd12c19b27b2a6bdc0cd" exitCode=0 Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.304570 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f124-account-create-update-rflf8" event={"ID":"d59a84b5-d0bb-4d73-ada1-e7982fd86f50","Type":"ContainerDied","Data":"9e2b79d59e01f7d6364ee63f9b1187bfcc162e73e96dcd12c19b27b2a6bdc0cd"} Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.307260 4790 generic.go:334] "Generic (PLEG): container finished" podID="1ef1835f-4003-4fe1-9157-aac7e54d5f94" containerID="3d10f0528a4c4d40d78c00caf97f968f7339063019b97f60497a9d2aae1a4e5f" exitCode=0 Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.307307 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ffac-account-create-update-gbz2f" event={"ID":"1ef1835f-4003-4fe1-9157-aac7e54d5f94","Type":"ContainerDied","Data":"3d10f0528a4c4d40d78c00caf97f968f7339063019b97f60497a9d2aae1a4e5f"} Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.307324 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-8r6lc"] Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.308416 4790 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.312969 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-g7jzm" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.313207 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.320020 4790 generic.go:334] "Generic (PLEG): container finished" podID="e61172bd-20f5-49e9-886b-01c8cf8ce7cc" containerID="8b7f10527fc797bce82e9bc8de68b45de4b7c4dce8cee9f414f59c8d0e5b3f6d" exitCode=0 Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.320422 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bpckw" event={"ID":"e61172bd-20f5-49e9-886b-01c8cf8ce7cc","Type":"ContainerDied","Data":"8b7f10527fc797bce82e9bc8de68b45de4b7c4dce8cee9f414f59c8d0e5b3f6d"} Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.326036 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-8r6lc"] Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.350626 4790 generic.go:334] "Generic (PLEG): container finished" podID="1b0e7386-a123-4a3c-a175-f4a89ab43a27" containerID="903969f63c571ecb0cce7af8395dd4793943db9b210217883734bb9e5176f6d9" exitCode=0 Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.350676 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5xzzk" event={"ID":"1b0e7386-a123-4a3c-a175-f4a89ab43a27","Type":"ContainerDied","Data":"903969f63c571ecb0cce7af8395dd4793943db9b210217883734bb9e5176f6d9"} Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.424760 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-combined-ca-bundle\") pod \"watcher-db-sync-8r6lc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.424822 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp6lv\" (UniqueName: \"kubernetes.io/projected/ccc30661-4ea6-4218-b75d-b59f473d41bc-kube-api-access-xp6lv\") pod \"watcher-db-sync-8r6lc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.424940 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-db-sync-config-data\") pod \"watcher-db-sync-8r6lc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.425185 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-config-data\") pod \"watcher-db-sync-8r6lc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.514271 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-jwff6"] Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.515460 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-jwff6" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.526589 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-db-sync-config-data\") pod \"watcher-db-sync-8r6lc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.526851 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-config-data\") pod \"watcher-db-sync-8r6lc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.526915 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-combined-ca-bundle\") pod \"watcher-db-sync-8r6lc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.526940 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp6lv\" (UniqueName: \"kubernetes.io/projected/ccc30661-4ea6-4218-b75d-b59f473d41bc-kube-api-access-xp6lv\") pod \"watcher-db-sync-8r6lc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.535702 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-db-sync-config-data\") pod \"watcher-db-sync-8r6lc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:10 crc kubenswrapper[4790]: 
I0406 12:18:10.544859 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-config-data\") pod \"watcher-db-sync-8r6lc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.554425 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp6lv\" (UniqueName: \"kubernetes.io/projected/ccc30661-4ea6-4218-b75d-b59f473d41bc-kube-api-access-xp6lv\") pod \"watcher-db-sync-8r6lc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.566259 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-combined-ca-bundle\") pod \"watcher-db-sync-8r6lc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.591020 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jwff6"] Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.625673 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.628368 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d55899-25c2-467d-ab1b-202029b76a86-operator-scripts\") pod \"neutron-db-create-jwff6\" (UID: \"85d55899-25c2-467d-ab1b-202029b76a86\") " pod="openstack/neutron-db-create-jwff6" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.628454 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghkpb\" (UniqueName: \"kubernetes.io/projected/85d55899-25c2-467d-ab1b-202029b76a86-kube-api-access-ghkpb\") pod \"neutron-db-create-jwff6\" (UID: \"85d55899-25c2-467d-ab1b-202029b76a86\") " pod="openstack/neutron-db-create-jwff6" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.639107 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8bc9-account-create-update-dm9bc"] Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.644468 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8bc9-account-create-update-dm9bc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.645919 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.662608 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8bc9-account-create-update-dm9bc"] Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.730362 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d41bab77-7a21-4962-bd9b-82fa33b47aaa-operator-scripts\") pod \"neutron-8bc9-account-create-update-dm9bc\" (UID: \"d41bab77-7a21-4962-bd9b-82fa33b47aaa\") " pod="openstack/neutron-8bc9-account-create-update-dm9bc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.730470 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d55899-25c2-467d-ab1b-202029b76a86-operator-scripts\") pod \"neutron-db-create-jwff6\" (UID: \"85d55899-25c2-467d-ab1b-202029b76a86\") " pod="openstack/neutron-db-create-jwff6" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.730509 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqz5d\" (UniqueName: \"kubernetes.io/projected/d41bab77-7a21-4962-bd9b-82fa33b47aaa-kube-api-access-vqz5d\") pod \"neutron-8bc9-account-create-update-dm9bc\" (UID: \"d41bab77-7a21-4962-bd9b-82fa33b47aaa\") " pod="openstack/neutron-8bc9-account-create-update-dm9bc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.730548 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghkpb\" (UniqueName: \"kubernetes.io/projected/85d55899-25c2-467d-ab1b-202029b76a86-kube-api-access-ghkpb\") pod \"neutron-db-create-jwff6\" (UID: 
\"85d55899-25c2-467d-ab1b-202029b76a86\") " pod="openstack/neutron-db-create-jwff6" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.731648 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d55899-25c2-467d-ab1b-202029b76a86-operator-scripts\") pod \"neutron-db-create-jwff6\" (UID: \"85d55899-25c2-467d-ab1b-202029b76a86\") " pod="openstack/neutron-db-create-jwff6" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.748327 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghkpb\" (UniqueName: \"kubernetes.io/projected/85d55899-25c2-467d-ab1b-202029b76a86-kube-api-access-ghkpb\") pod \"neutron-db-create-jwff6\" (UID: \"85d55899-25c2-467d-ab1b-202029b76a86\") " pod="openstack/neutron-db-create-jwff6" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.832016 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqz5d\" (UniqueName: \"kubernetes.io/projected/d41bab77-7a21-4962-bd9b-82fa33b47aaa-kube-api-access-vqz5d\") pod \"neutron-8bc9-account-create-update-dm9bc\" (UID: \"d41bab77-7a21-4962-bd9b-82fa33b47aaa\") " pod="openstack/neutron-8bc9-account-create-update-dm9bc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.832131 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d41bab77-7a21-4962-bd9b-82fa33b47aaa-operator-scripts\") pod \"neutron-8bc9-account-create-update-dm9bc\" (UID: \"d41bab77-7a21-4962-bd9b-82fa33b47aaa\") " pod="openstack/neutron-8bc9-account-create-update-dm9bc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.836073 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d41bab77-7a21-4962-bd9b-82fa33b47aaa-operator-scripts\") pod \"neutron-8bc9-account-create-update-dm9bc\" (UID: 
\"d41bab77-7a21-4962-bd9b-82fa33b47aaa\") " pod="openstack/neutron-8bc9-account-create-update-dm9bc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.854113 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jwff6" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.854506 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqz5d\" (UniqueName: \"kubernetes.io/projected/d41bab77-7a21-4962-bd9b-82fa33b47aaa-kube-api-access-vqz5d\") pod \"neutron-8bc9-account-create-update-dm9bc\" (UID: \"d41bab77-7a21-4962-bd9b-82fa33b47aaa\") " pod="openstack/neutron-8bc9-account-create-update-dm9bc" Apr 06 12:18:10 crc kubenswrapper[4790]: I0406 12:18:10.965367 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8bc9-account-create-update-dm9bc" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.404661 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ffac-account-create-update-gbz2f" event={"ID":"1ef1835f-4003-4fe1-9157-aac7e54d5f94","Type":"ContainerDied","Data":"1246c21ecb31381cca4ec22a30638e74e4e89432b868539a3c2aef54ef2f8ffc"} Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.405552 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1246c21ecb31381cca4ec22a30638e74e4e89432b868539a3c2aef54ef2f8ffc" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.414205 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bpckw" event={"ID":"e61172bd-20f5-49e9-886b-01c8cf8ce7cc","Type":"ContainerDied","Data":"73063cbd9a4c3e91dc2a1043fe7775d671c08beaaaffb6fdc59c74dc50b69854"} Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.414251 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73063cbd9a4c3e91dc2a1043fe7775d671c08beaaaffb6fdc59c74dc50b69854" Apr 06 12:18:13 crc 
kubenswrapper[4790]: I0406 12:18:13.427072 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bpckw" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.433891 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5xzzk" event={"ID":"1b0e7386-a123-4a3c-a175-f4a89ab43a27","Type":"ContainerDied","Data":"d3b601a612d17253669cfa5459e120b0446fc55cb9e22ed6363a936087e22a32"} Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.433927 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3b601a612d17253669cfa5459e120b0446fc55cb9e22ed6363a936087e22a32" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.435891 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f124-account-create-update-rflf8" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.436740 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f124-account-create-update-rflf8" event={"ID":"d59a84b5-d0bb-4d73-ada1-e7982fd86f50","Type":"ContainerDied","Data":"b5fcc5197a20ad299342029c659c06eb53be5e8428708cc5c87539737f0a6042"} Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.436781 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5fcc5197a20ad299342029c659c06eb53be5e8428708cc5c87539737f0a6042" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.573460 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5xzzk" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.580985 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ffac-account-create-update-gbz2f" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.589647 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jg7r\" (UniqueName: \"kubernetes.io/projected/d59a84b5-d0bb-4d73-ada1-e7982fd86f50-kube-api-access-8jg7r\") pod \"d59a84b5-d0bb-4d73-ada1-e7982fd86f50\" (UID: \"d59a84b5-d0bb-4d73-ada1-e7982fd86f50\") " Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.589713 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d59a84b5-d0bb-4d73-ada1-e7982fd86f50-operator-scripts\") pod \"d59a84b5-d0bb-4d73-ada1-e7982fd86f50\" (UID: \"d59a84b5-d0bb-4d73-ada1-e7982fd86f50\") " Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.589759 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkwxw\" (UniqueName: \"kubernetes.io/projected/e61172bd-20f5-49e9-886b-01c8cf8ce7cc-kube-api-access-mkwxw\") pod \"e61172bd-20f5-49e9-886b-01c8cf8ce7cc\" (UID: \"e61172bd-20f5-49e9-886b-01c8cf8ce7cc\") " Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.589793 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e61172bd-20f5-49e9-886b-01c8cf8ce7cc-operator-scripts\") pod \"e61172bd-20f5-49e9-886b-01c8cf8ce7cc\" (UID: \"e61172bd-20f5-49e9-886b-01c8cf8ce7cc\") " Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.591704 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e61172bd-20f5-49e9-886b-01c8cf8ce7cc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e61172bd-20f5-49e9-886b-01c8cf8ce7cc" (UID: "e61172bd-20f5-49e9-886b-01c8cf8ce7cc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.596690 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d59a84b5-d0bb-4d73-ada1-e7982fd86f50-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d59a84b5-d0bb-4d73-ada1-e7982fd86f50" (UID: "d59a84b5-d0bb-4d73-ada1-e7982fd86f50"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.597109 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d59a84b5-d0bb-4d73-ada1-e7982fd86f50-kube-api-access-8jg7r" (OuterVolumeSpecName: "kube-api-access-8jg7r") pod "d59a84b5-d0bb-4d73-ada1-e7982fd86f50" (UID: "d59a84b5-d0bb-4d73-ada1-e7982fd86f50"). InnerVolumeSpecName "kube-api-access-8jg7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.604047 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e61172bd-20f5-49e9-886b-01c8cf8ce7cc-kube-api-access-mkwxw" (OuterVolumeSpecName: "kube-api-access-mkwxw") pod "e61172bd-20f5-49e9-886b-01c8cf8ce7cc" (UID: "e61172bd-20f5-49e9-886b-01c8cf8ce7cc"). InnerVolumeSpecName "kube-api-access-mkwxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.691747 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kdx8\" (UniqueName: \"kubernetes.io/projected/1ef1835f-4003-4fe1-9157-aac7e54d5f94-kube-api-access-7kdx8\") pod \"1ef1835f-4003-4fe1-9157-aac7e54d5f94\" (UID: \"1ef1835f-4003-4fe1-9157-aac7e54d5f94\") " Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.691920 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ef1835f-4003-4fe1-9157-aac7e54d5f94-operator-scripts\") pod \"1ef1835f-4003-4fe1-9157-aac7e54d5f94\" (UID: \"1ef1835f-4003-4fe1-9157-aac7e54d5f94\") " Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.692017 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hflrh\" (UniqueName: \"kubernetes.io/projected/1b0e7386-a123-4a3c-a175-f4a89ab43a27-kube-api-access-hflrh\") pod \"1b0e7386-a123-4a3c-a175-f4a89ab43a27\" (UID: \"1b0e7386-a123-4a3c-a175-f4a89ab43a27\") " Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.692046 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b0e7386-a123-4a3c-a175-f4a89ab43a27-operator-scripts\") pod \"1b0e7386-a123-4a3c-a175-f4a89ab43a27\" (UID: \"1b0e7386-a123-4a3c-a175-f4a89ab43a27\") " Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.693004 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef1835f-4003-4fe1-9157-aac7e54d5f94-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ef1835f-4003-4fe1-9157-aac7e54d5f94" (UID: "1ef1835f-4003-4fe1-9157-aac7e54d5f94"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.693272 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b0e7386-a123-4a3c-a175-f4a89ab43a27-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b0e7386-a123-4a3c-a175-f4a89ab43a27" (UID: "1b0e7386-a123-4a3c-a175-f4a89ab43a27"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.693512 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b0e7386-a123-4a3c-a175-f4a89ab43a27-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.693533 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jg7r\" (UniqueName: \"kubernetes.io/projected/d59a84b5-d0bb-4d73-ada1-e7982fd86f50-kube-api-access-8jg7r\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.693548 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d59a84b5-d0bb-4d73-ada1-e7982fd86f50-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.693562 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkwxw\" (UniqueName: \"kubernetes.io/projected/e61172bd-20f5-49e9-886b-01c8cf8ce7cc-kube-api-access-mkwxw\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.693574 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e61172bd-20f5-49e9-886b-01c8cf8ce7cc-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.693586 4790 reconciler_common.go:293] "Volume detached for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ef1835f-4003-4fe1-9157-aac7e54d5f94-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.696612 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef1835f-4003-4fe1-9157-aac7e54d5f94-kube-api-access-7kdx8" (OuterVolumeSpecName: "kube-api-access-7kdx8") pod "1ef1835f-4003-4fe1-9157-aac7e54d5f94" (UID: "1ef1835f-4003-4fe1-9157-aac7e54d5f94"). InnerVolumeSpecName "kube-api-access-7kdx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.696650 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b0e7386-a123-4a3c-a175-f4a89ab43a27-kube-api-access-hflrh" (OuterVolumeSpecName: "kube-api-access-hflrh") pod "1b0e7386-a123-4a3c-a175-f4a89ab43a27" (UID: "1b0e7386-a123-4a3c-a175-f4a89ab43a27"). InnerVolumeSpecName "kube-api-access-hflrh". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.750386 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8bc9-account-create-update-dm9bc"]
Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.795530 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hflrh\" (UniqueName: \"kubernetes.io/projected/1b0e7386-a123-4a3c-a175-f4a89ab43a27-kube-api-access-hflrh\") on node \"crc\" DevicePath \"\""
Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.795559 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kdx8\" (UniqueName: \"kubernetes.io/projected/1ef1835f-4003-4fe1-9157-aac7e54d5f94-kube-api-access-7kdx8\") on node \"crc\" DevicePath \"\""
Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.813645 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-8r6lc"]
Apr 06 12:18:13 crc kubenswrapper[4790]: I0406 12:18:13.877089 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jwff6"]
Apr 06 12:18:13 crc kubenswrapper[4790]: W0406 12:18:13.882113 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85d55899_25c2_467d_ab1b_202029b76a86.slice/crio-c226ba08e33e35089e127e92520b46e870fd202827c285eb0d6c6708f59a1054 WatchSource:0}: Error finding container c226ba08e33e35089e127e92520b46e870fd202827c285eb0d6c6708f59a1054: Status 404 returned error can't find the container with id c226ba08e33e35089e127e92520b46e870fd202827c285eb0d6c6708f59a1054
Apr 06 12:18:14 crc kubenswrapper[4790]: I0406 12:18:14.446472 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-j4vzr" event={"ID":"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01","Type":"ContainerStarted","Data":"34a822eed379db6c614f2b17c23ca0e4ef2502736e08159981bce01aef8fcceb"}
Apr 06 12:18:14 crc kubenswrapper[4790]: I0406 12:18:14.449763 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8r6lc" event={"ID":"ccc30661-4ea6-4218-b75d-b59f473d41bc","Type":"ContainerStarted","Data":"c9164921f59172ba03f91c9c8c8549499bb016d490e45a77b50fe03feaa95c85"}
Apr 06 12:18:14 crc kubenswrapper[4790]: I0406 12:18:14.457741 4790 generic.go:334] "Generic (PLEG): container finished" podID="d41bab77-7a21-4962-bd9b-82fa33b47aaa" containerID="6ba15c503dadda2929be103c5c54a33f67c17e6c345a48acb0408358340d4058" exitCode=0
Apr 06 12:18:14 crc kubenswrapper[4790]: I0406 12:18:14.457817 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8bc9-account-create-update-dm9bc" event={"ID":"d41bab77-7a21-4962-bd9b-82fa33b47aaa","Type":"ContainerDied","Data":"6ba15c503dadda2929be103c5c54a33f67c17e6c345a48acb0408358340d4058"}
Apr 06 12:18:14 crc kubenswrapper[4790]: I0406 12:18:14.457855 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8bc9-account-create-update-dm9bc" event={"ID":"d41bab77-7a21-4962-bd9b-82fa33b47aaa","Type":"ContainerStarted","Data":"c40e0651e0f84a90368a8ad77fffe8d37cbd7dd5f0ea6f318f5b715ba326db4a"}
Apr 06 12:18:14 crc kubenswrapper[4790]: I0406 12:18:14.459579 4790 generic.go:334] "Generic (PLEG): container finished" podID="85d55899-25c2-467d-ab1b-202029b76a86" containerID="331a4e9a44a7941c08218957f7d03252d32130bb800e78c1f49d02d5deaae5dd" exitCode=0
Apr 06 12:18:14 crc kubenswrapper[4790]: I0406 12:18:14.459849 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jwff6" event={"ID":"85d55899-25c2-467d-ab1b-202029b76a86","Type":"ContainerDied","Data":"331a4e9a44a7941c08218957f7d03252d32130bb800e78c1f49d02d5deaae5dd"}
Apr 06 12:18:14 crc kubenswrapper[4790]: I0406 12:18:14.459905 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f124-account-create-update-rflf8"
Apr 06 12:18:14 crc kubenswrapper[4790]: I0406 12:18:14.459965 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5xzzk"
Apr 06 12:18:14 crc kubenswrapper[4790]: I0406 12:18:14.459876 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ffac-account-create-update-gbz2f"
Apr 06 12:18:14 crc kubenswrapper[4790]: I0406 12:18:14.459914 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jwff6" event={"ID":"85d55899-25c2-467d-ab1b-202029b76a86","Type":"ContainerStarted","Data":"c226ba08e33e35089e127e92520b46e870fd202827c285eb0d6c6708f59a1054"}
Apr 06 12:18:14 crc kubenswrapper[4790]: I0406 12:18:14.461057 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bpckw"
Apr 06 12:18:14 crc kubenswrapper[4790]: I0406 12:18:14.492557 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-j4vzr" podStartSLOduration=2.93483815 podStartE2EDuration="7.492537223s" podCreationTimestamp="2026-04-06 12:18:07 +0000 UTC" firstStartedPulling="2026-04-06 12:18:08.732490605 +0000 UTC m=+1267.720233471" lastFinishedPulling="2026-04-06 12:18:13.290189668 +0000 UTC m=+1272.277932544" observedRunningTime="2026-04-06 12:18:14.475038182 +0000 UTC m=+1273.462781048" watchObservedRunningTime="2026-04-06 12:18:14.492537223 +0000 UTC m=+1273.480280089"
Apr 06 12:18:17 crc kubenswrapper[4790]: I0406 12:18:17.495650 4790 generic.go:334] "Generic (PLEG): container finished" podID="68c696c2-cc67-4dce-9b9a-5c5ec2c14f01" containerID="34a822eed379db6c614f2b17c23ca0e4ef2502736e08159981bce01aef8fcceb" exitCode=0
Apr 06 12:18:17 crc kubenswrapper[4790]: I0406 12:18:17.496154 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-j4vzr" event={"ID":"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01","Type":"ContainerDied","Data":"34a822eed379db6c614f2b17c23ca0e4ef2502736e08159981bce01aef8fcceb"}
Apr 06 12:18:18 crc kubenswrapper[4790]: I0406 12:18:18.759709 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jwff6"
Apr 06 12:18:18 crc kubenswrapper[4790]: I0406 12:18:18.892748 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d55899-25c2-467d-ab1b-202029b76a86-operator-scripts\") pod \"85d55899-25c2-467d-ab1b-202029b76a86\" (UID: \"85d55899-25c2-467d-ab1b-202029b76a86\") "
Apr 06 12:18:18 crc kubenswrapper[4790]: I0406 12:18:18.892903 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghkpb\" (UniqueName: \"kubernetes.io/projected/85d55899-25c2-467d-ab1b-202029b76a86-kube-api-access-ghkpb\") pod \"85d55899-25c2-467d-ab1b-202029b76a86\" (UID: \"85d55899-25c2-467d-ab1b-202029b76a86\") "
Apr 06 12:18:18 crc kubenswrapper[4790]: I0406 12:18:18.894009 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85d55899-25c2-467d-ab1b-202029b76a86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85d55899-25c2-467d-ab1b-202029b76a86" (UID: "85d55899-25c2-467d-ab1b-202029b76a86"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:18:18 crc kubenswrapper[4790]: I0406 12:18:18.898290 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85d55899-25c2-467d-ab1b-202029b76a86-kube-api-access-ghkpb" (OuterVolumeSpecName: "kube-api-access-ghkpb") pod "85d55899-25c2-467d-ab1b-202029b76a86" (UID: "85d55899-25c2-467d-ab1b-202029b76a86"). InnerVolumeSpecName "kube-api-access-ghkpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:18:18 crc kubenswrapper[4790]: I0406 12:18:18.994902 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d55899-25c2-467d-ab1b-202029b76a86-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 06 12:18:18 crc kubenswrapper[4790]: I0406 12:18:18.994991 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghkpb\" (UniqueName: \"kubernetes.io/projected/85d55899-25c2-467d-ab1b-202029b76a86-kube-api-access-ghkpb\") on node \"crc\" DevicePath \"\""
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.430605 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8bc9-account-create-update-dm9bc"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.439681 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-j4vzr"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.506132 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqz5d\" (UniqueName: \"kubernetes.io/projected/d41bab77-7a21-4962-bd9b-82fa33b47aaa-kube-api-access-vqz5d\") pod \"d41bab77-7a21-4962-bd9b-82fa33b47aaa\" (UID: \"d41bab77-7a21-4962-bd9b-82fa33b47aaa\") "
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.506185 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d41bab77-7a21-4962-bd9b-82fa33b47aaa-operator-scripts\") pod \"d41bab77-7a21-4962-bd9b-82fa33b47aaa\" (UID: \"d41bab77-7a21-4962-bd9b-82fa33b47aaa\") "
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.507059 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d41bab77-7a21-4962-bd9b-82fa33b47aaa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d41bab77-7a21-4962-bd9b-82fa33b47aaa" (UID: "d41bab77-7a21-4962-bd9b-82fa33b47aaa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.513045 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d41bab77-7a21-4962-bd9b-82fa33b47aaa-kube-api-access-vqz5d" (OuterVolumeSpecName: "kube-api-access-vqz5d") pod "d41bab77-7a21-4962-bd9b-82fa33b47aaa" (UID: "d41bab77-7a21-4962-bd9b-82fa33b47aaa"). InnerVolumeSpecName "kube-api-access-vqz5d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.516027 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8bc9-account-create-update-dm9bc" event={"ID":"d41bab77-7a21-4962-bd9b-82fa33b47aaa","Type":"ContainerDied","Data":"c40e0651e0f84a90368a8ad77fffe8d37cbd7dd5f0ea6f318f5b715ba326db4a"}
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.516069 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c40e0651e0f84a90368a8ad77fffe8d37cbd7dd5f0ea6f318f5b715ba326db4a"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.516130 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8bc9-account-create-update-dm9bc"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.518295 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jwff6" event={"ID":"85d55899-25c2-467d-ab1b-202029b76a86","Type":"ContainerDied","Data":"c226ba08e33e35089e127e92520b46e870fd202827c285eb0d6c6708f59a1054"}
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.518338 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c226ba08e33e35089e127e92520b46e870fd202827c285eb0d6c6708f59a1054"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.518388 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jwff6"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.521285 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-j4vzr" event={"ID":"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01","Type":"ContainerDied","Data":"eaaca040692047ba3e6c455c47d8c449217f6bdef0b17d16fb59b1ab029c6e8d"}
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.521324 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaaca040692047ba3e6c455c47d8c449217f6bdef0b17d16fb59b1ab029c6e8d"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.521379 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-j4vzr"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.608086 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-config-data\") pod \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\" (UID: \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\") "
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.608165 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-combined-ca-bundle\") pod \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\" (UID: \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\") "
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.608215 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbtn7\" (UniqueName: \"kubernetes.io/projected/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-kube-api-access-rbtn7\") pod \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\" (UID: \"68c696c2-cc67-4dce-9b9a-5c5ec2c14f01\") "
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.608559 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqz5d\" (UniqueName: \"kubernetes.io/projected/d41bab77-7a21-4962-bd9b-82fa33b47aaa-kube-api-access-vqz5d\") on node \"crc\" DevicePath \"\""
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.608576 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d41bab77-7a21-4962-bd9b-82fa33b47aaa-operator-scripts\") on node \"crc\" DevicePath \"\""
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.612974 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-kube-api-access-rbtn7" (OuterVolumeSpecName: "kube-api-access-rbtn7") pod "68c696c2-cc67-4dce-9b9a-5c5ec2c14f01" (UID: "68c696c2-cc67-4dce-9b9a-5c5ec2c14f01"). InnerVolumeSpecName "kube-api-access-rbtn7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.640011 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68c696c2-cc67-4dce-9b9a-5c5ec2c14f01" (UID: "68c696c2-cc67-4dce-9b9a-5c5ec2c14f01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.687214 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-config-data" (OuterVolumeSpecName: "config-data") pod "68c696c2-cc67-4dce-9b9a-5c5ec2c14f01" (UID: "68c696c2-cc67-4dce-9b9a-5c5ec2c14f01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.712903 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-config-data\") on node \"crc\" DevicePath \"\""
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.712939 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.712953 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbtn7\" (UniqueName: \"kubernetes.io/projected/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01-kube-api-access-rbtn7\") on node \"crc\" DevicePath \"\""
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.828842 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd574987c-9wv6g"]
Apr 06 12:18:19 crc kubenswrapper[4790]: E0406 12:18:19.829201 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d55899-25c2-467d-ab1b-202029b76a86" containerName="mariadb-database-create"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.829213 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d55899-25c2-467d-ab1b-202029b76a86" containerName="mariadb-database-create"
Apr 06 12:18:19 crc kubenswrapper[4790]: E0406 12:18:19.829228 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c696c2-cc67-4dce-9b9a-5c5ec2c14f01" containerName="keystone-db-sync"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.829234 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c696c2-cc67-4dce-9b9a-5c5ec2c14f01" containerName="keystone-db-sync"
Apr 06 12:18:19 crc kubenswrapper[4790]: E0406 12:18:19.829247 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41bab77-7a21-4962-bd9b-82fa33b47aaa" containerName="mariadb-account-create-update"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.829252 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41bab77-7a21-4962-bd9b-82fa33b47aaa" containerName="mariadb-account-create-update"
Apr 06 12:18:19 crc kubenswrapper[4790]: E0406 12:18:19.829261 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d59a84b5-d0bb-4d73-ada1-e7982fd86f50" containerName="mariadb-account-create-update"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.829267 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d59a84b5-d0bb-4d73-ada1-e7982fd86f50" containerName="mariadb-account-create-update"
Apr 06 12:18:19 crc kubenswrapper[4790]: E0406 12:18:19.829281 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b0e7386-a123-4a3c-a175-f4a89ab43a27" containerName="mariadb-database-create"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.829287 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b0e7386-a123-4a3c-a175-f4a89ab43a27" containerName="mariadb-database-create"
Apr 06 12:18:19 crc kubenswrapper[4790]: E0406 12:18:19.829304 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef1835f-4003-4fe1-9157-aac7e54d5f94" containerName="mariadb-account-create-update"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.829310 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef1835f-4003-4fe1-9157-aac7e54d5f94" containerName="mariadb-account-create-update"
Apr 06 12:18:19 crc kubenswrapper[4790]: E0406 12:18:19.829324 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61172bd-20f5-49e9-886b-01c8cf8ce7cc" containerName="mariadb-database-create"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.829329 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61172bd-20f5-49e9-886b-01c8cf8ce7cc" containerName="mariadb-database-create"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.829511 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61172bd-20f5-49e9-886b-01c8cf8ce7cc" containerName="mariadb-database-create"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.829527 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d55899-25c2-467d-ab1b-202029b76a86" containerName="mariadb-database-create"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.829539 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="68c696c2-cc67-4dce-9b9a-5c5ec2c14f01" containerName="keystone-db-sync"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.829551 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b0e7386-a123-4a3c-a175-f4a89ab43a27" containerName="mariadb-database-create"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.829571 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef1835f-4003-4fe1-9157-aac7e54d5f94" containerName="mariadb-account-create-update"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.829586 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d59a84b5-d0bb-4d73-ada1-e7982fd86f50" containerName="mariadb-account-create-update"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.829599 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d41bab77-7a21-4962-bd9b-82fa33b47aaa" containerName="mariadb-account-create-update"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.830816 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.866493 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd574987c-9wv6g"]
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.898181 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rjzmq"]
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.899674 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rjzmq"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.908157 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.908395 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.908600 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k5jkj"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.908347 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.911341 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.918461 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rjzmq"]
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.930965 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-dns-svc\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.931035 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.931087 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8mrh\" (UniqueName: \"kubernetes.io/projected/92b99b17-a37c-48d3-8959-e1abb343e73b-kube-api-access-p8mrh\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.931204 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-config\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.931223 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:19 crc kubenswrapper[4790]: I0406 12:18:19.931267 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.002067 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6crt4"]
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.006273 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6crt4"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.009798 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.010244 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-q84z7"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.010475 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.030952 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.033334 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-fernet-keys\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.033395 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-config\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.033417 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.033399 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.033446 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.034503 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-config-data\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.034637 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq8vx\" (UniqueName: \"kubernetes.io/projected/da1a10aa-eab1-477b-bcca-34d62b163b2d-kube-api-access-jq8vx\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.034705 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-dns-svc\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.034772 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.036032 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-scripts\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.036098 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8mrh\" (UniqueName: \"kubernetes.io/projected/92b99b17-a37c-48d3-8959-e1abb343e73b-kube-api-access-p8mrh\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.036132 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-combined-ca-bundle\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.036194 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-credential-keys\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.036453 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-config\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.036452 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-dns-swift-storage-0\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.037064 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-dns-svc\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.037970 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.038374 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.038419 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-ovsdbserver-sb\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.038635 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.043605 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6crt4"]
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.061139 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8mrh\" (UniqueName: \"kubernetes.io/projected/92b99b17-a37c-48d3-8959-e1abb343e73b-kube-api-access-p8mrh\") pod \"dnsmasq-dns-7fd574987c-9wv6g\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " pod="openstack/dnsmasq-dns-7fd574987c-9wv6g"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.111670 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.123572 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-h5v54"]
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.124735 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-h5v54"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.135557 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.135732 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-sqxqr"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.138471 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-combined-ca-bundle\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.138685 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-credential-keys\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.138811 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-fernet-keys\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.138924 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-config-data\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.139003 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-db-sync-config-data\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.139071 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftx5g\" (UniqueName: \"kubernetes.io/projected/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-kube-api-access-ftx5g\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.139154 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-combined-ca-bundle\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.139225 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.139305 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.139396 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.139484 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfsn4\" (UniqueName: \"kubernetes.io/projected/25adb66a-8db5-48e1-b05c-526008a22e4f-kube-api-access-dfsn4\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.142038 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-config-data\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.142159 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-scripts\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.142259 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25adb66a-8db5-48e1-b05c-526008a22e4f-etc-machine-id\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4"
Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.142351 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq8vx\"
(UniqueName: \"kubernetes.io/projected/da1a10aa-eab1-477b-bcca-34d62b163b2d-kube-api-access-jq8vx\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.142418 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.142499 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-scripts\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.142617 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-config-data\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.142695 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-scripts\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.145805 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-config-data\") pod \"keystone-bootstrap-rjzmq\" (UID: 
\"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.142368 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-combined-ca-bundle\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.146340 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-h5v54"] Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.163962 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-scripts\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.164647 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-fernet-keys\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.167372 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd574987c-9wv6g" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.170167 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-credential-keys\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.183104 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq8vx\" (UniqueName: \"kubernetes.io/projected/da1a10aa-eab1-477b-bcca-34d62b163b2d-kube-api-access-jq8vx\") pod \"keystone-bootstrap-rjzmq\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " pod="openstack/keystone-bootstrap-rjzmq" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.204692 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-r77xt"] Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.205845 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.209948 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.211237 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.212286 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qr248" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.225602 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rjzmq" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.230365 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-r77xt"] Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.243725 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvngs\" (UniqueName: \"kubernetes.io/projected/e4c585de-2005-495a-a987-8cfe70dc8793-kube-api-access-zvngs\") pod \"barbican-db-sync-h5v54\" (UID: \"e4c585de-2005-495a-a987-8cfe70dc8793\") " pod="openstack/barbican-db-sync-h5v54" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.243791 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-config-data\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.243818 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftx5g\" (UniqueName: \"kubernetes.io/projected/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-kube-api-access-ftx5g\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.243856 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-db-sync-config-data\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.243870 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.243884 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-combined-ca-bundle\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.243914 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c585de-2005-495a-a987-8cfe70dc8793-combined-ca-bundle\") pod \"barbican-db-sync-h5v54\" (UID: \"e4c585de-2005-495a-a987-8cfe70dc8793\") " pod="openstack/barbican-db-sync-h5v54" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.243938 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.243956 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.243990 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfsn4\" (UniqueName: \"kubernetes.io/projected/25adb66a-8db5-48e1-b05c-526008a22e4f-kube-api-access-dfsn4\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4" Apr 06 12:18:20 crc 
kubenswrapper[4790]: I0406 12:18:20.244026 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-scripts\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.244050 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25adb66a-8db5-48e1-b05c-526008a22e4f-etc-machine-id\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.244069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.244089 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-scripts\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.244121 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-config-data\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.244162 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/e4c585de-2005-495a-a987-8cfe70dc8793-db-sync-config-data\") pod \"barbican-db-sync-h5v54\" (UID: \"e4c585de-2005-495a-a987-8cfe70dc8793\") " pod="openstack/barbican-db-sync-h5v54" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.247473 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25adb66a-8db5-48e1-b05c-526008a22e4f-etc-machine-id\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.260261 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-config-data\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.260542 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-db-sync-config-data\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.262443 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-combined-ca-bundle\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.270469 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-scripts\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " 
pod="openstack/cinder-db-sync-6crt4" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.273151 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.280725 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.280886 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-scripts\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.284016 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.285688 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfsn4\" (UniqueName: \"kubernetes.io/projected/25adb66a-8db5-48e1-b05c-526008a22e4f-kube-api-access-dfsn4\") pod \"cinder-db-sync-6crt4\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " pod="openstack/cinder-db-sync-6crt4" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.287100 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd574987c-9wv6g"] Apr 06 12:18:20 crc kubenswrapper[4790]: 
I0406 12:18:20.296422 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.296508 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66f4cf67b9-vprjw"] Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.296660 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftx5g\" (UniqueName: \"kubernetes.io/projected/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-kube-api-access-ftx5g\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.297084 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-config-data\") pod \"ceilometer-0\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.304550 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66f4cf67b9-vprjw"] Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.304673 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.337060 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6crt4" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.349184 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvngs\" (UniqueName: \"kubernetes.io/projected/e4c585de-2005-495a-a987-8cfe70dc8793-kube-api-access-zvngs\") pod \"barbican-db-sync-h5v54\" (UID: \"e4c585de-2005-495a-a987-8cfe70dc8793\") " pod="openstack/barbican-db-sync-h5v54" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.349236 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-combined-ca-bundle\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.349313 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c585de-2005-495a-a987-8cfe70dc8793-combined-ca-bundle\") pod \"barbican-db-sync-h5v54\" (UID: \"e4c585de-2005-495a-a987-8cfe70dc8793\") " pod="openstack/barbican-db-sync-h5v54" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.349404 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60672a42-148c-45e2-99a8-da1ebec40dbb-logs\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.349456 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-config-data\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " 
pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.349494 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff42h\" (UniqueName: \"kubernetes.io/projected/60672a42-148c-45e2-99a8-da1ebec40dbb-kube-api-access-ff42h\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.349551 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-scripts\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.349570 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4c585de-2005-495a-a987-8cfe70dc8793-db-sync-config-data\") pod \"barbican-db-sync-h5v54\" (UID: \"e4c585de-2005-495a-a987-8cfe70dc8793\") " pod="openstack/barbican-db-sync-h5v54" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.354188 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.362484 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4c585de-2005-495a-a987-8cfe70dc8793-db-sync-config-data\") pod \"barbican-db-sync-h5v54\" (UID: \"e4c585de-2005-495a-a987-8cfe70dc8793\") " pod="openstack/barbican-db-sync-h5v54" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.362639 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c585de-2005-495a-a987-8cfe70dc8793-combined-ca-bundle\") pod \"barbican-db-sync-h5v54\" (UID: \"e4c585de-2005-495a-a987-8cfe70dc8793\") " pod="openstack/barbican-db-sync-h5v54" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.391753 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvngs\" (UniqueName: \"kubernetes.io/projected/e4c585de-2005-495a-a987-8cfe70dc8793-kube-api-access-zvngs\") pod \"barbican-db-sync-h5v54\" (UID: \"e4c585de-2005-495a-a987-8cfe70dc8793\") " pod="openstack/barbican-db-sync-h5v54" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.464914 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-dns-swift-storage-0\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.465582 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-ovsdbserver-sb\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " 
pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.465614 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60672a42-148c-45e2-99a8-da1ebec40dbb-logs\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.465657 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-config\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.465700 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-config-data\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.465722 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4pdd\" (UniqueName: \"kubernetes.io/projected/056b7341-e602-414d-b17a-ae4a8419a741-kube-api-access-j4pdd\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.465768 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff42h\" (UniqueName: \"kubernetes.io/projected/60672a42-148c-45e2-99a8-da1ebec40dbb-kube-api-access-ff42h\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 
crc kubenswrapper[4790]: I0406 12:18:20.465852 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-ovsdbserver-nb\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.465972 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-scripts\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.466059 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-dns-svc\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.466100 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-combined-ca-bundle\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.467672 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60672a42-148c-45e2-99a8-da1ebec40dbb-logs\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.476937 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-config-data\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.481473 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-combined-ca-bundle\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.486302 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-scripts\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.491821 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff42h\" (UniqueName: \"kubernetes.io/projected/60672a42-148c-45e2-99a8-da1ebec40dbb-kube-api-access-ff42h\") pod \"placement-db-sync-r77xt\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.552416 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8r6lc" event={"ID":"ccc30661-4ea6-4218-b75d-b59f473d41bc","Type":"ContainerStarted","Data":"9640f14755782c1f3b2eea665acc3ab90b0cc9a5365e019256d29ad7b00fae78"} Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.568624 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-dns-swift-storage-0\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: 
\"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.568744 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-ovsdbserver-sb\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.568782 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-config\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.568880 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4pdd\" (UniqueName: \"kubernetes.io/projected/056b7341-e602-414d-b17a-ae4a8419a741-kube-api-access-j4pdd\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.568975 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-ovsdbserver-nb\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.569044 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-dns-svc\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " 
pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.569802 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-dns-swift-storage-0\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.569812 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-config\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.575794 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-dns-svc\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.578300 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-ovsdbserver-sb\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.579191 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-ovsdbserver-nb\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.590263 4790 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-h5v54" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.596125 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4pdd\" (UniqueName: \"kubernetes.io/projected/056b7341-e602-414d-b17a-ae4a8419a741-kube-api-access-j4pdd\") pod \"dnsmasq-dns-66f4cf67b9-vprjw\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.616135 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.639971 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.818255 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-8r6lc" podStartSLOduration=5.20790387 podStartE2EDuration="10.818237137s" podCreationTimestamp="2026-04-06 12:18:10 +0000 UTC" firstStartedPulling="2026-04-06 12:18:13.819764116 +0000 UTC m=+1272.807506982" lastFinishedPulling="2026-04-06 12:18:19.430097393 +0000 UTC m=+1278.417840249" observedRunningTime="2026-04-06 12:18:20.570091406 +0000 UTC m=+1279.557834282" watchObservedRunningTime="2026-04-06 12:18:20.818237137 +0000 UTC m=+1279.805980003" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.830662 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rjzmq"] Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.920077 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-wl5rj"] Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.922083 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wl5rj" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.923534 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.923696 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v8r6x" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.923812 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.932093 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wl5rj"] Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.941207 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.943585 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.957101 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.957210 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.957327 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4xzjn" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.957358 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Apr 06 12:18:20 crc kubenswrapper[4790]: I0406 12:18:20.971365 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd574987c-9wv6g"] Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.075006 4790 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.104200 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e752bf76-0eff-4559-b599-6ab462cea81e-combined-ca-bundle\") pod \"neutron-db-sync-wl5rj\" (UID: \"e752bf76-0eff-4559-b599-6ab462cea81e\") " pod="openstack/neutron-db-sync-wl5rj" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.104260 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.104290 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e752bf76-0eff-4559-b599-6ab462cea81e-config\") pod \"neutron-db-sync-wl5rj\" (UID: \"e752bf76-0eff-4559-b599-6ab462cea81e\") " pod="openstack/neutron-db-sync-wl5rj" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.104322 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.104344 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.104387 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvq5r\" (UniqueName: \"kubernetes.io/projected/142e9f6e-49b8-4182-917a-031985b1dfc7-kube-api-access-vvq5r\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.104408 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.104448 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/142e9f6e-49b8-4182-917a-031985b1dfc7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.104489 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/142e9f6e-49b8-4182-917a-031985b1dfc7-logs\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.104547 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8j8c\" (UniqueName: 
\"kubernetes.io/projected/e752bf76-0eff-4559-b599-6ab462cea81e-kube-api-access-d8j8c\") pod \"neutron-db-sync-wl5rj\" (UID: \"e752bf76-0eff-4559-b599-6ab462cea81e\") " pod="openstack/neutron-db-sync-wl5rj" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.104625 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-config-data\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.104692 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.106170 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.108024 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.109423 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.116124 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 06 12:18:21 crc kubenswrapper[4790]: W0406 12:18:21.160079 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa0dc22e_42cc_4916_a598_99d4e2c99ae7.slice/crio-e0a118a29d6048e9af5f3a073b228080ade80a4fd81a341f1422aba7c748a02a WatchSource:0}: Error finding container e0a118a29d6048e9af5f3a073b228080ade80a4fd81a341f1422aba7c748a02a: Status 404 returned error can't find the container with id 
e0a118a29d6048e9af5f3a073b228080ade80a4fd81a341f1422aba7c748a02a Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.160143 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6crt4"] Apr 06 12:18:21 crc kubenswrapper[4790]: W0406 12:18:21.171341 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25adb66a_8db5_48e1_b05c_526008a22e4f.slice/crio-a88185b5a36429d57d750c4cc14b9f2e26f1f1d6db726497ed3b32cd109e2174 WatchSource:0}: Error finding container a88185b5a36429d57d750c4cc14b9f2e26f1f1d6db726497ed3b32cd109e2174: Status 404 returned error can't find the container with id a88185b5a36429d57d750c4cc14b9f2e26f1f1d6db726497ed3b32cd109e2174 Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.196870 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.206758 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.206803 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c6adc2f-5636-41aa-ac84-2471261f3a19-logs\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.206844 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvq5r\" (UniqueName: \"kubernetes.io/projected/142e9f6e-49b8-4182-917a-031985b1dfc7-kube-api-access-vvq5r\") pod \"glance-default-external-api-0\" 
(UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.206863 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.206879 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rsn9\" (UniqueName: \"kubernetes.io/projected/2c6adc2f-5636-41aa-ac84-2471261f3a19-kube-api-access-7rsn9\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.206899 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.206939 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/142e9f6e-49b8-4182-917a-031985b1dfc7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.206973 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/142e9f6e-49b8-4182-917a-031985b1dfc7-logs\") pod \"glance-default-external-api-0\" (UID: 
\"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.207668 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8j8c\" (UniqueName: \"kubernetes.io/projected/e752bf76-0eff-4559-b599-6ab462cea81e-kube-api-access-d8j8c\") pod \"neutron-db-sync-wl5rj\" (UID: \"e752bf76-0eff-4559-b599-6ab462cea81e\") " pod="openstack/neutron-db-sync-wl5rj" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.208028 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.208054 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-config-data\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.210227 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e752bf76-0eff-4559-b599-6ab462cea81e-combined-ca-bundle\") pod \"neutron-db-sync-wl5rj\" (UID: \"e752bf76-0eff-4559-b599-6ab462cea81e\") " pod="openstack/neutron-db-sync-wl5rj" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.210309 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " 
pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.210347 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e752bf76-0eff-4559-b599-6ab462cea81e-config\") pod \"neutron-db-sync-wl5rj\" (UID: \"e752bf76-0eff-4559-b599-6ab462cea81e\") " pod="openstack/neutron-db-sync-wl5rj" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.210376 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c6adc2f-5636-41aa-ac84-2471261f3a19-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.211560 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.211601 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.211636 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-scripts\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc 
kubenswrapper[4790]: I0406 12:18:21.211659 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.209286 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/142e9f6e-49b8-4182-917a-031985b1dfc7-logs\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.209685 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/142e9f6e-49b8-4182-917a-031985b1dfc7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.213015 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.227285 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.227710 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.228609 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e752bf76-0eff-4559-b599-6ab462cea81e-combined-ca-bundle\") pod \"neutron-db-sync-wl5rj\" (UID: \"e752bf76-0eff-4559-b599-6ab462cea81e\") " pod="openstack/neutron-db-sync-wl5rj" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.234209 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e752bf76-0eff-4559-b599-6ab462cea81e-config\") pod \"neutron-db-sync-wl5rj\" (UID: \"e752bf76-0eff-4559-b599-6ab462cea81e\") " pod="openstack/neutron-db-sync-wl5rj" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.238509 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-config-data\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.240346 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8j8c\" (UniqueName: \"kubernetes.io/projected/e752bf76-0eff-4559-b599-6ab462cea81e-kube-api-access-d8j8c\") pod \"neutron-db-sync-wl5rj\" (UID: \"e752bf76-0eff-4559-b599-6ab462cea81e\") " pod="openstack/neutron-db-sync-wl5rj" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.245727 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvq5r\" (UniqueName: 
\"kubernetes.io/projected/142e9f6e-49b8-4182-917a-031985b1dfc7-kube-api-access-vvq5r\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.257138 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-scripts\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.276686 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.298885 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-h5v54"] Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.309030 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-r77xt"] Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.312581 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c6adc2f-5636-41aa-ac84-2471261f3a19-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.312622 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " 
pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.312642 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.312670 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.312694 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c6adc2f-5636-41aa-ac84-2471261f3a19-logs\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.312711 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.312731 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rsn9\" (UniqueName: \"kubernetes.io/projected/2c6adc2f-5636-41aa-ac84-2471261f3a19-kube-api-access-7rsn9\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 
crc kubenswrapper[4790]: I0406 12:18:21.312856 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.313103 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c6adc2f-5636-41aa-ac84-2471261f3a19-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.313406 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c6adc2f-5636-41aa-ac84-2471261f3a19-logs\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.314337 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.331293 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.335632 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-7rsn9\" (UniqueName: \"kubernetes.io/projected/2c6adc2f-5636-41aa-ac84-2471261f3a19-kube-api-access-7rsn9\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.335726 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.340094 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.345027 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wl5rj" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.346622 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.361558 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.510022 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66f4cf67b9-vprjw"] Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.513636 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.534189 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.609345 4790 generic.go:334] "Generic (PLEG): container finished" podID="92b99b17-a37c-48d3-8959-e1abb343e73b" containerID="56adf8c5fc78d0742c92901fa8eb933564c8ccf826bf88d7b744ac8c2a3b40bf" exitCode=0 Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.609652 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd574987c-9wv6g" event={"ID":"92b99b17-a37c-48d3-8959-e1abb343e73b","Type":"ContainerDied","Data":"56adf8c5fc78d0742c92901fa8eb933564c8ccf826bf88d7b744ac8c2a3b40bf"} Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.609678 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd574987c-9wv6g" event={"ID":"92b99b17-a37c-48d3-8959-e1abb343e73b","Type":"ContainerStarted","Data":"f41e57b516588fdbebcd00b1b0bb510f839fcb8e347f6dcfc020963019f867e4"} Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.623382 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-r77xt" event={"ID":"60672a42-148c-45e2-99a8-da1ebec40dbb","Type":"ContainerStarted","Data":"70b156159aadd855633ec58ce7c4ea54bb3575eff1c4d04be6839a5943f18f5e"} Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.666949 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rjzmq" event={"ID":"da1a10aa-eab1-477b-bcca-34d62b163b2d","Type":"ContainerStarted","Data":"62d024d1e1a7544111c81c038d7f9c09e5dcaa97d0b467a41e26e87be1ea8fb2"} Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.666990 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rjzmq" event={"ID":"da1a10aa-eab1-477b-bcca-34d62b163b2d","Type":"ContainerStarted","Data":"80a65c28c80d4be59dc10b6a9675e3a090d40820a858d7917f3f50ee59a435dd"} Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.703307 4790 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/keystone-bootstrap-rjzmq" podStartSLOduration=2.703286651 podStartE2EDuration="2.703286651s" podCreationTimestamp="2026-04-06 12:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:21.68919269 +0000 UTC m=+1280.676935576" watchObservedRunningTime="2026-04-06 12:18:21.703286651 +0000 UTC m=+1280.691029517" Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.742045 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" event={"ID":"056b7341-e602-414d-b17a-ae4a8419a741","Type":"ContainerStarted","Data":"c71e8f6a906b28d2886c6865f1828f29556ab0544a3e13fead81dbd8cfc9ea1a"} Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.742084 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-h5v54" event={"ID":"e4c585de-2005-495a-a987-8cfe70dc8793","Type":"ContainerStarted","Data":"7ea73b328ae601d4ef9dc45b661d9a07281dacfc17ee4c3723d7e348a1960c8f"} Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.742095 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6crt4" event={"ID":"25adb66a-8db5-48e1-b05c-526008a22e4f","Type":"ContainerStarted","Data":"a88185b5a36429d57d750c4cc14b9f2e26f1f1d6db726497ed3b32cd109e2174"} Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.742114 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa0dc22e-42cc-4916-a598-99d4e2c99ae7","Type":"ContainerStarted","Data":"e0a118a29d6048e9af5f3a073b228080ade80a4fd81a341f1422aba7c748a02a"} Apr 06 12:18:21 crc kubenswrapper[4790]: I0406 12:18:21.938705 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wl5rj"] Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.208986 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd574987c-9wv6g" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.297120 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Apr 06 12:18:22 crc kubenswrapper[4790]: W0406 12:18:22.300766 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod142e9f6e_49b8_4182_917a_031985b1dfc7.slice/crio-5c84633c2a64c3ee9401584316d65cf68ac29f0a358155e6c7d0ba987b62c81e WatchSource:0}: Error finding container 5c84633c2a64c3ee9401584316d65cf68ac29f0a358155e6c7d0ba987b62c81e: Status 404 returned error can't find the container with id 5c84633c2a64c3ee9401584316d65cf68ac29f0a358155e6c7d0ba987b62c81e Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.365429 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-config\") pod \"92b99b17-a37c-48d3-8959-e1abb343e73b\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.365766 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-ovsdbserver-sb\") pod \"92b99b17-a37c-48d3-8959-e1abb343e73b\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.365792 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-ovsdbserver-nb\") pod \"92b99b17-a37c-48d3-8959-e1abb343e73b\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.365853 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-dns-swift-storage-0\") pod \"92b99b17-a37c-48d3-8959-e1abb343e73b\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.365928 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8mrh\" (UniqueName: \"kubernetes.io/projected/92b99b17-a37c-48d3-8959-e1abb343e73b-kube-api-access-p8mrh\") pod \"92b99b17-a37c-48d3-8959-e1abb343e73b\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.366001 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-dns-svc\") pod \"92b99b17-a37c-48d3-8959-e1abb343e73b\" (UID: \"92b99b17-a37c-48d3-8959-e1abb343e73b\") " Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.390209 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92b99b17-a37c-48d3-8959-e1abb343e73b-kube-api-access-p8mrh" (OuterVolumeSpecName: "kube-api-access-p8mrh") pod "92b99b17-a37c-48d3-8959-e1abb343e73b" (UID: "92b99b17-a37c-48d3-8959-e1abb343e73b"). InnerVolumeSpecName "kube-api-access-p8mrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.405366 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "92b99b17-a37c-48d3-8959-e1abb343e73b" (UID: "92b99b17-a37c-48d3-8959-e1abb343e73b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.412256 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-config" (OuterVolumeSpecName: "config") pod "92b99b17-a37c-48d3-8959-e1abb343e73b" (UID: "92b99b17-a37c-48d3-8959-e1abb343e73b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.421265 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "92b99b17-a37c-48d3-8959-e1abb343e73b" (UID: "92b99b17-a37c-48d3-8959-e1abb343e73b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.432229 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "92b99b17-a37c-48d3-8959-e1abb343e73b" (UID: "92b99b17-a37c-48d3-8959-e1abb343e73b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.441496 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "92b99b17-a37c-48d3-8959-e1abb343e73b" (UID: "92b99b17-a37c-48d3-8959-e1abb343e73b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.467742 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.467777 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.467786 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.467794 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.467805 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8mrh\" (UniqueName: \"kubernetes.io/projected/92b99b17-a37c-48d3-8959-e1abb343e73b-kube-api-access-p8mrh\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.467813 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92b99b17-a37c-48d3-8959-e1abb343e73b-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.470464 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 06 12:18:22 crc kubenswrapper[4790]: W0406 12:18:22.481318 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c6adc2f_5636_41aa_ac84_2471261f3a19.slice/crio-f13e1aacf015d12009c46ec73b48392d438e3d0ca2cf01d3d43bebc4e71d8532 WatchSource:0}: Error finding container f13e1aacf015d12009c46ec73b48392d438e3d0ca2cf01d3d43bebc4e71d8532: Status 404 returned error can't find the container with id f13e1aacf015d12009c46ec73b48392d438e3d0ca2cf01d3d43bebc4e71d8532 Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.743458 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.818121 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.843551 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.880654 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wl5rj" event={"ID":"e752bf76-0eff-4559-b599-6ab462cea81e","Type":"ContainerStarted","Data":"d13452abb359ca2338c1acedc317f38a7cf7097878ef34b2eaa3d8722957c828"} Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.880705 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wl5rj" event={"ID":"e752bf76-0eff-4559-b599-6ab462cea81e","Type":"ContainerStarted","Data":"076d1c811a47ba793b763df05d98e515abe6005ebf3b28df185dce0f6b3928de"} Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.910716 4790 generic.go:334] "Generic (PLEG): container finished" podID="056b7341-e602-414d-b17a-ae4a8419a741" containerID="a3d80ae781b016b0c877bf60d21291fdabb888491e54b84e0b5204063bac67ee" exitCode=0 Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.910842 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" 
event={"ID":"056b7341-e602-414d-b17a-ae4a8419a741","Type":"ContainerDied","Data":"a3d80ae781b016b0c877bf60d21291fdabb888491e54b84e0b5204063bac67ee"} Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.913562 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-wl5rj" podStartSLOduration=2.913545603 podStartE2EDuration="2.913545603s" podCreationTimestamp="2026-04-06 12:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:22.909533257 +0000 UTC m=+1281.897276123" watchObservedRunningTime="2026-04-06 12:18:22.913545603 +0000 UTC m=+1281.901288469" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.913679 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"142e9f6e-49b8-4182-917a-031985b1dfc7","Type":"ContainerStarted","Data":"5c84633c2a64c3ee9401584316d65cf68ac29f0a358155e6c7d0ba987b62c81e"} Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.921117 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2c6adc2f-5636-41aa-ac84-2471261f3a19","Type":"ContainerStarted","Data":"f13e1aacf015d12009c46ec73b48392d438e3d0ca2cf01d3d43bebc4e71d8532"} Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.963627 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd574987c-9wv6g" Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.964848 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd574987c-9wv6g" event={"ID":"92b99b17-a37c-48d3-8959-e1abb343e73b","Type":"ContainerDied","Data":"f41e57b516588fdbebcd00b1b0bb510f839fcb8e347f6dcfc020963019f867e4"} Apr 06 12:18:22 crc kubenswrapper[4790]: I0406 12:18:22.964925 4790 scope.go:117] "RemoveContainer" containerID="56adf8c5fc78d0742c92901fa8eb933564c8ccf826bf88d7b744ac8c2a3b40bf" Apr 06 12:18:23 crc kubenswrapper[4790]: I0406 12:18:23.203891 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd574987c-9wv6g"] Apr 06 12:18:23 crc kubenswrapper[4790]: I0406 12:18:23.231742 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd574987c-9wv6g"] Apr 06 12:18:23 crc kubenswrapper[4790]: I0406 12:18:23.694637 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92b99b17-a37c-48d3-8959-e1abb343e73b" path="/var/lib/kubelet/pods/92b99b17-a37c-48d3-8959-e1abb343e73b/volumes" Apr 06 12:18:23 crc kubenswrapper[4790]: I0406 12:18:23.990200 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"142e9f6e-49b8-4182-917a-031985b1dfc7","Type":"ContainerStarted","Data":"5dfeb517b5a198aef4052864be63fb870699e3dee7f662afcfbdaa68af92f878"} Apr 06 12:18:24 crc kubenswrapper[4790]: I0406 12:18:24.000392 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2c6adc2f-5636-41aa-ac84-2471261f3a19","Type":"ContainerStarted","Data":"6510e79ef493c973633446db8d83d7f5276ff2b2505be1414b03d32e9d44cf25"} Apr 06 12:18:24 crc kubenswrapper[4790]: I0406 12:18:24.022132 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" 
event={"ID":"056b7341-e602-414d-b17a-ae4a8419a741","Type":"ContainerStarted","Data":"a102cd480c0baeda696864fbb2e3a18e1c45d17b8ec70ea7bd8760a3fdd65ef8"} Apr 06 12:18:24 crc kubenswrapper[4790]: I0406 12:18:24.022215 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:24 crc kubenswrapper[4790]: I0406 12:18:24.052018 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" podStartSLOduration=4.051999376 podStartE2EDuration="4.051999376s" podCreationTimestamp="2026-04-06 12:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:24.041203281 +0000 UTC m=+1283.028946147" watchObservedRunningTime="2026-04-06 12:18:24.051999376 +0000 UTC m=+1283.039742232" Apr 06 12:18:25 crc kubenswrapper[4790]: I0406 12:18:25.031404 4790 generic.go:334] "Generic (PLEG): container finished" podID="ccc30661-4ea6-4218-b75d-b59f473d41bc" containerID="9640f14755782c1f3b2eea665acc3ab90b0cc9a5365e019256d29ad7b00fae78" exitCode=0 Apr 06 12:18:25 crc kubenswrapper[4790]: I0406 12:18:25.031449 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8r6lc" event={"ID":"ccc30661-4ea6-4218-b75d-b59f473d41bc","Type":"ContainerDied","Data":"9640f14755782c1f3b2eea665acc3ab90b0cc9a5365e019256d29ad7b00fae78"} Apr 06 12:18:25 crc kubenswrapper[4790]: I0406 12:18:25.037128 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="142e9f6e-49b8-4182-917a-031985b1dfc7" containerName="glance-log" containerID="cri-o://5dfeb517b5a198aef4052864be63fb870699e3dee7f662afcfbdaa68af92f878" gracePeriod=30 Apr 06 12:18:25 crc kubenswrapper[4790]: I0406 12:18:25.037227 4790 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="142e9f6e-49b8-4182-917a-031985b1dfc7" containerName="glance-httpd" containerID="cri-o://d0bb1eac34dd41488f67b9f33d3c19f7cd5eda8351b37f05dade76ab92c5ef05" gracePeriod=30 Apr 06 12:18:25 crc kubenswrapper[4790]: I0406 12:18:25.037148 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"142e9f6e-49b8-4182-917a-031985b1dfc7","Type":"ContainerStarted","Data":"d0bb1eac34dd41488f67b9f33d3c19f7cd5eda8351b37f05dade76ab92c5ef05"} Apr 06 12:18:25 crc kubenswrapper[4790]: I0406 12:18:25.045126 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2c6adc2f-5636-41aa-ac84-2471261f3a19" containerName="glance-log" containerID="cri-o://6510e79ef493c973633446db8d83d7f5276ff2b2505be1414b03d32e9d44cf25" gracePeriod=30 Apr 06 12:18:25 crc kubenswrapper[4790]: I0406 12:18:25.045290 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2c6adc2f-5636-41aa-ac84-2471261f3a19","Type":"ContainerStarted","Data":"8ed8476757b8f5d820a2834f300d930e994e1844e6295ac0b225f0f503479647"} Apr 06 12:18:25 crc kubenswrapper[4790]: I0406 12:18:25.045350 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2c6adc2f-5636-41aa-ac84-2471261f3a19" containerName="glance-httpd" containerID="cri-o://8ed8476757b8f5d820a2834f300d930e994e1844e6295ac0b225f0f503479647" gracePeriod=30 Apr 06 12:18:25 crc kubenswrapper[4790]: I0406 12:18:25.075982 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.075960865 podStartE2EDuration="6.075960865s" podCreationTimestamp="2026-04-06 12:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-06 12:18:25.0712002 +0000 UTC m=+1284.058943066" watchObservedRunningTime="2026-04-06 12:18:25.075960865 +0000 UTC m=+1284.063703731" Apr 06 12:18:26 crc kubenswrapper[4790]: I0406 12:18:26.061460 4790 generic.go:334] "Generic (PLEG): container finished" podID="2c6adc2f-5636-41aa-ac84-2471261f3a19" containerID="8ed8476757b8f5d820a2834f300d930e994e1844e6295ac0b225f0f503479647" exitCode=143 Apr 06 12:18:26 crc kubenswrapper[4790]: I0406 12:18:26.061710 4790 generic.go:334] "Generic (PLEG): container finished" podID="2c6adc2f-5636-41aa-ac84-2471261f3a19" containerID="6510e79ef493c973633446db8d83d7f5276ff2b2505be1414b03d32e9d44cf25" exitCode=143 Apr 06 12:18:26 crc kubenswrapper[4790]: I0406 12:18:26.061767 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2c6adc2f-5636-41aa-ac84-2471261f3a19","Type":"ContainerDied","Data":"8ed8476757b8f5d820a2834f300d930e994e1844e6295ac0b225f0f503479647"} Apr 06 12:18:26 crc kubenswrapper[4790]: I0406 12:18:26.061793 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2c6adc2f-5636-41aa-ac84-2471261f3a19","Type":"ContainerDied","Data":"6510e79ef493c973633446db8d83d7f5276ff2b2505be1414b03d32e9d44cf25"} Apr 06 12:18:26 crc kubenswrapper[4790]: I0406 12:18:26.076136 4790 generic.go:334] "Generic (PLEG): container finished" podID="142e9f6e-49b8-4182-917a-031985b1dfc7" containerID="d0bb1eac34dd41488f67b9f33d3c19f7cd5eda8351b37f05dade76ab92c5ef05" exitCode=143 Apr 06 12:18:26 crc kubenswrapper[4790]: I0406 12:18:26.076165 4790 generic.go:334] "Generic (PLEG): container finished" podID="142e9f6e-49b8-4182-917a-031985b1dfc7" containerID="5dfeb517b5a198aef4052864be63fb870699e3dee7f662afcfbdaa68af92f878" exitCode=143 Apr 06 12:18:26 crc kubenswrapper[4790]: I0406 12:18:26.076224 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"142e9f6e-49b8-4182-917a-031985b1dfc7","Type":"ContainerDied","Data":"d0bb1eac34dd41488f67b9f33d3c19f7cd5eda8351b37f05dade76ab92c5ef05"} Apr 06 12:18:26 crc kubenswrapper[4790]: I0406 12:18:26.076278 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"142e9f6e-49b8-4182-917a-031985b1dfc7","Type":"ContainerDied","Data":"5dfeb517b5a198aef4052864be63fb870699e3dee7f662afcfbdaa68af92f878"} Apr 06 12:18:27 crc kubenswrapper[4790]: I0406 12:18:27.099471 4790 generic.go:334] "Generic (PLEG): container finished" podID="da1a10aa-eab1-477b-bcca-34d62b163b2d" containerID="62d024d1e1a7544111c81c038d7f9c09e5dcaa97d0b467a41e26e87be1ea8fb2" exitCode=0 Apr 06 12:18:27 crc kubenswrapper[4790]: I0406 12:18:27.099713 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rjzmq" event={"ID":"da1a10aa-eab1-477b-bcca-34d62b163b2d","Type":"ContainerDied","Data":"62d024d1e1a7544111c81c038d7f9c09e5dcaa97d0b467a41e26e87be1ea8fb2"} Apr 06 12:18:27 crc kubenswrapper[4790]: I0406 12:18:27.122598 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.122576869 podStartE2EDuration="8.122576869s" podCreationTimestamp="2026-04-06 12:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:25.101191729 +0000 UTC m=+1284.088934595" watchObservedRunningTime="2026-04-06 12:18:27.122576869 +0000 UTC m=+1286.110319735" Apr 06 12:18:30 crc kubenswrapper[4790]: I0406 12:18:30.641966 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:18:30 crc kubenswrapper[4790]: I0406 12:18:30.694746 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccfc598b5-7ftkm"] Apr 06 12:18:30 crc kubenswrapper[4790]: I0406 
12:18:30.695045 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" podUID="aa1d242d-8c20-4abe-a611-a6ab2e4b1021" containerName="dnsmasq-dns" containerID="cri-o://636c57740cffa70333e10685c6600e863f7313a4fa22aaa7ffe80356cccfe54c" gracePeriod=10 Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.551183 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rjzmq" Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.654075 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-config-data\") pod \"da1a10aa-eab1-477b-bcca-34d62b163b2d\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.654317 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-fernet-keys\") pod \"da1a10aa-eab1-477b-bcca-34d62b163b2d\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.654391 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-combined-ca-bundle\") pod \"da1a10aa-eab1-477b-bcca-34d62b163b2d\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.654457 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-credential-keys\") pod \"da1a10aa-eab1-477b-bcca-34d62b163b2d\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.654521 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-scripts\") pod \"da1a10aa-eab1-477b-bcca-34d62b163b2d\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.654993 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq8vx\" (UniqueName: \"kubernetes.io/projected/da1a10aa-eab1-477b-bcca-34d62b163b2d-kube-api-access-jq8vx\") pod \"da1a10aa-eab1-477b-bcca-34d62b163b2d\" (UID: \"da1a10aa-eab1-477b-bcca-34d62b163b2d\") " Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.666329 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da1a10aa-eab1-477b-bcca-34d62b163b2d-kube-api-access-jq8vx" (OuterVolumeSpecName: "kube-api-access-jq8vx") pod "da1a10aa-eab1-477b-bcca-34d62b163b2d" (UID: "da1a10aa-eab1-477b-bcca-34d62b163b2d"). InnerVolumeSpecName "kube-api-access-jq8vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.668237 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "da1a10aa-eab1-477b-bcca-34d62b163b2d" (UID: "da1a10aa-eab1-477b-bcca-34d62b163b2d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.669623 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "da1a10aa-eab1-477b-bcca-34d62b163b2d" (UID: "da1a10aa-eab1-477b-bcca-34d62b163b2d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.699996 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-scripts" (OuterVolumeSpecName: "scripts") pod "da1a10aa-eab1-477b-bcca-34d62b163b2d" (UID: "da1a10aa-eab1-477b-bcca-34d62b163b2d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.703808 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da1a10aa-eab1-477b-bcca-34d62b163b2d" (UID: "da1a10aa-eab1-477b-bcca-34d62b163b2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.756799 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq8vx\" (UniqueName: \"kubernetes.io/projected/da1a10aa-eab1-477b-bcca-34d62b163b2d-kube-api-access-jq8vx\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.757096 4790 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-fernet-keys\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.757182 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.757248 4790 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-credential-keys\") on node \"crc\" DevicePath \"\"" 
Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.757315 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.764465 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-config-data" (OuterVolumeSpecName: "config-data") pod "da1a10aa-eab1-477b-bcca-34d62b163b2d" (UID: "da1a10aa-eab1-477b-bcca-34d62b163b2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:31 crc kubenswrapper[4790]: I0406 12:18:31.859473 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da1a10aa-eab1-477b-bcca-34d62b163b2d-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.148246 4790 generic.go:334] "Generic (PLEG): container finished" podID="aa1d242d-8c20-4abe-a611-a6ab2e4b1021" containerID="636c57740cffa70333e10685c6600e863f7313a4fa22aaa7ffe80356cccfe54c" exitCode=0 Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.148316 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" event={"ID":"aa1d242d-8c20-4abe-a611-a6ab2e4b1021","Type":"ContainerDied","Data":"636c57740cffa70333e10685c6600e863f7313a4fa22aaa7ffe80356cccfe54c"} Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.151015 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rjzmq" event={"ID":"da1a10aa-eab1-477b-bcca-34d62b163b2d","Type":"ContainerDied","Data":"80a65c28c80d4be59dc10b6a9675e3a090d40820a858d7917f3f50ee59a435dd"} Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.151039 4790 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="80a65c28c80d4be59dc10b6a9675e3a090d40820a858d7917f3f50ee59a435dd" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.151066 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rjzmq" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.713036 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rjzmq"] Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.723106 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rjzmq"] Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.821751 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-42m6m"] Apr 06 12:18:32 crc kubenswrapper[4790]: E0406 12:18:32.822249 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1a10aa-eab1-477b-bcca-34d62b163b2d" containerName="keystone-bootstrap" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.822321 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1a10aa-eab1-477b-bcca-34d62b163b2d" containerName="keystone-bootstrap" Apr 06 12:18:32 crc kubenswrapper[4790]: E0406 12:18:32.822353 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b99b17-a37c-48d3-8959-e1abb343e73b" containerName="init" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.822378 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b99b17-a37c-48d3-8959-e1abb343e73b" containerName="init" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.822632 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="92b99b17-a37c-48d3-8959-e1abb343e73b" containerName="init" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.822655 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1a10aa-eab1-477b-bcca-34d62b163b2d" containerName="keystone-bootstrap" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.824013 4790 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.825741 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.830460 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k5jkj" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.830712 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.830935 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.831989 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.835117 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-42m6m"] Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.985560 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-combined-ca-bundle\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.985606 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-scripts\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.985659 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnhw9\" (UniqueName: \"kubernetes.io/projected/570ffea2-75f1-4831-98d3-50b826b1e37d-kube-api-access-rnhw9\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.985689 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-config-data\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.985726 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-fernet-keys\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:32 crc kubenswrapper[4790]: I0406 12:18:32.985768 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-credential-keys\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:33 crc kubenswrapper[4790]: I0406 12:18:33.087084 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-fernet-keys\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:33 crc kubenswrapper[4790]: I0406 12:18:33.087146 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-credential-keys\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:33 crc kubenswrapper[4790]: I0406 12:18:33.087210 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-combined-ca-bundle\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:33 crc kubenswrapper[4790]: I0406 12:18:33.087228 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-scripts\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:33 crc kubenswrapper[4790]: I0406 12:18:33.087302 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnhw9\" (UniqueName: \"kubernetes.io/projected/570ffea2-75f1-4831-98d3-50b826b1e37d-kube-api-access-rnhw9\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:33 crc kubenswrapper[4790]: I0406 12:18:33.087335 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-config-data\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:33 crc kubenswrapper[4790]: I0406 12:18:33.093339 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-credential-keys\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:33 crc kubenswrapper[4790]: I0406 12:18:33.093522 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-config-data\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:33 crc kubenswrapper[4790]: I0406 12:18:33.101284 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-scripts\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:33 crc kubenswrapper[4790]: I0406 12:18:33.102383 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-fernet-keys\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:33 crc kubenswrapper[4790]: I0406 12:18:33.102535 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-combined-ca-bundle\") pod \"keystone-bootstrap-42m6m\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:33 crc kubenswrapper[4790]: I0406 12:18:33.127641 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnhw9\" (UniqueName: \"kubernetes.io/projected/570ffea2-75f1-4831-98d3-50b826b1e37d-kube-api-access-rnhw9\") pod \"keystone-bootstrap-42m6m\" (UID: 
\"570ffea2-75f1-4831-98d3-50b826b1e37d\") " pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:33 crc kubenswrapper[4790]: I0406 12:18:33.167605 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:33 crc kubenswrapper[4790]: I0406 12:18:33.684600 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da1a10aa-eab1-477b-bcca-34d62b163b2d" path="/var/lib/kubelet/pods/da1a10aa-eab1-477b-bcca-34d62b163b2d/volumes" Apr 06 12:18:37 crc kubenswrapper[4790]: I0406 12:18:37.810674 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" podUID="aa1d242d-8c20-4abe-a611-a6ab2e4b1021" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.158:5353: i/o timeout" Apr 06 12:18:39 crc kubenswrapper[4790]: I0406 12:18:39.753175 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:18:39 crc kubenswrapper[4790]: I0406 12:18:39.753500 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:18:39 crc kubenswrapper[4790]: I0406 12:18:39.753550 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 12:18:39 crc kubenswrapper[4790]: I0406 12:18:39.754181 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"cff05230f5546b5cc2b0d23c73b2cbf6f7260f5baa2011041047c1d9270e1dc6"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 12:18:39 crc kubenswrapper[4790]: I0406 12:18:39.754253 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://cff05230f5546b5cc2b0d23c73b2cbf6f7260f5baa2011041047c1d9270e1dc6" gracePeriod=600 Apr 06 12:18:40 crc kubenswrapper[4790]: I0406 12:18:40.238076 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="cff05230f5546b5cc2b0d23c73b2cbf6f7260f5baa2011041047c1d9270e1dc6" exitCode=0 Apr 06 12:18:40 crc kubenswrapper[4790]: I0406 12:18:40.238121 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"cff05230f5546b5cc2b0d23c73b2cbf6f7260f5baa2011041047c1d9270e1dc6"} Apr 06 12:18:40 crc kubenswrapper[4790]: I0406 12:18:40.238162 4790 scope.go:117] "RemoveContainer" containerID="06fc909b8c3bafca8e931d4d1184054009f97417e48b2959e6ed15949f791d43" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.449206 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.457443 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.474070 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.475099 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611148 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rsn9\" (UniqueName: \"kubernetes.io/projected/2c6adc2f-5636-41aa-ac84-2471261f3a19-kube-api-access-7rsn9\") pod \"2c6adc2f-5636-41aa-ac84-2471261f3a19\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611251 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp6lv\" (UniqueName: \"kubernetes.io/projected/ccc30661-4ea6-4218-b75d-b59f473d41bc-kube-api-access-xp6lv\") pod \"ccc30661-4ea6-4218-b75d-b59f473d41bc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611299 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c6adc2f-5636-41aa-ac84-2471261f3a19-logs\") pod \"2c6adc2f-5636-41aa-ac84-2471261f3a19\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611323 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-db-sync-config-data\") pod \"ccc30661-4ea6-4218-b75d-b59f473d41bc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611344 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/142e9f6e-49b8-4182-917a-031985b1dfc7-logs\") pod \"142e9f6e-49b8-4182-917a-031985b1dfc7\" (UID: 
\"142e9f6e-49b8-4182-917a-031985b1dfc7\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611365 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"142e9f6e-49b8-4182-917a-031985b1dfc7\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611389 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvq5r\" (UniqueName: \"kubernetes.io/projected/142e9f6e-49b8-4182-917a-031985b1dfc7-kube-api-access-vvq5r\") pod \"142e9f6e-49b8-4182-917a-031985b1dfc7\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611415 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-combined-ca-bundle\") pod \"ccc30661-4ea6-4218-b75d-b59f473d41bc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611464 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-config-data\") pod \"2c6adc2f-5636-41aa-ac84-2471261f3a19\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611498 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-ovsdbserver-sb\") pod \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611536 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-config-data\") pod \"ccc30661-4ea6-4218-b75d-b59f473d41bc\" (UID: \"ccc30661-4ea6-4218-b75d-b59f473d41bc\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611563 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c6adc2f-5636-41aa-ac84-2471261f3a19-httpd-run\") pod \"2c6adc2f-5636-41aa-ac84-2471261f3a19\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611593 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-dns-swift-storage-0\") pod \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611625 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-combined-ca-bundle\") pod \"142e9f6e-49b8-4182-917a-031985b1dfc7\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611638 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c6adc2f-5636-41aa-ac84-2471261f3a19-logs" (OuterVolumeSpecName: "logs") pod "2c6adc2f-5636-41aa-ac84-2471261f3a19" (UID: "2c6adc2f-5636-41aa-ac84-2471261f3a19"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611671 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-scripts\") pod \"142e9f6e-49b8-4182-917a-031985b1dfc7\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611709 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"2c6adc2f-5636-41aa-ac84-2471261f3a19\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611736 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv5x2\" (UniqueName: \"kubernetes.io/projected/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-kube-api-access-lv5x2\") pod \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611765 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-internal-tls-certs\") pod \"2c6adc2f-5636-41aa-ac84-2471261f3a19\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611788 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/142e9f6e-49b8-4182-917a-031985b1dfc7-httpd-run\") pod \"142e9f6e-49b8-4182-917a-031985b1dfc7\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611813 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-scripts\") pod \"2c6adc2f-5636-41aa-ac84-2471261f3a19\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611865 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-combined-ca-bundle\") pod \"2c6adc2f-5636-41aa-ac84-2471261f3a19\" (UID: \"2c6adc2f-5636-41aa-ac84-2471261f3a19\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611912 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-config\") pod \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611937 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-ovsdbserver-nb\") pod \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.611975 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-config-data\") pod \"142e9f6e-49b8-4182-917a-031985b1dfc7\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.612009 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-dns-svc\") pod \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\" (UID: \"aa1d242d-8c20-4abe-a611-a6ab2e4b1021\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.612033 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-public-tls-certs\") pod \"142e9f6e-49b8-4182-917a-031985b1dfc7\" (UID: \"142e9f6e-49b8-4182-917a-031985b1dfc7\") " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.612232 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/142e9f6e-49b8-4182-917a-031985b1dfc7-logs" (OuterVolumeSpecName: "logs") pod "142e9f6e-49b8-4182-917a-031985b1dfc7" (UID: "142e9f6e-49b8-4182-917a-031985b1dfc7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.612394 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c6adc2f-5636-41aa-ac84-2471261f3a19-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.612411 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/142e9f6e-49b8-4182-917a-031985b1dfc7-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.616435 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccc30661-4ea6-4218-b75d-b59f473d41bc-kube-api-access-xp6lv" (OuterVolumeSpecName: "kube-api-access-xp6lv") pod "ccc30661-4ea6-4218-b75d-b59f473d41bc" (UID: "ccc30661-4ea6-4218-b75d-b59f473d41bc"). InnerVolumeSpecName "kube-api-access-xp6lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.616842 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c6adc2f-5636-41aa-ac84-2471261f3a19-kube-api-access-7rsn9" (OuterVolumeSpecName: "kube-api-access-7rsn9") pod "2c6adc2f-5636-41aa-ac84-2471261f3a19" (UID: "2c6adc2f-5636-41aa-ac84-2471261f3a19"). 
InnerVolumeSpecName "kube-api-access-7rsn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.617429 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ccc30661-4ea6-4218-b75d-b59f473d41bc" (UID: "ccc30661-4ea6-4218-b75d-b59f473d41bc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.617936 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "142e9f6e-49b8-4182-917a-031985b1dfc7" (UID: "142e9f6e-49b8-4182-917a-031985b1dfc7"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.617981 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/142e9f6e-49b8-4182-917a-031985b1dfc7-kube-api-access-vvq5r" (OuterVolumeSpecName: "kube-api-access-vvq5r") pod "142e9f6e-49b8-4182-917a-031985b1dfc7" (UID: "142e9f6e-49b8-4182-917a-031985b1dfc7"). InnerVolumeSpecName "kube-api-access-vvq5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.618193 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/142e9f6e-49b8-4182-917a-031985b1dfc7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "142e9f6e-49b8-4182-917a-031985b1dfc7" (UID: "142e9f6e-49b8-4182-917a-031985b1dfc7"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.618201 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c6adc2f-5636-41aa-ac84-2471261f3a19-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2c6adc2f-5636-41aa-ac84-2471261f3a19" (UID: "2c6adc2f-5636-41aa-ac84-2471261f3a19"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.619046 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-scripts" (OuterVolumeSpecName: "scripts") pod "2c6adc2f-5636-41aa-ac84-2471261f3a19" (UID: "2c6adc2f-5636-41aa-ac84-2471261f3a19"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.620913 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-scripts" (OuterVolumeSpecName: "scripts") pod "142e9f6e-49b8-4182-917a-031985b1dfc7" (UID: "142e9f6e-49b8-4182-917a-031985b1dfc7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.641456 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-kube-api-access-lv5x2" (OuterVolumeSpecName: "kube-api-access-lv5x2") pod "aa1d242d-8c20-4abe-a611-a6ab2e4b1021" (UID: "aa1d242d-8c20-4abe-a611-a6ab2e4b1021"). InnerVolumeSpecName "kube-api-access-lv5x2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.642081 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "2c6adc2f-5636-41aa-ac84-2471261f3a19" (UID: "2c6adc2f-5636-41aa-ac84-2471261f3a19"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.655681 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "142e9f6e-49b8-4182-917a-031985b1dfc7" (UID: "142e9f6e-49b8-4182-917a-031985b1dfc7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.688255 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "142e9f6e-49b8-4182-917a-031985b1dfc7" (UID: "142e9f6e-49b8-4182-917a-031985b1dfc7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.702256 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-config" (OuterVolumeSpecName: "config") pod "aa1d242d-8c20-4abe-a611-a6ab2e4b1021" (UID: "aa1d242d-8c20-4abe-a611-a6ab2e4b1021"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.704599 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccc30661-4ea6-4218-b75d-b59f473d41bc" (UID: "ccc30661-4ea6-4218-b75d-b59f473d41bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.717049 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c6adc2f-5636-41aa-ac84-2471261f3a19-httpd-run\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.717291 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.717360 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.717449 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.717565 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv5x2\" (UniqueName: \"kubernetes.io/projected/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-kube-api-access-lv5x2\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.718002 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/142e9f6e-49b8-4182-917a-031985b1dfc7-httpd-run\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.718094 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.718166 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.718239 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.718341 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rsn9\" (UniqueName: \"kubernetes.io/projected/2c6adc2f-5636-41aa-ac84-2471261f3a19-kube-api-access-7rsn9\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.718435 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp6lv\" (UniqueName: \"kubernetes.io/projected/ccc30661-4ea6-4218-b75d-b59f473d41bc-kube-api-access-xp6lv\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.718539 4790 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.718643 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Apr 06 12:18:42 crc 
kubenswrapper[4790]: I0406 12:18:42.718771 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvq5r\" (UniqueName: \"kubernetes.io/projected/142e9f6e-49b8-4182-917a-031985b1dfc7-kube-api-access-vvq5r\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.718949 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.718482 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-config-data" (OuterVolumeSpecName: "config-data") pod "2c6adc2f-5636-41aa-ac84-2471261f3a19" (UID: "2c6adc2f-5636-41aa-ac84-2471261f3a19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.721567 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c6adc2f-5636-41aa-ac84-2471261f3a19" (UID: "2c6adc2f-5636-41aa-ac84-2471261f3a19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.722426 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aa1d242d-8c20-4abe-a611-a6ab2e4b1021" (UID: "aa1d242d-8c20-4abe-a611-a6ab2e4b1021"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.722460 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-config-data" (OuterVolumeSpecName: "config-data") pod "ccc30661-4ea6-4218-b75d-b59f473d41bc" (UID: "ccc30661-4ea6-4218-b75d-b59f473d41bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.724581 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa1d242d-8c20-4abe-a611-a6ab2e4b1021" (UID: "aa1d242d-8c20-4abe-a611-a6ab2e4b1021"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.731143 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa1d242d-8c20-4abe-a611-a6ab2e4b1021" (UID: "aa1d242d-8c20-4abe-a611-a6ab2e4b1021"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.731767 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-config-data" (OuterVolumeSpecName: "config-data") pod "142e9f6e-49b8-4182-917a-031985b1dfc7" (UID: "142e9f6e-49b8-4182-917a-031985b1dfc7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.741631 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.742477 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa1d242d-8c20-4abe-a611-a6ab2e4b1021" (UID: "aa1d242d-8c20-4abe-a611-a6ab2e4b1021"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.744397 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.753930 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2c6adc2f-5636-41aa-ac84-2471261f3a19" (UID: "2c6adc2f-5636-41aa-ac84-2471261f3a19"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.811554 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" podUID="aa1d242d-8c20-4abe-a611-a6ab2e4b1021" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.158:5353: i/o timeout" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.821118 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.821152 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142e9f6e-49b8-4182-917a-031985b1dfc7-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.821161 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.821170 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.821181 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.821190 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.821198 4790 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccc30661-4ea6-4218-b75d-b59f473d41bc-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.821207 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa1d242d-8c20-4abe-a611-a6ab2e4b1021-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.821238 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.821246 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:42 crc kubenswrapper[4790]: I0406 12:18:42.821254 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6adc2f-5636-41aa-ac84-2471261f3a19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.263110 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"142e9f6e-49b8-4182-917a-031985b1dfc7","Type":"ContainerDied","Data":"5c84633c2a64c3ee9401584316d65cf68ac29f0a358155e6c7d0ba987b62c81e"} Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.263148 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.266139 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2c6adc2f-5636-41aa-ac84-2471261f3a19","Type":"ContainerDied","Data":"f13e1aacf015d12009c46ec73b48392d438e3d0ca2cf01d3d43bebc4e71d8532"} Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.266173 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.267940 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8r6lc" event={"ID":"ccc30661-4ea6-4218-b75d-b59f473d41bc","Type":"ContainerDied","Data":"c9164921f59172ba03f91c9c8c8549499bb016d490e45a77b50fe03feaa95c85"} Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.267968 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9164921f59172ba03f91c9c8c8549499bb016d490e45a77b50fe03feaa95c85" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.267971 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-8r6lc" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.270856 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" event={"ID":"aa1d242d-8c20-4abe-a611-a6ab2e4b1021","Type":"ContainerDied","Data":"dda4b6d404d91ae43f62b568466cef8a4bef351a089855786209f4ca7aedd13a"} Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.270923 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ccfc598b5-7ftkm" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.372886 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.387642 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.408066 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Apr 06 12:18:43 crc kubenswrapper[4790]: E0406 12:18:43.408429 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc30661-4ea6-4218-b75d-b59f473d41bc" containerName="watcher-db-sync" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.408445 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc30661-4ea6-4218-b75d-b59f473d41bc" containerName="watcher-db-sync" Apr 06 12:18:43 crc kubenswrapper[4790]: E0406 12:18:43.408475 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa1d242d-8c20-4abe-a611-a6ab2e4b1021" containerName="init" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.408483 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa1d242d-8c20-4abe-a611-a6ab2e4b1021" containerName="init" Apr 06 12:18:43 crc kubenswrapper[4790]: E0406 12:18:43.408495 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa1d242d-8c20-4abe-a611-a6ab2e4b1021" containerName="dnsmasq-dns" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.408502 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa1d242d-8c20-4abe-a611-a6ab2e4b1021" containerName="dnsmasq-dns" Apr 06 12:18:43 crc kubenswrapper[4790]: E0406 12:18:43.408513 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="142e9f6e-49b8-4182-917a-031985b1dfc7" containerName="glance-log" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.408519 4790 
state_mem.go:107] "Deleted CPUSet assignment" podUID="142e9f6e-49b8-4182-917a-031985b1dfc7" containerName="glance-log" Apr 06 12:18:43 crc kubenswrapper[4790]: E0406 12:18:43.408530 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6adc2f-5636-41aa-ac84-2471261f3a19" containerName="glance-httpd" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.408537 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6adc2f-5636-41aa-ac84-2471261f3a19" containerName="glance-httpd" Apr 06 12:18:43 crc kubenswrapper[4790]: E0406 12:18:43.408553 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="142e9f6e-49b8-4182-917a-031985b1dfc7" containerName="glance-httpd" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.408561 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="142e9f6e-49b8-4182-917a-031985b1dfc7" containerName="glance-httpd" Apr 06 12:18:43 crc kubenswrapper[4790]: E0406 12:18:43.408574 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6adc2f-5636-41aa-ac84-2471261f3a19" containerName="glance-log" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.408581 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6adc2f-5636-41aa-ac84-2471261f3a19" containerName="glance-log" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.408756 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="142e9f6e-49b8-4182-917a-031985b1dfc7" containerName="glance-log" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.408782 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c6adc2f-5636-41aa-ac84-2471261f3a19" containerName="glance-log" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.408797 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c6adc2f-5636-41aa-ac84-2471261f3a19" containerName="glance-httpd" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.408809 4790 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="142e9f6e-49b8-4182-917a-031985b1dfc7" containerName="glance-httpd" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.408843 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc30661-4ea6-4218-b75d-b59f473d41bc" containerName="watcher-db-sync" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.408860 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa1d242d-8c20-4abe-a611-a6ab2e4b1021" containerName="dnsmasq-dns" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.409779 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.415649 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.416060 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4xzjn" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.416193 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.416430 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.419000 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccfc598b5-7ftkm"] Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.434454 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ccfc598b5-7ftkm"] Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.444806 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.455907 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.466349 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.480132 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.482256 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.484222 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.499156 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.499928 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.530870 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.530934 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-scripts\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.530961 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6gjs\" (UniqueName: \"kubernetes.io/projected/b7fdfb74-c76e-4351-bf3e-64e035e6853f-kube-api-access-z6gjs\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.531086 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7fdfb74-c76e-4351-bf3e-64e035e6853f-logs\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.531152 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.531201 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7fdfb74-c76e-4351-bf3e-64e035e6853f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.531266 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-config-data\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.531302 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.632444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.632493 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.632521 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-config-data\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.632546 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.632600 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.632634 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.632674 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-logs\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.632724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-scripts\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.632742 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.632760 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z6gjs\" (UniqueName: \"kubernetes.io/projected/b7fdfb74-c76e-4351-bf3e-64e035e6853f-kube-api-access-z6gjs\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.632793 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7fdfb74-c76e-4351-bf3e-64e035e6853f-logs\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.632815 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk9zp\" (UniqueName: \"kubernetes.io/projected/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-kube-api-access-rk9zp\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.632902 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.633253 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7fdfb74-c76e-4351-bf3e-64e035e6853f-logs\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.633519 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.633632 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.633666 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7fdfb74-c76e-4351-bf3e-64e035e6853f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.634171 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7fdfb74-c76e-4351-bf3e-64e035e6853f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.634535 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.638867 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-config-data\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.640233 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.642744 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.642799 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-scripts\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.644849 4790 scope.go:117] "RemoveContainer" containerID="d0bb1eac34dd41488f67b9f33d3c19f7cd5eda8351b37f05dade76ab92c5ef05" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.684031 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6gjs\" (UniqueName: \"kubernetes.io/projected/b7fdfb74-c76e-4351-bf3e-64e035e6853f-kube-api-access-z6gjs\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.708767 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") " pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.712648 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="142e9f6e-49b8-4182-917a-031985b1dfc7" path="/var/lib/kubelet/pods/142e9f6e-49b8-4182-917a-031985b1dfc7/volumes" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.713708 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c6adc2f-5636-41aa-ac84-2471261f3a19" path="/var/lib/kubelet/pods/2c6adc2f-5636-41aa-ac84-2471261f3a19/volumes" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.715773 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa1d242d-8c20-4abe-a611-a6ab2e4b1021" path="/var/lib/kubelet/pods/aa1d242d-8c20-4abe-a611-a6ab2e4b1021/volumes" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.728307 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.736069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.736145 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-logs\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.736181 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.736240 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk9zp\" (UniqueName: \"kubernetes.io/projected/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-kube-api-access-rk9zp\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.736271 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" 
Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.736345 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.736398 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.736433 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.737072 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.740452 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-logs\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.741164 4790 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.741898 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.753262 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.758207 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.765057 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.773938 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.779687 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.782715 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.782839 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-g7jzm" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.783070 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.805027 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk9zp\" (UniqueName: \"kubernetes.io/projected/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-kube-api-access-rk9zp\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: E0406 12:18:43.811026 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.94:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Apr 06 12:18:43 crc kubenswrapper[4790]: E0406 12:18:43.811075 4790 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.94:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Apr 06 12:18:43 crc kubenswrapper[4790]: E0406 12:18:43.812535 4790 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:cinder-db-sync,Image:38.102.83.94:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dfsn4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SEL
inuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6crt4_openstack(25adb66a-8db5-48e1-b05c-526008a22e4f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.815444 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Apr 06 12:18:43 crc kubenswrapper[4790]: E0406 12:18:43.817154 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6crt4" podUID="25adb66a-8db5-48e1-b05c-526008a22e4f" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.831663 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.833357 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.908685 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.926276 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.932818 4790 scope.go:117] "RemoveContainer" containerID="5dfeb517b5a198aef4052864be63fb870699e3dee7f662afcfbdaa68af92f878" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.947193 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwbwr\" (UniqueName: \"kubernetes.io/projected/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-kube-api-access-dwbwr\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.947449 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-config-data\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.949635 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60a0169-fd9c-4677-92e1-a09453bea104-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: 
\"b60a0169-fd9c-4677-92e1-a09453bea104\") " pod="openstack/watcher-applier-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.949689 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.949729 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b60a0169-fd9c-4677-92e1-a09453bea104-logs\") pod \"watcher-applier-0\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " pod="openstack/watcher-applier-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.949843 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60a0169-fd9c-4677-92e1-a09453bea104-config-data\") pod \"watcher-applier-0\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " pod="openstack/watcher-applier-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.949890 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlcxr\" (UniqueName: \"kubernetes.io/projected/b60a0169-fd9c-4677-92e1-a09453bea104-kube-api-access-jlcxr\") pod \"watcher-applier-0\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " pod="openstack/watcher-applier-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.949915 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 06 
12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.949944 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-logs\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.970699 4790 scope.go:117] "RemoveContainer" containerID="8ed8476757b8f5d820a2834f300d930e994e1844e6295ac0b225f0f503479647" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.983178 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.984447 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.985952 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Apr 06 12:18:43 crc kubenswrapper[4790]: I0406 12:18:43.996351 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.022343 4790 scope.go:117] "RemoveContainer" containerID="6510e79ef493c973633446db8d83d7f5276ff2b2505be1414b03d32e9d44cf25" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.051663 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-config-data\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.052035 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b60a0169-fd9c-4677-92e1-a09453bea104-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " pod="openstack/watcher-applier-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.052058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.052101 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b60a0169-fd9c-4677-92e1-a09453bea104-logs\") pod \"watcher-applier-0\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " pod="openstack/watcher-applier-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.052131 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.052178 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hsp4\" (UniqueName: \"kubernetes.io/projected/bbabffd5-cc02-4c40-adec-25c11548ed44-kube-api-access-4hsp4\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.052203 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60a0169-fd9c-4677-92e1-a09453bea104-config-data\") pod 
\"watcher-applier-0\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " pod="openstack/watcher-applier-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.052235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-config-data\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.052261 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.052281 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlcxr\" (UniqueName: \"kubernetes.io/projected/b60a0169-fd9c-4677-92e1-a09453bea104-kube-api-access-jlcxr\") pod \"watcher-applier-0\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " pod="openstack/watcher-applier-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.052318 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.052338 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-logs\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 
06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.052530 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbabffd5-cc02-4c40-adec-25c11548ed44-logs\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.052658 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwbwr\" (UniqueName: \"kubernetes.io/projected/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-kube-api-access-dwbwr\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.053215 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-logs\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.053773 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b60a0169-fd9c-4677-92e1-a09453bea104-logs\") pod \"watcher-applier-0\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " pod="openstack/watcher-applier-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.057013 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60a0169-fd9c-4677-92e1-a09453bea104-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " pod="openstack/watcher-applier-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.059046 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b60a0169-fd9c-4677-92e1-a09453bea104-config-data\") pod \"watcher-applier-0\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " pod="openstack/watcher-applier-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.060496 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.062974 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-config-data\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.063818 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.098230 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwbwr\" (UniqueName: \"kubernetes.io/projected/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-kube-api-access-dwbwr\") pod \"watcher-api-0\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " pod="openstack/watcher-api-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.105848 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.111489 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlcxr\" (UniqueName: \"kubernetes.io/projected/b60a0169-fd9c-4677-92e1-a09453bea104-kube-api-access-jlcxr\") pod \"watcher-applier-0\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " pod="openstack/watcher-applier-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.113053 4790 scope.go:117] "RemoveContainer" containerID="636c57740cffa70333e10685c6600e863f7313a4fa22aaa7ffe80356cccfe54c" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.154847 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.154926 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hsp4\" (UniqueName: \"kubernetes.io/projected/bbabffd5-cc02-4c40-adec-25c11548ed44-kube-api-access-4hsp4\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.154959 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-config-data\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.154983 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.155049 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbabffd5-cc02-4c40-adec-25c11548ed44-logs\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.160522 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbabffd5-cc02-4c40-adec-25c11548ed44-logs\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.172022 4790 scope.go:117] "RemoveContainer" containerID="c6344137b881df201d9237ec232e7142aecdf0bffea3a8ecdfaa3d07d35b4620" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.172360 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.172457 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.172975 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-config-data\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.176567 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hsp4\" (UniqueName: \"kubernetes.io/projected/bbabffd5-cc02-4c40-adec-25c11548ed44-kube-api-access-4hsp4\") pod \"watcher-decision-engine-0\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.179701 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.219648 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.286694 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-r77xt" event={"ID":"60672a42-148c-45e2-99a8-da1ebec40dbb","Type":"ContainerStarted","Data":"0587a7d5856dbe85e53580dd53de6f5db18363f7057a531df9afa2d63d5134a8"} Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.317065 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.331274 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"b2372d9f6a301cd0dfdb4fc8fc1d01c6ccf4e782967660b9a51b1c3386dff103"} Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.331283 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-r77xt" podStartSLOduration=1.956432537 podStartE2EDuration="24.331269947s" podCreationTimestamp="2026-04-06 12:18:20 +0000 UTC" firstStartedPulling="2026-04-06 12:18:21.311578561 +0000 UTC m=+1280.299321427" lastFinishedPulling="2026-04-06 12:18:43.686415961 +0000 UTC m=+1302.674158837" observedRunningTime="2026-04-06 12:18:44.325289826 +0000 UTC m=+1303.313032692" watchObservedRunningTime="2026-04-06 12:18:44.331269947 +0000 UTC m=+1303.319012813" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.341474 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-h5v54" event={"ID":"e4c585de-2005-495a-a987-8cfe70dc8793","Type":"ContainerStarted","Data":"c473e7b7241a513ca98ed37ddc30240de019d042f4f717ca35bc4e4c51833b0a"} Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.405205 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa0dc22e-42cc-4916-a598-99d4e2c99ae7","Type":"ContainerStarted","Data":"e3ec0447b2418d52ffc1d0c8e34fae15b1ddc5bb33bceb5270c52ad589189298"} Apr 06 12:18:44 crc kubenswrapper[4790]: E0406 12:18:44.419422 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.94:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-6crt4" 
podUID="25adb66a-8db5-48e1-b05c-526008a22e4f" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.435376 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-h5v54" podStartSLOduration=2.089038981 podStartE2EDuration="24.435360952s" podCreationTimestamp="2026-04-06 12:18:20 +0000 UTC" firstStartedPulling="2026-04-06 12:18:21.30964172 +0000 UTC m=+1280.297384586" lastFinishedPulling="2026-04-06 12:18:43.655963681 +0000 UTC m=+1302.643706557" observedRunningTime="2026-04-06 12:18:44.386160576 +0000 UTC m=+1303.373903432" watchObservedRunningTime="2026-04-06 12:18:44.435360952 +0000 UTC m=+1303.423103818" Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.476935 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-42m6m"] Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.602356 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Apr 06 12:18:44 crc kubenswrapper[4790]: I0406 12:18:44.789727 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.052228 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.126192 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.134453 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 06 12:18:45 crc kubenswrapper[4790]: W0406 12:18:45.188533 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b5858b3_3d3c_4289_bd5d_79c34c777cfd.slice/crio-ecbc051ce53108c3ee1d425df8b86c8d52022f8a4effffac4c891c7169901c4c WatchSource:0}: Error finding container 
ecbc051ce53108c3ee1d425df8b86c8d52022f8a4effffac4c891c7169901c4c: Status 404 returned error can't find the container with id ecbc051ce53108c3ee1d425df8b86c8d52022f8a4effffac4c891c7169901c4c Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.423335 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-42m6m" event={"ID":"570ffea2-75f1-4831-98d3-50b826b1e37d","Type":"ContainerStarted","Data":"5806040a576ff1fbeb726114ee548b4e7f93b9bb2d086feba510bba5c41bbea5"} Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.423655 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-42m6m" event={"ID":"570ffea2-75f1-4831-98d3-50b826b1e37d","Type":"ContainerStarted","Data":"86d395dd1866234cafd384ba1f39ae1b6e659dbb3a42004a7411fc10ffe9efdd"} Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.427602 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b5858b3-3d3c-4289-bd5d-79c34c777cfd","Type":"ContainerStarted","Data":"ecbc051ce53108c3ee1d425df8b86c8d52022f8a4effffac4c891c7169901c4c"} Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.429707 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"b60a0169-fd9c-4677-92e1-a09453bea104","Type":"ContainerStarted","Data":"2ff8ddf6ef5d52fcb2fc99c1215c9e47621bfd4aad0de8da6953b85be94ee520"} Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.431635 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b7fdfb74-c76e-4351-bf3e-64e035e6853f","Type":"ContainerStarted","Data":"2fdfeeb179d32149d9c703778f94b6f3b341ac938354cfd03ba5bd84ed79fbb1"} Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.446443 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-42m6m" podStartSLOduration=13.446424658 podStartE2EDuration="13.446424658s" 
podCreationTimestamp="2026-04-06 12:18:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:45.440175289 +0000 UTC m=+1304.427918145" watchObservedRunningTime="2026-04-06 12:18:45.446424658 +0000 UTC m=+1304.434167524" Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.447173 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60","Type":"ContainerStarted","Data":"c448c3fb360b405254cdd2a776c58a63e6c644902dc0011b721e71be59960e46"} Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.447206 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60","Type":"ContainerStarted","Data":"83288a99ab095dbe7b467050cf92654092fe49129052c4e8984bfc355883af90"} Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.447216 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60","Type":"ContainerStarted","Data":"bd7e2e816f7f7d7700c0256e1213b1291499a151ee074b2625ed01ab8b6511df"} Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.447519 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.449763 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.181:9322/\": dial tcp 10.217.0.181:9322: connect: connection refused" Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.453460 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"bbabffd5-cc02-4c40-adec-25c11548ed44","Type":"ContainerStarted","Data":"9e2e46c72f21b2c5702c75e5df201bc0002b8c687efef600f4cdddb10bddda76"} Apr 06 12:18:45 crc kubenswrapper[4790]: I0406 12:18:45.474929 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.474912695 podStartE2EDuration="2.474912695s" podCreationTimestamp="2026-04-06 12:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:45.469605272 +0000 UTC m=+1304.457348138" watchObservedRunningTime="2026-04-06 12:18:45.474912695 +0000 UTC m=+1304.462655561" Apr 06 12:18:46 crc kubenswrapper[4790]: I0406 12:18:46.468909 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b7fdfb74-c76e-4351-bf3e-64e035e6853f","Type":"ContainerStarted","Data":"99c93e387d9924d9746b1f6d7dc72e0ec9e049b5f0e791e086154c30a3dc4aa2"} Apr 06 12:18:46 crc kubenswrapper[4790]: I0406 12:18:46.470720 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b5858b3-3d3c-4289-bd5d-79c34c777cfd","Type":"ContainerStarted","Data":"bc308ee8c4bf787e7b9b55ae190a8a8be2d4ffcd98f5b0f51c3293eb263df632"} Apr 06 12:18:48 crc kubenswrapper[4790]: I0406 12:18:48.490219 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b5858b3-3d3c-4289-bd5d-79c34c777cfd","Type":"ContainerStarted","Data":"f13b247bd3378f934a4183f4ff128203e79a5ee93c4a41b1b1c4cbc535784f07"} Apr 06 12:18:48 crc kubenswrapper[4790]: I0406 12:18:48.491793 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"b60a0169-fd9c-4677-92e1-a09453bea104","Type":"ContainerStarted","Data":"4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0"} Apr 06 12:18:48 crc kubenswrapper[4790]: 
I0406 12:18:48.494976 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b7fdfb74-c76e-4351-bf3e-64e035e6853f","Type":"ContainerStarted","Data":"e8120589c7be71615c7d37a940f2c82811c6f3ddfa72a74d0e492cae992d19f7"} Apr 06 12:18:48 crc kubenswrapper[4790]: I0406 12:18:48.497149 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa0dc22e-42cc-4916-a598-99d4e2c99ae7","Type":"ContainerStarted","Data":"eb3ed014da04a04e2d7acb6c57264f86b56aac0d3e2c77e1a3d9f84516c80a34"} Apr 06 12:18:48 crc kubenswrapper[4790]: I0406 12:18:48.499180 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"bbabffd5-cc02-4c40-adec-25c11548ed44","Type":"ContainerStarted","Data":"e894173738cccc6cbb9d1d32f79ff33799febb4b16a5bb2a1692363023807b22"} Apr 06 12:18:48 crc kubenswrapper[4790]: I0406 12:18:48.527167 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.527149904 podStartE2EDuration="5.527149904s" podCreationTimestamp="2026-04-06 12:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:48.515693835 +0000 UTC m=+1307.503436701" watchObservedRunningTime="2026-04-06 12:18:48.527149904 +0000 UTC m=+1307.514892770" Apr 06 12:18:48 crc kubenswrapper[4790]: I0406 12:18:48.537214 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.537191585 podStartE2EDuration="5.537191585s" podCreationTimestamp="2026-04-06 12:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:48.535135439 +0000 UTC m=+1307.522878305" watchObservedRunningTime="2026-04-06 12:18:48.537191585 
+0000 UTC m=+1307.524934451" Apr 06 12:18:48 crc kubenswrapper[4790]: I0406 12:18:48.562911 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=3.427642528 podStartE2EDuration="5.562893827s" podCreationTimestamp="2026-04-06 12:18:43 +0000 UTC" firstStartedPulling="2026-04-06 12:18:45.169712121 +0000 UTC m=+1304.157454987" lastFinishedPulling="2026-04-06 12:18:47.30496341 +0000 UTC m=+1306.292706286" observedRunningTime="2026-04-06 12:18:48.551437779 +0000 UTC m=+1307.539180655" watchObservedRunningTime="2026-04-06 12:18:48.562893827 +0000 UTC m=+1307.550636693" Apr 06 12:18:48 crc kubenswrapper[4790]: I0406 12:18:48.573311 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=3.329927555 podStartE2EDuration="5.573291387s" podCreationTimestamp="2026-04-06 12:18:43 +0000 UTC" firstStartedPulling="2026-04-06 12:18:45.052912894 +0000 UTC m=+1304.040655760" lastFinishedPulling="2026-04-06 12:18:47.296276726 +0000 UTC m=+1306.284019592" observedRunningTime="2026-04-06 12:18:48.565805216 +0000 UTC m=+1307.553548082" watchObservedRunningTime="2026-04-06 12:18:48.573291387 +0000 UTC m=+1307.561034253" Apr 06 12:18:48 crc kubenswrapper[4790]: I0406 12:18:48.878384 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Apr 06 12:18:49 crc kubenswrapper[4790]: I0406 12:18:49.180533 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Apr 06 12:18:49 crc kubenswrapper[4790]: I0406 12:18:49.220655 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Apr 06 12:18:49 crc kubenswrapper[4790]: I0406 12:18:49.514115 4790 generic.go:334] "Generic (PLEG): container finished" podID="570ffea2-75f1-4831-98d3-50b826b1e37d" 
containerID="5806040a576ff1fbeb726114ee548b4e7f93b9bb2d086feba510bba5c41bbea5" exitCode=0 Apr 06 12:18:49 crc kubenswrapper[4790]: I0406 12:18:49.514219 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-42m6m" event={"ID":"570ffea2-75f1-4831-98d3-50b826b1e37d","Type":"ContainerDied","Data":"5806040a576ff1fbeb726114ee548b4e7f93b9bb2d086feba510bba5c41bbea5"} Apr 06 12:18:49 crc kubenswrapper[4790]: I0406 12:18:49.516791 4790 generic.go:334] "Generic (PLEG): container finished" podID="60672a42-148c-45e2-99a8-da1ebec40dbb" containerID="0587a7d5856dbe85e53580dd53de6f5db18363f7057a531df9afa2d63d5134a8" exitCode=0 Apr 06 12:18:49 crc kubenswrapper[4790]: I0406 12:18:49.516858 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-r77xt" event={"ID":"60672a42-148c-45e2-99a8-da1ebec40dbb","Type":"ContainerDied","Data":"0587a7d5856dbe85e53580dd53de6f5db18363f7057a531df9afa2d63d5134a8"} Apr 06 12:18:50 crc kubenswrapper[4790]: I0406 12:18:50.528744 4790 generic.go:334] "Generic (PLEG): container finished" podID="e4c585de-2005-495a-a987-8cfe70dc8793" containerID="c473e7b7241a513ca98ed37ddc30240de019d042f4f717ca35bc4e4c51833b0a" exitCode=0 Apr 06 12:18:50 crc kubenswrapper[4790]: I0406 12:18:50.529071 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-h5v54" event={"ID":"e4c585de-2005-495a-a987-8cfe70dc8793","Type":"ContainerDied","Data":"c473e7b7241a513ca98ed37ddc30240de019d042f4f717ca35bc4e4c51833b0a"} Apr 06 12:18:51 crc kubenswrapper[4790]: I0406 12:18:51.540719 4790 generic.go:334] "Generic (PLEG): container finished" podID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerID="e894173738cccc6cbb9d1d32f79ff33799febb4b16a5bb2a1692363023807b22" exitCode=1 Apr 06 12:18:51 crc kubenswrapper[4790]: I0406 12:18:51.540860 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"bbabffd5-cc02-4c40-adec-25c11548ed44","Type":"ContainerDied","Data":"e894173738cccc6cbb9d1d32f79ff33799febb4b16a5bb2a1692363023807b22"} Apr 06 12:18:51 crc kubenswrapper[4790]: I0406 12:18:51.541667 4790 scope.go:117] "RemoveContainer" containerID="e894173738cccc6cbb9d1d32f79ff33799febb4b16a5bb2a1692363023807b22" Apr 06 12:18:51 crc kubenswrapper[4790]: I0406 12:18:51.545360 4790 generic.go:334] "Generic (PLEG): container finished" podID="e752bf76-0eff-4559-b599-6ab462cea81e" containerID="d13452abb359ca2338c1acedc317f38a7cf7097878ef34b2eaa3d8722957c828" exitCode=0 Apr 06 12:18:51 crc kubenswrapper[4790]: I0406 12:18:51.545411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wl5rj" event={"ID":"e752bf76-0eff-4559-b599-6ab462cea81e","Type":"ContainerDied","Data":"d13452abb359ca2338c1acedc317f38a7cf7097878ef34b2eaa3d8722957c828"} Apr 06 12:18:51 crc kubenswrapper[4790]: I0406 12:18:51.949341 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:51 crc kubenswrapper[4790]: I0406 12:18:51.984652 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:51 crc kubenswrapper[4790]: I0406 12:18:51.995248 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-h5v54" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.040717 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4c585de-2005-495a-a987-8cfe70dc8793-db-sync-config-data\") pod \"e4c585de-2005-495a-a987-8cfe70dc8793\" (UID: \"e4c585de-2005-495a-a987-8cfe70dc8793\") " Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.052877 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c585de-2005-495a-a987-8cfe70dc8793-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e4c585de-2005-495a-a987-8cfe70dc8793" (UID: "e4c585de-2005-495a-a987-8cfe70dc8793"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.143159 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnhw9\" (UniqueName: \"kubernetes.io/projected/570ffea2-75f1-4831-98d3-50b826b1e37d-kube-api-access-rnhw9\") pod \"570ffea2-75f1-4831-98d3-50b826b1e37d\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.143204 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-config-data\") pod \"60672a42-148c-45e2-99a8-da1ebec40dbb\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.143230 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-scripts\") pod \"570ffea2-75f1-4831-98d3-50b826b1e37d\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.143272 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-credential-keys\") pod \"570ffea2-75f1-4831-98d3-50b826b1e37d\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.143296 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvngs\" (UniqueName: \"kubernetes.io/projected/e4c585de-2005-495a-a987-8cfe70dc8793-kube-api-access-zvngs\") pod \"e4c585de-2005-495a-a987-8cfe70dc8793\" (UID: \"e4c585de-2005-495a-a987-8cfe70dc8793\") " Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.143319 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-combined-ca-bundle\") pod \"60672a42-148c-45e2-99a8-da1ebec40dbb\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.143342 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff42h\" (UniqueName: \"kubernetes.io/projected/60672a42-148c-45e2-99a8-da1ebec40dbb-kube-api-access-ff42h\") pod \"60672a42-148c-45e2-99a8-da1ebec40dbb\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.143360 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-scripts\") pod \"60672a42-148c-45e2-99a8-da1ebec40dbb\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.143376 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-combined-ca-bundle\") pod 
\"570ffea2-75f1-4831-98d3-50b826b1e37d\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.143415 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-config-data\") pod \"570ffea2-75f1-4831-98d3-50b826b1e37d\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.143439 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c585de-2005-495a-a987-8cfe70dc8793-combined-ca-bundle\") pod \"e4c585de-2005-495a-a987-8cfe70dc8793\" (UID: \"e4c585de-2005-495a-a987-8cfe70dc8793\") " Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.143468 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-fernet-keys\") pod \"570ffea2-75f1-4831-98d3-50b826b1e37d\" (UID: \"570ffea2-75f1-4831-98d3-50b826b1e37d\") " Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.143501 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60672a42-148c-45e2-99a8-da1ebec40dbb-logs\") pod \"60672a42-148c-45e2-99a8-da1ebec40dbb\" (UID: \"60672a42-148c-45e2-99a8-da1ebec40dbb\") " Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.143724 4790 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4c585de-2005-495a-a987-8cfe70dc8793-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.144323 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60672a42-148c-45e2-99a8-da1ebec40dbb-logs" (OuterVolumeSpecName: "logs") pod 
"60672a42-148c-45e2-99a8-da1ebec40dbb" (UID: "60672a42-148c-45e2-99a8-da1ebec40dbb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.150427 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "570ffea2-75f1-4831-98d3-50b826b1e37d" (UID: "570ffea2-75f1-4831-98d3-50b826b1e37d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.151034 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c585de-2005-495a-a987-8cfe70dc8793-kube-api-access-zvngs" (OuterVolumeSpecName: "kube-api-access-zvngs") pod "e4c585de-2005-495a-a987-8cfe70dc8793" (UID: "e4c585de-2005-495a-a987-8cfe70dc8793"). InnerVolumeSpecName "kube-api-access-zvngs". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.151701 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "570ffea2-75f1-4831-98d3-50b826b1e37d" (UID: "570ffea2-75f1-4831-98d3-50b826b1e37d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.151785 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60672a42-148c-45e2-99a8-da1ebec40dbb-kube-api-access-ff42h" (OuterVolumeSpecName: "kube-api-access-ff42h") pod "60672a42-148c-45e2-99a8-da1ebec40dbb" (UID: "60672a42-148c-45e2-99a8-da1ebec40dbb"). InnerVolumeSpecName "kube-api-access-ff42h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.151965 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/570ffea2-75f1-4831-98d3-50b826b1e37d-kube-api-access-rnhw9" (OuterVolumeSpecName: "kube-api-access-rnhw9") pod "570ffea2-75f1-4831-98d3-50b826b1e37d" (UID: "570ffea2-75f1-4831-98d3-50b826b1e37d"). InnerVolumeSpecName "kube-api-access-rnhw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.153080 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-scripts" (OuterVolumeSpecName: "scripts") pod "570ffea2-75f1-4831-98d3-50b826b1e37d" (UID: "570ffea2-75f1-4831-98d3-50b826b1e37d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.153851 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-scripts" (OuterVolumeSpecName: "scripts") pod "60672a42-148c-45e2-99a8-da1ebec40dbb" (UID: "60672a42-148c-45e2-99a8-da1ebec40dbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.173043 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-config-data" (OuterVolumeSpecName: "config-data") pod "570ffea2-75f1-4831-98d3-50b826b1e37d" (UID: "570ffea2-75f1-4831-98d3-50b826b1e37d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.173912 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "570ffea2-75f1-4831-98d3-50b826b1e37d" (UID: "570ffea2-75f1-4831-98d3-50b826b1e37d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.174009 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60672a42-148c-45e2-99a8-da1ebec40dbb" (UID: "60672a42-148c-45e2-99a8-da1ebec40dbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.174207 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-config-data" (OuterVolumeSpecName: "config-data") pod "60672a42-148c-45e2-99a8-da1ebec40dbb" (UID: "60672a42-148c-45e2-99a8-da1ebec40dbb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.181940 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c585de-2005-495a-a987-8cfe70dc8793-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4c585de-2005-495a-a987-8cfe70dc8793" (UID: "e4c585de-2005-495a-a987-8cfe70dc8793"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.244688 4790 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-credential-keys\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.244728 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvngs\" (UniqueName: \"kubernetes.io/projected/e4c585de-2005-495a-a987-8cfe70dc8793-kube-api-access-zvngs\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.244743 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.244753 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff42h\" (UniqueName: \"kubernetes.io/projected/60672a42-148c-45e2-99a8-da1ebec40dbb-kube-api-access-ff42h\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.244765 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.244775 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.244785 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 
12:18:52.244795 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c585de-2005-495a-a987-8cfe70dc8793-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.244805 4790 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-fernet-keys\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.244817 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60672a42-148c-45e2-99a8-da1ebec40dbb-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.244847 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnhw9\" (UniqueName: \"kubernetes.io/projected/570ffea2-75f1-4831-98d3-50b826b1e37d-kube-api-access-rnhw9\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.244858 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60672a42-148c-45e2-99a8-da1ebec40dbb-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.244868 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/570ffea2-75f1-4831-98d3-50b826b1e37d-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.558207 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa0dc22e-42cc-4916-a598-99d4e2c99ae7","Type":"ContainerStarted","Data":"c9aa44481bb148c149e9f2ec055ff375c2817cad5ce4db45971c08d7612d5b6c"} Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.561486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"bbabffd5-cc02-4c40-adec-25c11548ed44","Type":"ContainerStarted","Data":"05f826b4bcbbeefb4e6e03b3d3a348093c793ba2a4c816fcd856d15331df143f"} Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.573969 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-42m6m" event={"ID":"570ffea2-75f1-4831-98d3-50b826b1e37d","Type":"ContainerDied","Data":"86d395dd1866234cafd384ba1f39ae1b6e659dbb3a42004a7411fc10ffe9efdd"} Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.574165 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86d395dd1866234cafd384ba1f39ae1b6e659dbb3a42004a7411fc10ffe9efdd" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.574377 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-42m6m" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.579040 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-r77xt" event={"ID":"60672a42-148c-45e2-99a8-da1ebec40dbb","Type":"ContainerDied","Data":"70b156159aadd855633ec58ce7c4ea54bb3575eff1c4d04be6839a5943f18f5e"} Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.579080 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70b156159aadd855633ec58ce7c4ea54bb3575eff1c4d04be6839a5943f18f5e" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.579156 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-r77xt" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.595545 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-h5v54" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.596477 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-h5v54" event={"ID":"e4c585de-2005-495a-a987-8cfe70dc8793","Type":"ContainerDied","Data":"7ea73b328ae601d4ef9dc45b661d9a07281dacfc17ee4c3723d7e348a1960c8f"} Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.597419 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ea73b328ae601d4ef9dc45b661d9a07281dacfc17ee4c3723d7e348a1960c8f" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.902187 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79975678d5-blp5h"] Apr 06 12:18:52 crc kubenswrapper[4790]: E0406 12:18:52.902853 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="570ffea2-75f1-4831-98d3-50b826b1e37d" containerName="keystone-bootstrap" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.902869 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="570ffea2-75f1-4831-98d3-50b826b1e37d" containerName="keystone-bootstrap" Apr 06 12:18:52 crc kubenswrapper[4790]: E0406 12:18:52.902891 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c585de-2005-495a-a987-8cfe70dc8793" containerName="barbican-db-sync" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.902898 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c585de-2005-495a-a987-8cfe70dc8793" containerName="barbican-db-sync" Apr 06 12:18:52 crc kubenswrapper[4790]: E0406 12:18:52.902907 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60672a42-148c-45e2-99a8-da1ebec40dbb" containerName="placement-db-sync" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.902914 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="60672a42-148c-45e2-99a8-da1ebec40dbb" containerName="placement-db-sync" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 
12:18:52.903073 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="60672a42-148c-45e2-99a8-da1ebec40dbb" containerName="placement-db-sync" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.903122 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="570ffea2-75f1-4831-98d3-50b826b1e37d" containerName="keystone-bootstrap" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.903135 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c585de-2005-495a-a987-8cfe70dc8793" containerName="barbican-db-sync" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.904981 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.957606 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-config\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.957676 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-dns-swift-storage-0\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.957736 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-ovsdbserver-nb\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:52 crc 
kubenswrapper[4790]: I0406 12:18:52.957786 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94x5t\" (UniqueName: \"kubernetes.io/projected/0595f37c-6776-4055-833b-4d436c73604b-kube-api-access-94x5t\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.957805 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-ovsdbserver-sb\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.957850 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-dns-svc\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.959562 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-78f48bbdb8-86kwv"] Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.960983 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.967965 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.968128 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.968222 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-sqxqr" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.984008 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-67668d4bd9-69x7l"] Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.985585 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:52 crc kubenswrapper[4790]: I0406 12:18:52.986959 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.008338 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79975678d5-blp5h"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.019657 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78f48bbdb8-86kwv"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.049300 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-67668d4bd9-69x7l"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.053586 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wl5rj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.059899 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-config\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.059955 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-dns-swift-storage-0\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.060024 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-ovsdbserver-nb\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.060074 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94x5t\" (UniqueName: \"kubernetes.io/projected/0595f37c-6776-4055-833b-4d436c73604b-kube-api-access-94x5t\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.060098 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-ovsdbserver-sb\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " 
pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.060132 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-dns-svc\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.060973 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-dns-svc\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.061004 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-dns-swift-storage-0\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.061287 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-ovsdbserver-nb\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.061839 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-config\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.062321 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-ovsdbserver-sb\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.098614 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94x5t\" (UniqueName: \"kubernetes.io/projected/0595f37c-6776-4055-833b-4d436c73604b-kube-api-access-94x5t\") pod \"dnsmasq-dns-79975678d5-blp5h\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.168377 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e752bf76-0eff-4559-b599-6ab462cea81e-config\") pod \"e752bf76-0eff-4559-b599-6ab462cea81e\" (UID: \"e752bf76-0eff-4559-b599-6ab462cea81e\") " Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.168554 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8j8c\" (UniqueName: \"kubernetes.io/projected/e752bf76-0eff-4559-b599-6ab462cea81e-kube-api-access-d8j8c\") pod \"e752bf76-0eff-4559-b599-6ab462cea81e\" (UID: \"e752bf76-0eff-4559-b599-6ab462cea81e\") " Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.168577 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e752bf76-0eff-4559-b599-6ab462cea81e-combined-ca-bundle\") pod \"e752bf76-0eff-4559-b599-6ab462cea81e\" (UID: \"e752bf76-0eff-4559-b599-6ab462cea81e\") " Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.179296 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k4bb\" (UniqueName: 
\"kubernetes.io/projected/54b5a9a1-f344-4b6c-9001-87271da23bdc-kube-api-access-8k4bb\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.179387 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-combined-ca-bundle\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.179417 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54b5a9a1-f344-4b6c-9001-87271da23bdc-logs\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.179470 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-config-data-custom\") pod \"barbican-worker-67668d4bd9-69x7l\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.179517 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-config-data\") pod \"barbican-worker-67668d4bd9-69x7l\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 
12:18:53.179673 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-config-data\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.179818 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-config-data-custom\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.179890 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cf13bc8-7964-446c-8b42-f52e62da1ded-logs\") pod \"barbican-worker-67668d4bd9-69x7l\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.179939 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r224v\" (UniqueName: \"kubernetes.io/projected/6cf13bc8-7964-446c-8b42-f52e62da1ded-kube-api-access-r224v\") pod \"barbican-worker-67668d4bd9-69x7l\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.180003 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-combined-ca-bundle\") pod \"barbican-worker-67668d4bd9-69x7l\" (UID: 
\"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.180382 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e752bf76-0eff-4559-b599-6ab462cea81e-kube-api-access-d8j8c" (OuterVolumeSpecName: "kube-api-access-d8j8c") pod "e752bf76-0eff-4559-b599-6ab462cea81e" (UID: "e752bf76-0eff-4559-b599-6ab462cea81e"). InnerVolumeSpecName "kube-api-access-d8j8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.199464 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d77d6889d-9r858"] Apr 06 12:18:53 crc kubenswrapper[4790]: E0406 12:18:53.199908 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e752bf76-0eff-4559-b599-6ab462cea81e" containerName="neutron-db-sync" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.199924 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e752bf76-0eff-4559-b599-6ab462cea81e" containerName="neutron-db-sync" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.200094 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e752bf76-0eff-4559-b599-6ab462cea81e" containerName="neutron-db-sync" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.201237 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.203450 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.221286 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5f66dc7c8d-wd7td"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.223972 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.228511 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.229139 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qr248" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.229302 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.234376 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.235228 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.235400 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.240620 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e752bf76-0eff-4559-b599-6ab462cea81e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e752bf76-0eff-4559-b599-6ab462cea81e" (UID: "e752bf76-0eff-4559-b599-6ab462cea81e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.247457 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b9b7c8b58-lzkxg"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.249094 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.258110 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.261243 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.262337 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.262479 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k5jkj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.265960 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.267061 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.267219 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e752bf76-0eff-4559-b599-6ab462cea81e-config" (OuterVolumeSpecName: "config") pod "e752bf76-0eff-4559-b599-6ab462cea81e" (UID: "e752bf76-0eff-4559-b599-6ab462cea81e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.297716 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-combined-ca-bundle\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.302694 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54b5a9a1-f344-4b6c-9001-87271da23bdc-logs\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.302928 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-logs\") pod \"barbican-api-d77d6889d-9r858\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.303405 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-config-data-custom\") pod \"barbican-worker-67668d4bd9-69x7l\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.303654 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5qtt\" (UniqueName: \"kubernetes.io/projected/b215a7cd-f428-4fbd-adbc-307b6c905894-kube-api-access-p5qtt\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: 
\"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.303874 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-config-data\") pod \"barbican-worker-67668d4bd9-69x7l\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.304917 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-public-tls-certs\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.304997 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-config-data\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305057 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-config-data\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305088 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-credential-keys\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: 
\"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305120 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-public-tls-certs\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305147 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-internal-tls-certs\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305172 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-fernet-keys\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305190 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7dbca29-a7f8-4547-9deb-75ee69b773f9-logs\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305211 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-combined-ca-bundle\") pod \"barbican-api-d77d6889d-9r858\" (UID: 
\"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-combined-ca-bundle\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305268 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-config-data-custom\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305290 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-config-data-custom\") pod \"barbican-api-d77d6889d-9r858\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305351 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cf13bc8-7964-446c-8b42-f52e62da1ded-logs\") pod \"barbican-worker-67668d4bd9-69x7l\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305394 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r224v\" (UniqueName: \"kubernetes.io/projected/6cf13bc8-7964-446c-8b42-f52e62da1ded-kube-api-access-r224v\") pod 
\"barbican-worker-67668d4bd9-69x7l\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305427 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-config-data\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305452 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzksp\" (UniqueName: \"kubernetes.io/projected/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-kube-api-access-zzksp\") pod \"barbican-api-d77d6889d-9r858\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305490 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-internal-tls-certs\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305522 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-combined-ca-bundle\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305552 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-scripts\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305578 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-combined-ca-bundle\") pod \"barbican-worker-67668d4bd9-69x7l\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjrl2\" (UniqueName: \"kubernetes.io/projected/c7dbca29-a7f8-4547-9deb-75ee69b773f9-kube-api-access-jjrl2\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305620 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-scripts\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305684 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-config-data\") pod \"barbican-api-d77d6889d-9r858\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.305717 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k4bb\" (UniqueName: 
\"kubernetes.io/projected/54b5a9a1-f344-4b6c-9001-87271da23bdc-kube-api-access-8k4bb\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.304599 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54b5a9a1-f344-4b6c-9001-87271da23bdc-logs\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.306332 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-combined-ca-bundle\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.308621 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8j8c\" (UniqueName: \"kubernetes.io/projected/e752bf76-0eff-4559-b599-6ab462cea81e-kube-api-access-d8j8c\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.309263 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-config-data\") pod \"barbican-worker-67668d4bd9-69x7l\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.309500 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cf13bc8-7964-446c-8b42-f52e62da1ded-logs\") pod \"barbican-worker-67668d4bd9-69x7l\" (UID: 
\"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.310250 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e752bf76-0eff-4559-b599-6ab462cea81e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.310276 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e752bf76-0eff-4559-b599-6ab462cea81e-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.315092 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-config-data\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.320450 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-combined-ca-bundle\") pod \"barbican-worker-67668d4bd9-69x7l\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.320597 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-config-data-custom\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.327956 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-config-data-custom\") pod \"barbican-worker-67668d4bd9-69x7l\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.352736 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r224v\" (UniqueName: \"kubernetes.io/projected/6cf13bc8-7964-446c-8b42-f52e62da1ded-kube-api-access-r224v\") pod \"barbican-worker-67668d4bd9-69x7l\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.357203 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k4bb\" (UniqueName: \"kubernetes.io/projected/54b5a9a1-f344-4b6c-9001-87271da23bdc-kube-api-access-8k4bb\") pod \"barbican-keystone-listener-78f48bbdb8-86kwv\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.385316 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d77d6889d-9r858"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.393562 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.394113 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412126 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-config-data\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412167 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-credential-keys\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412187 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-public-tls-certs\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412204 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-internal-tls-certs\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412221 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-fernet-keys\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc 
kubenswrapper[4790]: I0406 12:18:53.412237 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7dbca29-a7f8-4547-9deb-75ee69b773f9-logs\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412253 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-combined-ca-bundle\") pod \"barbican-api-d77d6889d-9r858\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412270 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-combined-ca-bundle\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412290 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-config-data-custom\") pod \"barbican-api-d77d6889d-9r858\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412320 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzksp\" (UniqueName: \"kubernetes.io/projected/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-kube-api-access-zzksp\") pod \"barbican-api-d77d6889d-9r858\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412336 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-config-data\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412356 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-internal-tls-certs\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412370 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-combined-ca-bundle\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412388 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-scripts\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412407 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjrl2\" (UniqueName: \"kubernetes.io/projected/c7dbca29-a7f8-4547-9deb-75ee69b773f9-kube-api-access-jjrl2\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412421 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-scripts\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412453 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-config-data\") pod \"barbican-api-d77d6889d-9r858\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412482 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-logs\") pod \"barbican-api-d77d6889d-9r858\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412511 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5qtt\" (UniqueName: \"kubernetes.io/projected/b215a7cd-f428-4fbd-adbc-307b6c905894-kube-api-access-p5qtt\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.412534 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-public-tls-certs\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.419508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-config-data\") pod 
\"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.421462 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-public-tls-certs\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.424133 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-logs\") pod \"barbican-api-d77d6889d-9r858\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.432131 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-internal-tls-certs\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.433245 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-config-data\") pod \"barbican-api-d77d6889d-9r858\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.433624 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b9b7c8b58-lzkxg"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.433951 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-credential-keys\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.434001 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7dbca29-a7f8-4547-9deb-75ee69b773f9-logs\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.435258 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-public-tls-certs\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.435272 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-scripts\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.436615 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-combined-ca-bundle\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.437057 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzksp\" (UniqueName: \"kubernetes.io/projected/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-kube-api-access-zzksp\") pod \"barbican-api-d77d6889d-9r858\" (UID: 
\"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.437071 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-internal-tls-certs\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.438495 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-config-data-custom\") pod \"barbican-api-d77d6889d-9r858\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.449857 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-combined-ca-bundle\") pod \"barbican-api-d77d6889d-9r858\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.454437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjrl2\" (UniqueName: \"kubernetes.io/projected/c7dbca29-a7f8-4547-9deb-75ee69b773f9-kube-api-access-jjrl2\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.455741 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-scripts\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 
06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.457040 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5qtt\" (UniqueName: \"kubernetes.io/projected/b215a7cd-f428-4fbd-adbc-307b6c905894-kube-api-access-p5qtt\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.466176 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-config-data\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.478660 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b215a7cd-f428-4fbd-adbc-307b6c905894-fernet-keys\") pod \"keystone-b9b7c8b58-lzkxg\" (UID: \"b215a7cd-f428-4fbd-adbc-307b6c905894\") " pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.479097 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5f66dc7c8d-wd7td"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.492058 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-combined-ca-bundle\") pod \"placement-5f66dc7c8d-wd7td\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.561895 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.562518 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6d568b5b57-5x8cp"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.564178 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.574706 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7cbdd57f54-rd6dk"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.579134 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.583291 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.598582 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7cbdd57f54-rd6dk"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.598619 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d568b5b57-5x8cp"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.602906 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.610331 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b48866f46-rtgzj"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.612617 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.621971 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdh2t\" (UniqueName: \"kubernetes.io/projected/d431242a-f2f0-4780-85d0-9f2cfc8573ac-kube-api-access-hdh2t\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.622153 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d431242a-f2f0-4780-85d0-9f2cfc8573ac-logs\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.622265 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc26q\" (UniqueName: \"kubernetes.io/projected/2f7309c8-fdde-4e0e-9efa-ece286501ec5-kube-api-access-pc26q\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.622369 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d431242a-f2f0-4780-85d0-9f2cfc8573ac-config-data\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.622463 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2f7309c8-fdde-4e0e-9efa-ece286501ec5-config-data-custom\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.622594 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f7309c8-fdde-4e0e-9efa-ece286501ec5-config-data\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.622714 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d431242a-f2f0-4780-85d0-9f2cfc8573ac-combined-ca-bundle\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.622819 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f7309c8-fdde-4e0e-9efa-ece286501ec5-combined-ca-bundle\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.623138 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d431242a-f2f0-4780-85d0-9f2cfc8573ac-config-data-custom\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.623222 
4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f7309c8-fdde-4e0e-9efa-ece286501ec5-logs\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.647785 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b48866f46-rtgzj"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.667450 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wl5rj" event={"ID":"e752bf76-0eff-4559-b599-6ab462cea81e","Type":"ContainerDied","Data":"076d1c811a47ba793b763df05d98e515abe6005ebf3b28df185dce0f6b3928de"} Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.667696 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="076d1c811a47ba793b763df05d98e515abe6005ebf3b28df185dce0f6b3928de" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.667729 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wl5rj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.724776 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4szvf\" (UniqueName: \"kubernetes.io/projected/45e1b466-7e6f-40af-80ed-d3478421d0fe-kube-api-access-4szvf\") pod \"barbican-api-7b48866f46-rtgzj\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.724858 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f7309c8-fdde-4e0e-9efa-ece286501ec5-config-data\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.724911 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d431242a-f2f0-4780-85d0-9f2cfc8573ac-combined-ca-bundle\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.724952 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f7309c8-fdde-4e0e-9efa-ece286501ec5-combined-ca-bundle\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.724972 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-config-data-custom\") pod 
\"barbican-api-7b48866f46-rtgzj\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.725007 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-combined-ca-bundle\") pod \"barbican-api-7b48866f46-rtgzj\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.725035 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d431242a-f2f0-4780-85d0-9f2cfc8573ac-config-data-custom\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.725056 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f7309c8-fdde-4e0e-9efa-ece286501ec5-logs\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.725096 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdh2t\" (UniqueName: \"kubernetes.io/projected/d431242a-f2f0-4780-85d0-9f2cfc8573ac-kube-api-access-hdh2t\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.725129 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d431242a-f2f0-4780-85d0-9f2cfc8573ac-logs\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.725147 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45e1b466-7e6f-40af-80ed-d3478421d0fe-logs\") pod \"barbican-api-7b48866f46-rtgzj\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.725178 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-config-data\") pod \"barbican-api-7b48866f46-rtgzj\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.725209 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc26q\" (UniqueName: \"kubernetes.io/projected/2f7309c8-fdde-4e0e-9efa-ece286501ec5-kube-api-access-pc26q\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.725236 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d431242a-f2f0-4780-85d0-9f2cfc8573ac-config-data\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.725259 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f7309c8-fdde-4e0e-9efa-ece286501ec5-config-data-custom\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.729040 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f7309c8-fdde-4e0e-9efa-ece286501ec5-logs\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.732889 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d431242a-f2f0-4780-85d0-9f2cfc8573ac-logs\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.751956 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7bffc46f4d-tqbdl"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.753666 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d431242a-f2f0-4780-85d0-9f2cfc8573ac-config-data-custom\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.757008 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d431242a-f2f0-4780-85d0-9f2cfc8573ac-combined-ca-bundle\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " 
pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.757016 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f7309c8-fdde-4e0e-9efa-ece286501ec5-config-data\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.758045 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d431242a-f2f0-4780-85d0-9f2cfc8573ac-config-data\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.760730 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7bffc46f4d-tqbdl"] Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.760775 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.760791 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.761100 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.762060 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f7309c8-fdde-4e0e-9efa-ece286501ec5-combined-ca-bundle\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.762601 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdh2t\" (UniqueName: \"kubernetes.io/projected/d431242a-f2f0-4780-85d0-9f2cfc8573ac-kube-api-access-hdh2t\") pod \"barbican-keystone-listener-7cbdd57f54-rd6dk\" (UID: \"d431242a-f2f0-4780-85d0-9f2cfc8573ac\") " pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.763055 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f7309c8-fdde-4e0e-9efa-ece286501ec5-config-data-custom\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.803757 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc26q\" (UniqueName: \"kubernetes.io/projected/2f7309c8-fdde-4e0e-9efa-ece286501ec5-kube-api-access-pc26q\") pod \"barbican-worker-6d568b5b57-5x8cp\" (UID: \"2f7309c8-fdde-4e0e-9efa-ece286501ec5\") " pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.828496 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45e1b466-7e6f-40af-80ed-d3478421d0fe-logs\") pod \"barbican-api-7b48866f46-rtgzj\" (UID: 
\"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.828543 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-config-data\") pod \"barbican-api-7b48866f46-rtgzj\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.828598 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4szvf\" (UniqueName: \"kubernetes.io/projected/45e1b466-7e6f-40af-80ed-d3478421d0fe-kube-api-access-4szvf\") pod \"barbican-api-7b48866f46-rtgzj\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.828657 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-config-data-custom\") pod \"barbican-api-7b48866f46-rtgzj\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.828682 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-combined-ca-bundle\") pod \"barbican-api-7b48866f46-rtgzj\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.836106 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45e1b466-7e6f-40af-80ed-d3478421d0fe-logs\") pod \"barbican-api-7b48866f46-rtgzj\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " 
pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.844747 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-config-data\") pod \"barbican-api-7b48866f46-rtgzj\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.848191 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-config-data-custom\") pod \"barbican-api-7b48866f46-rtgzj\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.849378 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-combined-ca-bundle\") pod \"barbican-api-7b48866f46-rtgzj\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.876204 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4szvf\" (UniqueName: \"kubernetes.io/projected/45e1b466-7e6f-40af-80ed-d3478421d0fe-kube-api-access-4szvf\") pod \"barbican-api-7b48866f46-rtgzj\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.891417 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.931408 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-logs\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.931488 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-public-tls-certs\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.931574 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-internal-tls-certs\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.931615 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-combined-ca-bundle\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.931704 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-scripts\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.931737 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-config-data\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.931780 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk67l\" (UniqueName: \"kubernetes.io/projected/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-kube-api-access-zk67l\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.949538 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.956131 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d568b5b57-5x8cp" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.978966 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" Apr 06 12:18:53 crc kubenswrapper[4790]: I0406 12:18:53.988995 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79975678d5-blp5h"] Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.018472 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.033510 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-scripts\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.039658 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-config-data\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.039968 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk67l\" (UniqueName: \"kubernetes.io/projected/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-kube-api-access-zk67l\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.040390 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-logs\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.040532 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-public-tls-certs\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc 
kubenswrapper[4790]: I0406 12:18:54.037898 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79975678d5-blp5h"] Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.040821 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-internal-tls-certs\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.040937 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-combined-ca-bundle\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.041284 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-logs\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.057692 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-public-tls-certs\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.058813 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-combined-ca-bundle\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " 
pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.062574 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-config-data\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.069790 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-scripts\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.077946 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk67l\" (UniqueName: \"kubernetes.io/projected/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-kube-api-access-zk67l\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.082271 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5467468dd7-fc25g"] Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.083021 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54d90c86-6e3b-49d3-a50f-eefe94ef8d6d-internal-tls-certs\") pod \"placement-7bffc46f4d-tqbdl\" (UID: \"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d\") " pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.095169 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.110577 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.111468 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.181499 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.222183 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.223892 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.229420 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.229670 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.250917 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5467468dd7-fc25g"] Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.263634 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj9ln\" (UniqueName: \"kubernetes.io/projected/7a0f4edc-7cc8-4167-89fe-c9544592c705-kube-api-access-qj9ln\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.263792 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-ovsdbserver-nb\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.264084 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-ovsdbserver-sb\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.264113 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-dns-swift-storage-0\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.264136 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-config\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.264228 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-dns-svc\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.306363 4790 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5fb546dccb-rmb8d"] Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.310436 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.315624 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.315905 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.316113 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.316285 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v8r6x" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.317924 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.366116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-ovsdbserver-sb\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.366161 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-dns-swift-storage-0\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.366191 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-config\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.366239 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-dns-svc\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.366286 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj9ln\" (UniqueName: \"kubernetes.io/projected/7a0f4edc-7cc8-4167-89fe-c9544592c705-kube-api-access-qj9ln\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.366342 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-ovsdbserver-nb\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.367530 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-ovsdbserver-nb\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.368636 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fb546dccb-rmb8d"] Apr 06 
12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.369332 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-ovsdbserver-sb\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.376032 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-dns-svc\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.377461 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-config\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.380519 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-dns-swift-storage-0\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.391123 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.391885 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.400710 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj9ln\" (UniqueName: \"kubernetes.io/projected/7a0f4edc-7cc8-4167-89fe-c9544592c705-kube-api-access-qj9ln\") pod \"dnsmasq-dns-5467468dd7-fc25g\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.424484 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.445023 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.468697 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-config\") pod \"neutron-5fb546dccb-rmb8d\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.468787 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-ovndb-tls-certs\") pod \"neutron-5fb546dccb-rmb8d\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.468854 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pdjs\" (UniqueName: \"kubernetes.io/projected/aea04d0b-6f21-4671-9b77-bbbac755b073-kube-api-access-8pdjs\") pod \"neutron-5fb546dccb-rmb8d\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " 
pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.468973 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-combined-ca-bundle\") pod \"neutron-5fb546dccb-rmb8d\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.468991 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-httpd-config\") pod \"neutron-5fb546dccb-rmb8d\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.521013 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78f48bbdb8-86kwv"] Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.581137 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-combined-ca-bundle\") pod \"neutron-5fb546dccb-rmb8d\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.581195 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-httpd-config\") pod \"neutron-5fb546dccb-rmb8d\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.581223 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-config\") pod \"neutron-5fb546dccb-rmb8d\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.581266 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-ovndb-tls-certs\") pod \"neutron-5fb546dccb-rmb8d\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.581353 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pdjs\" (UniqueName: \"kubernetes.io/projected/aea04d0b-6f21-4671-9b77-bbbac755b073-kube-api-access-8pdjs\") pod \"neutron-5fb546dccb-rmb8d\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.585937 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-combined-ca-bundle\") pod \"neutron-5fb546dccb-rmb8d\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.590070 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-httpd-config\") pod \"neutron-5fb546dccb-rmb8d\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.590088 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-config\") pod \"neutron-5fb546dccb-rmb8d\" (UID: 
\"aea04d0b-6f21-4671-9b77-bbbac755b073\") " pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.591476 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-ovndb-tls-certs\") pod \"neutron-5fb546dccb-rmb8d\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: W0406 12:18:54.606713 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cf13bc8_7964_446c_8b42_f52e62da1ded.slice/crio-806288bce679a35ae5d8dae90f91929bc13e97a8b71eb078b250a8768b1cb64d WatchSource:0}: Error finding container 806288bce679a35ae5d8dae90f91929bc13e97a8b71eb078b250a8768b1cb64d: Status 404 returned error can't find the container with id 806288bce679a35ae5d8dae90f91929bc13e97a8b71eb078b250a8768b1cb64d Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.610028 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pdjs\" (UniqueName: \"kubernetes.io/projected/aea04d0b-6f21-4671-9b77-bbbac755b073-kube-api-access-8pdjs\") pod \"neutron-5fb546dccb-rmb8d\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.614878 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-67668d4bd9-69x7l"] Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.638694 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.700272 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" event={"ID":"54b5a9a1-f344-4b6c-9001-87271da23bdc","Type":"ContainerStarted","Data":"314ead17ba5edf2d8a905ff75d594dbb04b38d259150e7bdff190d26dcb2a9e2"} Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.709071 4790 generic.go:334] "Generic (PLEG): container finished" podID="0595f37c-6776-4055-833b-4d436c73604b" containerID="863050c8fb69a181a0a702f9c8239e08c9e35319e510db93acc7b6a08e460cfd" exitCode=0 Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.709169 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79975678d5-blp5h" event={"ID":"0595f37c-6776-4055-833b-4d436c73604b","Type":"ContainerDied","Data":"863050c8fb69a181a0a702f9c8239e08c9e35319e510db93acc7b6a08e460cfd"} Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.709199 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79975678d5-blp5h" event={"ID":"0595f37c-6776-4055-833b-4d436c73604b","Type":"ContainerStarted","Data":"dd09e5787ecc2c61b8d022a9b438cc0975bc83ed2795d6fc5de52901957291b1"} Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.727320 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67668d4bd9-69x7l" event={"ID":"6cf13bc8-7964-446c-8b42-f52e62da1ded","Type":"ContainerStarted","Data":"806288bce679a35ae5d8dae90f91929bc13e97a8b71eb078b250a8768b1cb64d"} Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.752737 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.752782 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Apr 06 12:18:54 crc kubenswrapper[4790]: 
I0406 12:18:54.752795 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.752805 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.752817 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.941477 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.964488 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Apr 06 12:18:54 crc kubenswrapper[4790]: I0406 12:18:54.991516 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.019341 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b9b7c8b58-lzkxg"] Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.053455 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d77d6889d-9r858"] Apr 06 12:18:55 crc kubenswrapper[4790]: W0406 12:18:55.074167 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18f1dd4f_19c2_4cdb_a40b_6e1295700cbd.slice/crio-485574a5d1c8d548c1e386acfa5d4a1c641a2c2bbac3063926d71e3ac2acb7f0 WatchSource:0}: Error finding container 485574a5d1c8d548c1e386acfa5d4a1c641a2c2bbac3063926d71e3ac2acb7f0: Status 404 returned error can't find the container with id 485574a5d1c8d548c1e386acfa5d4a1c641a2c2bbac3063926d71e3ac2acb7f0 Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.143947 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-5f66dc7c8d-wd7td"] Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.176672 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d568b5b57-5x8cp"] Apr 06 12:18:55 crc kubenswrapper[4790]: W0406 12:18:55.186457 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7dbca29_a7f8_4547_9deb_75ee69b773f9.slice/crio-8bee7bdfc53f332eb85abe56936d04fee68c34bcd67a1ed24cbb2f8ea6549835 WatchSource:0}: Error finding container 8bee7bdfc53f332eb85abe56936d04fee68c34bcd67a1ed24cbb2f8ea6549835: Status 404 returned error can't find the container with id 8bee7bdfc53f332eb85abe56936d04fee68c34bcd67a1ed24cbb2f8ea6549835 Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.442685 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b48866f46-rtgzj"] Apr 06 12:18:55 crc kubenswrapper[4790]: W0406 12:18:55.493427 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54d90c86_6e3b_49d3_a50f_eefe94ef8d6d.slice/crio-c894b3ba45e24532e3906d5f39fb10207668cc5b857c5710c213781d43925384 WatchSource:0}: Error finding container c894b3ba45e24532e3906d5f39fb10207668cc5b857c5710c213781d43925384: Status 404 returned error can't find the container with id c894b3ba45e24532e3906d5f39fb10207668cc5b857c5710c213781d43925384 Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.506901 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7cbdd57f54-rd6dk"] Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.531741 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.566920 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5467468dd7-fc25g"] Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.578583 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7bffc46f4d-tqbdl"] Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.634092 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-dns-svc\") pod \"0595f37c-6776-4055-833b-4d436c73604b\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.668960 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-config\") pod \"0595f37c-6776-4055-833b-4d436c73604b\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.669147 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94x5t\" (UniqueName: \"kubernetes.io/projected/0595f37c-6776-4055-833b-4d436c73604b-kube-api-access-94x5t\") pod \"0595f37c-6776-4055-833b-4d436c73604b\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.669180 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-ovsdbserver-nb\") pod \"0595f37c-6776-4055-833b-4d436c73604b\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.669209 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-ovsdbserver-sb\") pod \"0595f37c-6776-4055-833b-4d436c73604b\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.669244 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-dns-swift-storage-0\") pod \"0595f37c-6776-4055-833b-4d436c73604b\" (UID: \"0595f37c-6776-4055-833b-4d436c73604b\") " Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.747405 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0595f37c-6776-4055-833b-4d436c73604b-kube-api-access-94x5t" (OuterVolumeSpecName: "kube-api-access-94x5t") pod "0595f37c-6776-4055-833b-4d436c73604b" (UID: "0595f37c-6776-4055-833b-4d436c73604b"). InnerVolumeSpecName "kube-api-access-94x5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.773560 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94x5t\" (UniqueName: \"kubernetes.io/projected/0595f37c-6776-4055-833b-4d436c73604b-kube-api-access-94x5t\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:55 crc kubenswrapper[4790]: I0406 12:18:55.799623 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79975678d5-blp5h" Apr 06 12:18:55 crc kubenswrapper[4790]: W0406 12:18:55.913071 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaea04d0b_6f21_4671_9b77_bbbac755b073.slice/crio-1dc5444a9fd1d169a5cc3dd40949da1cf74aab60c15673b0c176c7a0527dc0e7 WatchSource:0}: Error finding container 1dc5444a9fd1d169a5cc3dd40949da1cf74aab60c15673b0c176c7a0527dc0e7: Status 404 returned error can't find the container with id 1dc5444a9fd1d169a5cc3dd40949da1cf74aab60c15673b0c176c7a0527dc0e7 Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.166215 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-config" (OuterVolumeSpecName: "config") pod "0595f37c-6776-4055-833b-4d436c73604b" (UID: "0595f37c-6776-4055-833b-4d436c73604b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.185768 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.206264 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0595f37c-6776-4055-833b-4d436c73604b" (UID: "0595f37c-6776-4055-833b-4d436c73604b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.209192 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0595f37c-6776-4055-833b-4d436c73604b" (UID: "0595f37c-6776-4055-833b-4d436c73604b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.237399 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0595f37c-6776-4055-833b-4d436c73604b" (UID: "0595f37c-6776-4055-833b-4d436c73604b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.247503 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0595f37c-6776-4055-833b-4d436c73604b" (UID: "0595f37c-6776-4055-833b-4d436c73604b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.290074 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.290104 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.290115 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.290123 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0595f37c-6776-4055-833b-4d436c73604b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.331730 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fb546dccb-rmb8d"] Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.331768 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d568b5b57-5x8cp" event={"ID":"2f7309c8-fdde-4e0e-9efa-ece286501ec5","Type":"ContainerStarted","Data":"15552c60a0c2164b9944e94d032707e21c3240c60a0325599d4b80cda63043b9"} Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.331791 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79975678d5-blp5h" event={"ID":"0595f37c-6776-4055-833b-4d436c73604b","Type":"ContainerDied","Data":"dd09e5787ecc2c61b8d022a9b438cc0975bc83ed2795d6fc5de52901957291b1"} Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.331816 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-b9b7c8b58-lzkxg" event={"ID":"b215a7cd-f428-4fbd-adbc-307b6c905894","Type":"ContainerStarted","Data":"9f800d58496976ccbd63610e9587f8f55f9ffaf34094319f4a88f911cff0469d"} Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.331848 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d77d6889d-9r858" event={"ID":"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd","Type":"ContainerStarted","Data":"ee0ea6ad6f87c64ba5d5a579008d6c0716c049896b7bf2515b612af015cc3caf"} Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.331862 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d77d6889d-9r858" event={"ID":"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd","Type":"ContainerStarted","Data":"485574a5d1c8d548c1e386acfa5d4a1c641a2c2bbac3063926d71e3ac2acb7f0"} Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.331873 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f66dc7c8d-wd7td" event={"ID":"c7dbca29-a7f8-4547-9deb-75ee69b773f9","Type":"ContainerStarted","Data":"8bee7bdfc53f332eb85abe56936d04fee68c34bcd67a1ed24cbb2f8ea6549835"} Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.331886 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" event={"ID":"d431242a-f2f0-4780-85d0-9f2cfc8573ac","Type":"ContainerStarted","Data":"ea8fc4ec063bb6d3838de7c108fd1f71d0782454ca7cb1ba679a01b5b6d376b5"} Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.331898 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b48866f46-rtgzj" event={"ID":"45e1b466-7e6f-40af-80ed-d3478421d0fe","Type":"ContainerStarted","Data":"5599451463fac6edda0061ea91449090b5f17308bb01079a76146c509cf2f0f7"} Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.331910 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5467468dd7-fc25g" 
event={"ID":"7a0f4edc-7cc8-4167-89fe-c9544592c705","Type":"ContainerStarted","Data":"5e0f6efdbf7b8b5153db4330793efde2b398a769c815f5b2f015063c3532f2b7"} Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.331922 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bffc46f4d-tqbdl" event={"ID":"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d","Type":"ContainerStarted","Data":"c894b3ba45e24532e3906d5f39fb10207668cc5b857c5710c213781d43925384"} Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.331941 4790 scope.go:117] "RemoveContainer" containerID="863050c8fb69a181a0a702f9c8239e08c9e35319e510db93acc7b6a08e460cfd" Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.932850 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79975678d5-blp5h"] Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.969192 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bffc46f4d-tqbdl" event={"ID":"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d","Type":"ContainerStarted","Data":"3ae09147dd87c3d1a2667ad75517ef850ecba9d7e10b97b1b070b3da4e69e5a8"} Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.975293 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b48866f46-rtgzj" event={"ID":"45e1b466-7e6f-40af-80ed-d3478421d0fe","Type":"ContainerStarted","Data":"316f0eba6f225afab2564dc4e79d6b6aa690f5809c54beed7d2c83fed038980b"} Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.981426 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5467468dd7-fc25g" event={"ID":"7a0f4edc-7cc8-4167-89fe-c9544592c705","Type":"ContainerStarted","Data":"14b0f24c7dd364f24a3f3bbec6126de30de8d35f083f68ccfe5c3ef8fa134e01"} Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.986791 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79975678d5-blp5h"] Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.991728 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d77d6889d-9r858" event={"ID":"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd","Type":"ContainerStarted","Data":"418ba81591a495b60dc3cffe4c24110b3654891a4fe9eef9f99cb01c0ef88217"} Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.993104 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.993136 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:18:56 crc kubenswrapper[4790]: I0406 12:18:56.997947 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fb546dccb-rmb8d" event={"ID":"aea04d0b-6f21-4671-9b77-bbbac755b073","Type":"ContainerStarted","Data":"1dc5444a9fd1d169a5cc3dd40949da1cf74aab60c15673b0c176c7a0527dc0e7"} Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.003919 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f66dc7c8d-wd7td" event={"ID":"c7dbca29-a7f8-4547-9deb-75ee69b773f9","Type":"ContainerStarted","Data":"71cf26499e4795975d1c826c473f3bed994cf84a0fb36371f111d2f1968b2a53"} Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.018058 4790 generic.go:334] "Generic (PLEG): container finished" podID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerID="05f826b4bcbbeefb4e6e03b3d3a348093c793ba2a4c816fcd856d15331df143f" exitCode=1 Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.018138 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"bbabffd5-cc02-4c40-adec-25c11548ed44","Type":"ContainerDied","Data":"05f826b4bcbbeefb4e6e03b3d3a348093c793ba2a4c816fcd856d15331df143f"} Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.018170 4790 scope.go:117] "RemoveContainer" containerID="e894173738cccc6cbb9d1d32f79ff33799febb4b16a5bb2a1692363023807b22" Apr 06 12:18:57 crc 
kubenswrapper[4790]: I0406 12:18:57.018495 4790 scope.go:117] "RemoveContainer" containerID="05f826b4bcbbeefb4e6e03b3d3a348093c793ba2a4c816fcd856d15331df143f" Apr 06 12:18:57 crc kubenswrapper[4790]: E0406 12:18:57.018704 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(bbabffd5-cc02-4c40-adec-25c11548ed44)\"" pod="openstack/watcher-decision-engine-0" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.044112 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.044136 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.045777 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b9b7c8b58-lzkxg" event={"ID":"b215a7cd-f428-4fbd-adbc-307b6c905894","Type":"ContainerStarted","Data":"4228b2220363390c6746c0239e7d807dd8532aa5a0bca5ad63987bcf3ac86811"} Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.045841 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.045850 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.046255 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.046989 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d77d6889d-9r858" podStartSLOduration=4.046967038 podStartE2EDuration="4.046967038s" podCreationTimestamp="2026-04-06 12:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:57.016042975 +0000 UTC m=+1316.003785831" watchObservedRunningTime="2026-04-06 12:18:57.046967038 +0000 UTC m=+1316.034709914" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.074791 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-b9b7c8b58-lzkxg" podStartSLOduration=4.074773148 podStartE2EDuration="4.074773148s" podCreationTimestamp="2026-04-06 12:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:57.067838511 +0000 UTC m=+1316.055581377" watchObservedRunningTime="2026-04-06 12:18:57.074773148 +0000 UTC m=+1316.062516014" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.479364 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bbf455d7c-c2ssf"] Apr 06 12:18:57 crc kubenswrapper[4790]: E0406 12:18:57.479966 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0595f37c-6776-4055-833b-4d436c73604b" containerName="init" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.479979 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0595f37c-6776-4055-833b-4d436c73604b" containerName="init" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.480167 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0595f37c-6776-4055-833b-4d436c73604b" containerName="init" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.481145 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bbf455d7c-c2ssf" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.493343 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.493555 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.498595 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bbf455d7c-c2ssf"] Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.637217 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-config\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.637281 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-httpd-config\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.637319 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-internal-tls-certs\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf" Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.637368 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbpc2\" (UniqueName: 
\"kubernetes.io/projected/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-kube-api-access-sbpc2\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.637460 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-public-tls-certs\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.637493 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-combined-ca-bundle\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.637535 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-ovndb-tls-certs\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.693768 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0595f37c-6776-4055-833b-4d436c73604b" path="/var/lib/kubelet/pods/0595f37c-6776-4055-833b-4d436c73604b/volumes"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.739334 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-public-tls-certs\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") "
pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.739396 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-combined-ca-bundle\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.739433 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-ovndb-tls-certs\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.739461 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-config\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.739497 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-httpd-config\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.739524 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-internal-tls-certs\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.739559 4790
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbpc2\" (UniqueName: \"kubernetes.io/projected/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-kube-api-access-sbpc2\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.746786 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-config\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.747450 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-public-tls-certs\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.748604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-httpd-config\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.748615 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-internal-tls-certs\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.750535 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-combined-ca-bundle\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.752342 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-ovndb-tls-certs\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.762065 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbpc2\" (UniqueName: \"kubernetes.io/projected/cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df-kube-api-access-sbpc2\") pod \"neutron-bbf455d7c-c2ssf\" (UID: \"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df\") " pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:57 crc kubenswrapper[4790]: I0406 12:18:57.827810 4790 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-bbf455d7c-c2ssf"
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.058391 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6crt4" event={"ID":"25adb66a-8db5-48e1-b05c-526008a22e4f","Type":"ContainerStarted","Data":"4816d53015881df073e4dc66cced460501d1ad4d92c83193e5131e1ec4d6aab1"}
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.106093 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b48866f46-rtgzj" event={"ID":"45e1b466-7e6f-40af-80ed-d3478421d0fe","Type":"ContainerStarted","Data":"e825dfb21063f98af99c111c756d4e94c2f904a21c5a6d6060690afb5e141452"}
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.106139 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b48866f46-rtgzj"
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.106169 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b48866f46-rtgzj"
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.123174 4790 generic.go:334] "Generic (PLEG): container finished" podID="7a0f4edc-7cc8-4167-89fe-c9544592c705" containerID="14b0f24c7dd364f24a3f3bbec6126de30de8d35f083f68ccfe5c3ef8fa134e01" exitCode=0
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.123289 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5467468dd7-fc25g" event={"ID":"7a0f4edc-7cc8-4167-89fe-c9544592c705","Type":"ContainerDied","Data":"14b0f24c7dd364f24a3f3bbec6126de30de8d35f083f68ccfe5c3ef8fa134e01"}
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.139301 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-6crt4" podStartSLOduration=4.361319893 podStartE2EDuration="39.139282113s" podCreationTimestamp="2026-04-06 12:18:19 +0000 UTC" firstStartedPulling="2026-04-06 12:18:21.172933302 +0000 UTC m=+1280.160676168"
lastFinishedPulling="2026-04-06 12:18:55.950895522 +0000 UTC m=+1314.938638388" observedRunningTime="2026-04-06 12:18:58.119229993 +0000 UTC m=+1317.106972859" watchObservedRunningTime="2026-04-06 12:18:58.139282113 +0000 UTC m=+1317.127024979"
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.141045 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fb546dccb-rmb8d" event={"ID":"aea04d0b-6f21-4671-9b77-bbbac755b073","Type":"ContainerStarted","Data":"0a6b614aceab75ca4aa13e13b8603b058f587a20aa7333996f589a8a588baf69"}
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.145242 4790 scope.go:117] "RemoveContainer" containerID="05f826b4bcbbeefb4e6e03b3d3a348093c793ba2a4c816fcd856d15331df143f"
Apr 06 12:18:58 crc kubenswrapper[4790]: E0406 12:18:58.145461 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(bbabffd5-cc02-4c40-adec-25c11548ed44)\"" pod="openstack/watcher-decision-engine-0" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44"
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.214586 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.214938 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.214608 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b48866f46-rtgzj" podStartSLOduration=5.214590683 podStartE2EDuration="5.214590683s" podCreationTimestamp="2026-04-06 12:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:58.172281863 +0000 UTC m=+1317.160024729"
watchObservedRunningTime="2026-04-06 12:18:58.214590683 +0000 UTC m=+1317.202333549"
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.228436 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.626468 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.626561 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 06 12:18:58 crc kubenswrapper[4790]: I0406 12:18:58.633089 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Apr 06 12:18:59 crc kubenswrapper[4790]: I0406 12:18:59.183753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f66dc7c8d-wd7td" event={"ID":"c7dbca29-a7f8-4547-9deb-75ee69b773f9","Type":"ContainerStarted","Data":"7ace91d4c78e3cc9f5a933f0cec9481841467304fd8f1ec7a052d9065229b960"}
Apr 06 12:18:59 crc kubenswrapper[4790]: I0406 12:18:59.186205 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f66dc7c8d-wd7td"
Apr 06 12:18:59 crc kubenswrapper[4790]: I0406 12:18:59.186229 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f66dc7c8d-wd7td"
Apr 06 12:18:59 crc kubenswrapper[4790]: I0406 12:18:59.219622 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5f66dc7c8d-wd7td" podStartSLOduration=6.219603804 podStartE2EDuration="6.219603804s" podCreationTimestamp="2026-04-06 12:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:18:59.213289294 +0000 UTC m=+1318.201032160" watchObservedRunningTime="2026-04-06 12:18:59.219603804 +0000 UTC
m=+1318.207346670"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.154682 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b48866f46-rtgzj"]
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.196902 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-57899578c6-rh848"]
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.198863 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.201750 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.206682 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.207898 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-combined-ca-bundle\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.207928 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-public-tls-certs\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.207948 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-config-data-custom\") pod
\"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.207970 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fpz6\" (UniqueName: \"kubernetes.io/projected/e2ee43de-a608-4710-a58d-60d49845cb7c-kube-api-access-8fpz6\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.208005 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-config-data\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.208114 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-internal-tls-certs\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.208136 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2ee43de-a608-4710-a58d-60d49845cb7c-logs\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.212453 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57899578c6-rh848"]
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.310137 4790
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-internal-tls-certs\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.310183 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2ee43de-a608-4710-a58d-60d49845cb7c-logs\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.310251 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-combined-ca-bundle\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.310271 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-public-tls-certs\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.310288 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-config-data-custom\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.310309 4790 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-8fpz6\" (UniqueName: \"kubernetes.io/projected/e2ee43de-a608-4710-a58d-60d49845cb7c-kube-api-access-8fpz6\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.310335 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-config-data\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.311476 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2ee43de-a608-4710-a58d-60d49845cb7c-logs\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.316248 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-combined-ca-bundle\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.317399 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-config-data\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.318052 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName:
\"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-internal-tls-certs\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.318878 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-public-tls-certs\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.333510 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2ee43de-a608-4710-a58d-60d49845cb7c-config-data-custom\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.350299 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fpz6\" (UniqueName: \"kubernetes.io/projected/e2ee43de-a608-4710-a58d-60d49845cb7c-kube-api-access-8fpz6\") pod \"barbican-api-57899578c6-rh848\" (UID: \"e2ee43de-a608-4710-a58d-60d49845cb7c\") " pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:00 crc kubenswrapper[4790]: I0406 12:19:00.521552 4790 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/barbican-api-57899578c6-rh848"
Apr 06 12:19:01 crc kubenswrapper[4790]: I0406 12:19:01.207850 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b48866f46-rtgzj" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api-log" containerID="cri-o://316f0eba6f225afab2564dc4e79d6b6aa690f5809c54beed7d2c83fed038980b" gracePeriod=30
Apr 06 12:19:01 crc kubenswrapper[4790]: I0406 12:19:01.208692 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b48866f46-rtgzj" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api" containerID="cri-o://e825dfb21063f98af99c111c756d4e94c2f904a21c5a6d6060690afb5e141452" gracePeriod=30
Apr 06 12:19:01 crc kubenswrapper[4790]: I0406 12:19:01.220107 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b48866f46-rtgzj" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.192:9311/healthcheck\": EOF"
Apr 06 12:19:01 crc kubenswrapper[4790]: I0406 12:19:01.245992 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b48866f46-rtgzj" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.192:9311/healthcheck\": read tcp 10.217.0.2:54456->10.217.0.192:9311: read: connection reset by peer"
Apr 06 12:19:01 crc kubenswrapper[4790]: I0406 12:19:01.246135 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7b48866f46-rtgzj" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.192:9311/healthcheck\": EOF"
Apr 06 12:19:02 crc kubenswrapper[4790]: I0406 12:19:02.219668 4790 generic.go:334] "Generic (PLEG): container finished"
podID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerID="316f0eba6f225afab2564dc4e79d6b6aa690f5809c54beed7d2c83fed038980b" exitCode=143
Apr 06 12:19:02 crc kubenswrapper[4790]: I0406 12:19:02.219767 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b48866f46-rtgzj" event={"ID":"45e1b466-7e6f-40af-80ed-d3478421d0fe","Type":"ContainerDied","Data":"316f0eba6f225afab2564dc4e79d6b6aa690f5809c54beed7d2c83fed038980b"}
Apr 06 12:19:02 crc kubenswrapper[4790]: I0406 12:19:02.570006 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f66dc7c8d-wd7td"
Apr 06 12:19:03 crc kubenswrapper[4790]: I0406 12:19:03.888700 4790 scope.go:117] "RemoveContainer" containerID="f4856bc9c4f4020355729a7616098568851b3bec940fb783dbbbd66baf4c462b"
Apr 06 12:19:04 crc kubenswrapper[4790]: I0406 12:19:04.020159 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b48866f46-rtgzj" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.192:9311/healthcheck\": dial tcp 10.217.0.192:9311: connect: connection refused"
Apr 06 12:19:04 crc kubenswrapper[4790]: I0406 12:19:04.020414 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b48866f46-rtgzj" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.192:9311/healthcheck\": dial tcp 10.217.0.192:9311: connect: connection refused"
Apr 06 12:19:04 crc kubenswrapper[4790]: I0406 12:19:04.171017 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Apr 06 12:19:04 crc kubenswrapper[4790]: I0406 12:19:04.171449 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerName="watcher-api-log"
containerID="cri-o://83288a99ab095dbe7b467050cf92654092fe49129052c4e8984bfc355883af90" gracePeriod=30
Apr 06 12:19:04 crc kubenswrapper[4790]: I0406 12:19:04.175371 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerName="watcher-api" containerID="cri-o://c448c3fb360b405254cdd2a776c58a63e6c644902dc0011b721e71be59960e46" gracePeriod=30
Apr 06 12:19:04 crc kubenswrapper[4790]: I0406 12:19:04.211084 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.181:9322/\": EOF"
Apr 06 12:19:04 crc kubenswrapper[4790]: I0406 12:19:04.211435 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.181:9322/\": EOF"
Apr 06 12:19:04 crc kubenswrapper[4790]: I0406 12:19:04.279416 4790 generic.go:334] "Generic (PLEG): container finished" podID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerID="e825dfb21063f98af99c111c756d4e94c2f904a21c5a6d6060690afb5e141452" exitCode=0
Apr 06 12:19:04 crc kubenswrapper[4790]: I0406 12:19:04.279458 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b48866f46-rtgzj" event={"ID":"45e1b466-7e6f-40af-80ed-d3478421d0fe","Type":"ContainerDied","Data":"e825dfb21063f98af99c111c756d4e94c2f904a21c5a6d6060690afb5e141452"}
Apr 06 12:19:04 crc kubenswrapper[4790]: I0406 12:19:04.318958 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Apr 06 12:19:04 crc kubenswrapper[4790]: I0406 12:19:04.319673 4790 scope.go:117] "RemoveContainer" containerID="05f826b4bcbbeefb4e6e03b3d3a348093c793ba2a4c816fcd856d15331df143f"
Apr 06 12:19:04 crc
kubenswrapper[4790]: E0406 12:19:04.319977 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(bbabffd5-cc02-4c40-adec-25c11548ed44)\"" pod="openstack/watcher-decision-engine-0" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44"
Apr 06 12:19:05 crc kubenswrapper[4790]: I0406 12:19:05.293143 4790 generic.go:334] "Generic (PLEG): container finished" podID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerID="c448c3fb360b405254cdd2a776c58a63e6c644902dc0011b721e71be59960e46" exitCode=0
Apr 06 12:19:05 crc kubenswrapper[4790]: I0406 12:19:05.293171 4790 generic.go:334] "Generic (PLEG): container finished" podID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerID="83288a99ab095dbe7b467050cf92654092fe49129052c4e8984bfc355883af90" exitCode=143
Apr 06 12:19:05 crc kubenswrapper[4790]: I0406 12:19:05.293189 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60","Type":"ContainerDied","Data":"c448c3fb360b405254cdd2a776c58a63e6c644902dc0011b721e71be59960e46"}
Apr 06 12:19:05 crc kubenswrapper[4790]: I0406 12:19:05.293212 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60","Type":"ContainerDied","Data":"83288a99ab095dbe7b467050cf92654092fe49129052c4e8984bfc355883af90"}
Apr 06 12:19:05 crc kubenswrapper[4790]: I0406 12:19:05.357095 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d77d6889d-9r858"
Apr 06 12:19:05 crc kubenswrapper[4790]: I0406 12:19:05.560455 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d77d6889d-9r858"
Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.346024 4790 generic.go:334] "Generic (PLEG):
container finished" podID="25adb66a-8db5-48e1-b05c-526008a22e4f" containerID="4816d53015881df073e4dc66cced460501d1ad4d92c83193e5131e1ec4d6aab1" exitCode=0
Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.346091 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6crt4" event={"ID":"25adb66a-8db5-48e1-b05c-526008a22e4f","Type":"ContainerDied","Data":"4816d53015881df073e4dc66cced460501d1ad4d92c83193e5131e1ec4d6aab1"}
Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.510237 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.603047 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-custom-prometheus-ca\") pod \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") "
Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.603095 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-combined-ca-bundle\") pod \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") "
Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.603274 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-config-data\") pod \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") "
Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.603338 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwbwr\" (UniqueName: \"kubernetes.io/projected/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-kube-api-access-dwbwr\") pod
\"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.603417 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-logs\") pod \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\" (UID: \"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60\") " Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.604233 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-logs" (OuterVolumeSpecName: "logs") pod "d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" (UID: "d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.608970 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-kube-api-access-dwbwr" (OuterVolumeSpecName: "kube-api-access-dwbwr") pod "d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" (UID: "d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60"). InnerVolumeSpecName "kube-api-access-dwbwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.631736 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" (UID: "d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.683771 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" (UID: "d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.699564 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-config-data" (OuterVolumeSpecName: "config-data") pod "d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" (UID: "d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.707127 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwbwr\" (UniqueName: \"kubernetes.io/projected/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-kube-api-access-dwbwr\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.707154 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.707193 4790 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.707204 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.707212 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:09 crc kubenswrapper[4790]: I0406 12:19:09.788426 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bbf455d7c-c2ssf"] Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.148017 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:19:10 crc kubenswrapper[4790]: W0406 12:19:10.152877 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf1c464e_5f26_4b8f_a4a0_8555a4c8c7df.slice/crio-6754bf89a2869a6b95ea953f2fbb6a5b191576a32e69916c0abf65c79aea4e05 WatchSource:0}: Error finding container 6754bf89a2869a6b95ea953f2fbb6a5b191576a32e69916c0abf65c79aea4e05: Status 404 returned error can't find the container with id 6754bf89a2869a6b95ea953f2fbb6a5b191576a32e69916c0abf65c79aea4e05 Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.220297 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45e1b466-7e6f-40af-80ed-d3478421d0fe-logs\") pod \"45e1b466-7e6f-40af-80ed-d3478421d0fe\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.220454 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4szvf\" (UniqueName: \"kubernetes.io/projected/45e1b466-7e6f-40af-80ed-d3478421d0fe-kube-api-access-4szvf\") pod \"45e1b466-7e6f-40af-80ed-d3478421d0fe\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.220551 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-config-data\") pod \"45e1b466-7e6f-40af-80ed-d3478421d0fe\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.220649 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-combined-ca-bundle\") pod \"45e1b466-7e6f-40af-80ed-d3478421d0fe\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.220732 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-config-data-custom\") pod \"45e1b466-7e6f-40af-80ed-d3478421d0fe\" (UID: \"45e1b466-7e6f-40af-80ed-d3478421d0fe\") " Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.221253 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e1b466-7e6f-40af-80ed-d3478421d0fe-logs" (OuterVolumeSpecName: "logs") pod "45e1b466-7e6f-40af-80ed-d3478421d0fe" (UID: "45e1b466-7e6f-40af-80ed-d3478421d0fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.221626 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45e1b466-7e6f-40af-80ed-d3478421d0fe-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.225748 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e1b466-7e6f-40af-80ed-d3478421d0fe-kube-api-access-4szvf" (OuterVolumeSpecName: "kube-api-access-4szvf") pod "45e1b466-7e6f-40af-80ed-d3478421d0fe" (UID: "45e1b466-7e6f-40af-80ed-d3478421d0fe"). 
InnerVolumeSpecName "kube-api-access-4szvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.226322 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "45e1b466-7e6f-40af-80ed-d3478421d0fe" (UID: "45e1b466-7e6f-40af-80ed-d3478421d0fe"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.282126 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45e1b466-7e6f-40af-80ed-d3478421d0fe" (UID: "45e1b466-7e6f-40af-80ed-d3478421d0fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.324852 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4szvf\" (UniqueName: \"kubernetes.io/projected/45e1b466-7e6f-40af-80ed-d3478421d0fe-kube-api-access-4szvf\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.324879 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.324889 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-config-data-custom\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.371376 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b48866f46-rtgzj" 
event={"ID":"45e1b466-7e6f-40af-80ed-d3478421d0fe","Type":"ContainerDied","Data":"5599451463fac6edda0061ea91449090b5f17308bb01079a76146c509cf2f0f7"} Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.371601 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b48866f46-rtgzj" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.371430 4790 scope.go:117] "RemoveContainer" containerID="e825dfb21063f98af99c111c756d4e94c2f904a21c5a6d6060690afb5e141452" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.374746 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bbf455d7c-c2ssf" event={"ID":"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df","Type":"ContainerStarted","Data":"6754bf89a2869a6b95ea953f2fbb6a5b191576a32e69916c0abf65c79aea4e05"} Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.378940 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.379005 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60","Type":"ContainerDied","Data":"bd7e2e816f7f7d7700c0256e1213b1291499a151ee074b2625ed01ab8b6511df"} Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.404281 4790 scope.go:117] "RemoveContainer" containerID="316f0eba6f225afab2564dc4e79d6b6aa690f5809c54beed7d2c83fed038980b" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.568973 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-config-data" (OuterVolumeSpecName: "config-data") pod "45e1b466-7e6f-40af-80ed-d3478421d0fe" (UID: "45e1b466-7e6f-40af-80ed-d3478421d0fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.631962 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45e1b466-7e6f-40af-80ed-d3478421d0fe-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.643979 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57899578c6-rh848"] Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.813794 4790 scope.go:117] "RemoveContainer" containerID="c448c3fb360b405254cdd2a776c58a63e6c644902dc0011b721e71be59960e46" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.975861 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6crt4" Apr 06 12:19:10 crc kubenswrapper[4790]: I0406 12:19:10.987533 4790 scope.go:117] "RemoveContainer" containerID="83288a99ab095dbe7b467050cf92654092fe49129052c4e8984bfc355883af90" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.040083 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25adb66a-8db5-48e1-b05c-526008a22e4f-etc-machine-id\") pod \"25adb66a-8db5-48e1-b05c-526008a22e4f\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.040207 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-scripts\") pod \"25adb66a-8db5-48e1-b05c-526008a22e4f\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.040214 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25adb66a-8db5-48e1-b05c-526008a22e4f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"25adb66a-8db5-48e1-b05c-526008a22e4f" (UID: "25adb66a-8db5-48e1-b05c-526008a22e4f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.040241 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-combined-ca-bundle\") pod \"25adb66a-8db5-48e1-b05c-526008a22e4f\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.040354 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfsn4\" (UniqueName: \"kubernetes.io/projected/25adb66a-8db5-48e1-b05c-526008a22e4f-kube-api-access-dfsn4\") pod \"25adb66a-8db5-48e1-b05c-526008a22e4f\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.040440 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-config-data\") pod \"25adb66a-8db5-48e1-b05c-526008a22e4f\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.040497 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-db-sync-config-data\") pod \"25adb66a-8db5-48e1-b05c-526008a22e4f\" (UID: \"25adb66a-8db5-48e1-b05c-526008a22e4f\") " Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.041078 4790 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25adb66a-8db5-48e1-b05c-526008a22e4f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.055771 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-api-7b48866f46-rtgzj"] Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.078233 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "25adb66a-8db5-48e1-b05c-526008a22e4f" (UID: "25adb66a-8db5-48e1-b05c-526008a22e4f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.085586 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-scripts" (OuterVolumeSpecName: "scripts") pod "25adb66a-8db5-48e1-b05c-526008a22e4f" (UID: "25adb66a-8db5-48e1-b05c-526008a22e4f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.087692 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7b48866f46-rtgzj"] Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.102991 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.105119 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25adb66a-8db5-48e1-b05c-526008a22e4f-kube-api-access-dfsn4" (OuterVolumeSpecName: "kube-api-access-dfsn4") pod "25adb66a-8db5-48e1-b05c-526008a22e4f" (UID: "25adb66a-8db5-48e1-b05c-526008a22e4f"). InnerVolumeSpecName "kube-api-access-dfsn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.115794 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.133307 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Apr 06 12:19:11 crc kubenswrapper[4790]: E0406 12:19:11.133942 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api-log" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.134093 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api-log" Apr 06 12:19:11 crc kubenswrapper[4790]: E0406 12:19:11.134283 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.134388 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api" Apr 06 12:19:11 crc kubenswrapper[4790]: E0406 12:19:11.134979 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerName="watcher-api" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.135007 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerName="watcher-api" Apr 06 12:19:11 crc kubenswrapper[4790]: E0406 12:19:11.135021 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25adb66a-8db5-48e1-b05c-526008a22e4f" containerName="cinder-db-sync" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.135027 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="25adb66a-8db5-48e1-b05c-526008a22e4f" containerName="cinder-db-sync" Apr 06 12:19:11 crc kubenswrapper[4790]: E0406 12:19:11.135067 4790 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerName="watcher-api-log" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.135074 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerName="watcher-api-log" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.135408 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerName="watcher-api-log" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.135420 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerName="watcher-api" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.135437 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.135444 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="25adb66a-8db5-48e1-b05c-526008a22e4f" containerName="cinder-db-sync" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.135461 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api-log" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.136508 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.139396 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.139444 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.139909 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.143468 4790 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.143497 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.143506 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfsn4\" (UniqueName: \"kubernetes.io/projected/25adb66a-8db5-48e1-b05c-526008a22e4f-kube-api-access-dfsn4\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.144994 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.244843 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.244896 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t56hp\" (UniqueName: \"kubernetes.io/projected/e95ab128-e1ba-4110-8a41-cf5975e3655d-kube-api-access-t56hp\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.244952 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-public-tls-certs\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.244975 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ab128-e1ba-4110-8a41-cf5975e3655d-logs\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.245206 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.245285 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.245429 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-config-data\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.307260 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25adb66a-8db5-48e1-b05c-526008a22e4f" (UID: "25adb66a-8db5-48e1-b05c-526008a22e4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.335318 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-config-data" (OuterVolumeSpecName: "config-data") pod "25adb66a-8db5-48e1-b05c-526008a22e4f" (UID: "25adb66a-8db5-48e1-b05c-526008a22e4f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.347266 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.347308 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.347365 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-config-data\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.347448 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.347472 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t56hp\" (UniqueName: \"kubernetes.io/projected/e95ab128-e1ba-4110-8a41-cf5975e3655d-kube-api-access-t56hp\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.347523 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-public-tls-certs\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.347546 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ab128-e1ba-4110-8a41-cf5975e3655d-logs\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.347619 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.347635 4790 reconciler_common.go:293] "Volume detached 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25adb66a-8db5-48e1-b05c-526008a22e4f-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.347974 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ab128-e1ba-4110-8a41-cf5975e3655d-logs\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.352766 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-config-data\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.353278 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.354559 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.357815 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-public-tls-certs\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.362024 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.366445 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t56hp\" (UniqueName: \"kubernetes.io/projected/e95ab128-e1ba-4110-8a41-cf5975e3655d-kube-api-access-t56hp\") pod \"watcher-api-0\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.426302 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fb546dccb-rmb8d" event={"ID":"aea04d0b-6f21-4671-9b77-bbbac755b073","Type":"ContainerStarted","Data":"cb530553644b13d5011b4a5a55ffce32c787f90749f35bd6a2daf24ac75b4554"} Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.426783 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.437014 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6crt4" event={"ID":"25adb66a-8db5-48e1-b05c-526008a22e4f","Type":"ContainerDied","Data":"a88185b5a36429d57d750c4cc14b9f2e26f1f1d6db726497ed3b32cd109e2174"} Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.437089 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a88185b5a36429d57d750c4cc14b9f2e26f1f1d6db726497ed3b32cd109e2174" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.437163 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6crt4" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.448181 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d568b5b57-5x8cp" event={"ID":"2f7309c8-fdde-4e0e-9efa-ece286501ec5","Type":"ContainerStarted","Data":"6c882666faadb41e825afc033f2f52bd9bfb1c0d35c8fec3470cfc70586a4750"} Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.460666 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.472218 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" event={"ID":"54b5a9a1-f344-4b6c-9001-87271da23bdc","Type":"ContainerStarted","Data":"02a0112f07904cd9f6767f4f9bc3aa190cdca19a0241653b6e970c64ef219fdb"} Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.505442 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67668d4bd9-69x7l" event={"ID":"6cf13bc8-7964-446c-8b42-f52e62da1ded","Type":"ContainerStarted","Data":"03da9387e88a7918c3ac8e4fa6b38006695512afd8a8ff0033b81c2461d5234b"} Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.523026 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bffc46f4d-tqbdl" event={"ID":"54d90c86-6e3b-49d3-a50f-eefe94ef8d6d","Type":"ContainerStarted","Data":"fc0fae07a1373599e4d9f99a84bc20f73c5efa858bd4cd29483e25ed626f1c3c"} Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.524319 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.524803 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.536682 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-57899578c6-rh848" event={"ID":"e2ee43de-a608-4710-a58d-60d49845cb7c","Type":"ContainerStarted","Data":"83ded059fb287a15c0c45c071f8678aade12149ffc0563a6d5aaeb285ef9c5f8"} Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.536733 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57899578c6-rh848" event={"ID":"e2ee43de-a608-4710-a58d-60d49845cb7c","Type":"ContainerStarted","Data":"c643969cd0bf868ca7e26872db3784e2255553ce666a2c86ce5bd12e643912b3"} Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.566731 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5fb546dccb-rmb8d" podStartSLOduration=17.566710513 podStartE2EDuration="17.566710513s" podCreationTimestamp="2026-04-06 12:18:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:19:11.46420862 +0000 UTC m=+1330.451951496" watchObservedRunningTime="2026-04-06 12:19:11.566710513 +0000 UTC m=+1330.554453379" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.584845 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5467468dd7-fc25g" event={"ID":"7a0f4edc-7cc8-4167-89fe-c9544592c705","Type":"ContainerStarted","Data":"6286d7f9a98fe9e44a73b160fc60fe2e600b8fbe86fd38835e05d254ae9a7274"} Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.585074 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.609222 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bbf455d7c-c2ssf" event={"ID":"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df","Type":"ContainerStarted","Data":"4d9277170c3fee716d3a366ebef5a49cbabd5b49e4add20d11a5a49a4f1136af"} Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.613593 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-scheduler-0"] Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.621433 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.630241 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.630453 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.630522 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.630580 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-q84z7" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.634196 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" event={"ID":"d431242a-f2f0-4780-85d0-9f2cfc8573ac","Type":"ContainerStarted","Data":"13fdbe63a892210ca2d28f478f5cddb542522945a2d85b72b46776aee3fdfa93"} Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.637639 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7bffc46f4d-tqbdl" podStartSLOduration=18.637618283 podStartE2EDuration="18.637618283s" podCreationTimestamp="2026-04-06 12:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:19:11.568629124 +0000 UTC m=+1330.556371990" watchObservedRunningTime="2026-04-06 12:19:11.637618283 +0000 UTC m=+1330.625361149" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.669866 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-scripts\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.669926 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.669980 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.670102 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7462838c-5785-4c22-8fb2-e8b9304d1769-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.670192 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-config-data\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.670210 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvsxk\" (UniqueName: 
\"kubernetes.io/projected/7462838c-5785-4c22-8fb2-e8b9304d1769-kube-api-access-vvsxk\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.678889 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.687895 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5467468dd7-fc25g" podStartSLOduration=18.687874738 podStartE2EDuration="18.687874738s" podCreationTimestamp="2026-04-06 12:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:19:11.616090103 +0000 UTC m=+1330.603832989" watchObservedRunningTime="2026-04-06 12:19:11.687874738 +0000 UTC m=+1330.675617604" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.734735 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="ceilometer-central-agent" containerID="cri-o://e3ec0447b2418d52ffc1d0c8e34fae15b1ddc5bb33bceb5270c52ad589189298" gracePeriod=30 Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.734896 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="proxy-httpd" containerID="cri-o://7694f5f4d951af8d19f92e41ac690e50b8ba8ec8718a7569743a01e1a7011bfc" gracePeriod=30 Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.734933 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="sg-core" containerID="cri-o://c9aa44481bb148c149e9f2ec055ff375c2817cad5ce4db45971c08d7612d5b6c" gracePeriod=30 Apr 06 12:19:11 crc 
kubenswrapper[4790]: I0406 12:19:11.734960 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="ceilometer-notification-agent" containerID="cri-o://eb3ed014da04a04e2d7acb6c57264f86b56aac0d3e2c77e1a3d9f84516c80a34" gracePeriod=30 Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.797236 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" path="/var/lib/kubelet/pods/45e1b466-7e6f-40af-80ed-d3478421d0fe/volumes" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.797917 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" path="/var/lib/kubelet/pods/d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60/volumes" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.798495 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.798518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa0dc22e-42cc-4916-a598-99d4e2c99ae7","Type":"ContainerStarted","Data":"7694f5f4d951af8d19f92e41ac690e50b8ba8ec8718a7569743a01e1a7011bfc"} Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.820093 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7462838c-5785-4c22-8fb2-e8b9304d1769-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.820189 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-config-data\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " 
pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.820207 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvsxk\" (UniqueName: \"kubernetes.io/projected/7462838c-5785-4c22-8fb2-e8b9304d1769-kube-api-access-vvsxk\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.820262 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-scripts\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.820282 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.820306 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.824060 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7462838c-5785-4c22-8fb2-e8b9304d1769-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.854499 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.865422 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5467468dd7-fc25g"] Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.882506 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-scripts\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.883646 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.885771 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-config-data\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:11 crc kubenswrapper[4790]: I0406 12:19:11.903588 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvsxk\" (UniqueName: \"kubernetes.io/projected/7462838c-5785-4c22-8fb2-e8b9304d1769-kube-api-access-vvsxk\") pod \"cinder-scheduler-0\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.024645 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.160175 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8f5f7879-l4wlk"] Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.167572 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.255114 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8f5f7879-l4wlk"] Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.256605 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-dns-swift-storage-0\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.256735 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-dns-svc\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.256775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-config\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.257046 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-ovsdbserver-nb\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.257091 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsmg4\" (UniqueName: \"kubernetes.io/projected/26a2b37d-3798-4197-b112-822d6b7e3f5b-kube-api-access-bsmg4\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.257141 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-ovsdbserver-sb\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.309733 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.195500111 podStartE2EDuration="53.309715414s" podCreationTimestamp="2026-04-06 12:18:19 +0000 UTC" firstStartedPulling="2026-04-06 12:18:21.17854796 +0000 UTC m=+1280.166290816" lastFinishedPulling="2026-04-06 12:19:10.292763253 +0000 UTC m=+1329.280506119" observedRunningTime="2026-04-06 12:19:11.981554451 +0000 UTC m=+1330.969297317" watchObservedRunningTime="2026-04-06 12:19:12.309715414 +0000 UTC m=+1331.297458280" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.341104 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.347695 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.353111 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.361033 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-ovsdbserver-sb\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.361119 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-dns-swift-storage-0\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.361272 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-dns-svc\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.361308 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-config\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.361389 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-ovsdbserver-nb\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.361447 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsmg4\" (UniqueName: \"kubernetes.io/projected/26a2b37d-3798-4197-b112-822d6b7e3f5b-kube-api-access-bsmg4\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.361722 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-ovsdbserver-sb\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.362410 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-dns-svc\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.362779 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-config\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.363079 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-dns-swift-storage-0\") pod 
\"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.363951 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-ovsdbserver-nb\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.367439 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.449561 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsmg4\" (UniqueName: \"kubernetes.io/projected/26a2b37d-3798-4197-b112-822d6b7e3f5b-kube-api-access-bsmg4\") pod \"dnsmasq-dns-b8f5f7879-l4wlk\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") " pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.473058 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-config-data\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.473097 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e618a3e1-5205-4db2-b17d-be3a2aef40b0-logs\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.473119 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-scripts\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.473146 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.473164 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-config-data-custom\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.473198 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hxhm\" (UniqueName: \"kubernetes.io/projected/e618a3e1-5205-4db2-b17d-be3a2aef40b0-kube-api-access-5hxhm\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:12 crc kubenswrapper[4790]: I0406 12:19:12.473214 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e618a3e1-5205-4db2-b17d-be3a2aef40b0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.541562 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.574494 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hxhm\" (UniqueName: \"kubernetes.io/projected/e618a3e1-5205-4db2-b17d-be3a2aef40b0-kube-api-access-5hxhm\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.574528 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e618a3e1-5205-4db2-b17d-be3a2aef40b0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.575063 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-config-data\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.575087 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e618a3e1-5205-4db2-b17d-be3a2aef40b0-logs\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.575110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-scripts\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.575141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.575158 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-config-data-custom\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.579023 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e618a3e1-5205-4db2-b17d-be3a2aef40b0-logs\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.579193 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e618a3e1-5205-4db2-b17d-be3a2aef40b0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.584496 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.584649 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-scripts\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.585802 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-config-data\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.596376 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-config-data-custom\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.598301 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hxhm\" (UniqueName: \"kubernetes.io/projected/e618a3e1-5205-4db2-b17d-be3a2aef40b0-kube-api-access-5hxhm\") pod \"cinder-api-0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.606094 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.880303 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Apr 06 12:19:13 crc kubenswrapper[4790]: E0406 12:19:12.902994 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa0dc22e_42cc_4916_a598_99d4e2c99ae7.slice/crio-7694f5f4d951af8d19f92e41ac690e50b8ba8ec8718a7569743a01e1a7011bfc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa0dc22e_42cc_4916_a598_99d4e2c99ae7.slice/crio-conmon-7694f5f4d951af8d19f92e41ac690e50b8ba8ec8718a7569743a01e1a7011bfc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa0dc22e_42cc_4916_a598_99d4e2c99ae7.slice/crio-e3ec0447b2418d52ffc1d0c8e34fae15b1ddc5bb33bceb5270c52ad589189298.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa0dc22e_42cc_4916_a598_99d4e2c99ae7.slice/crio-conmon-e3ec0447b2418d52ffc1d0c8e34fae15b1ddc5bb33bceb5270c52ad589189298.scope\": RecentStats: unable to find data in memory cache]" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.944669 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"e95ab128-e1ba-4110-8a41-cf5975e3655d","Type":"ContainerStarted","Data":"55931a5f66d163a71d214d04d4a1a799488e45d8e8f8ce56a01f943850e38dd4"} Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.950210 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerID="7694f5f4d951af8d19f92e41ac690e50b8ba8ec8718a7569743a01e1a7011bfc" exitCode=0 Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.950240 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" 
containerID="c9aa44481bb148c149e9f2ec055ff375c2817cad5ce4db45971c08d7612d5b6c" exitCode=2 Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.950248 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerID="e3ec0447b2418d52ffc1d0c8e34fae15b1ddc5bb33bceb5270c52ad589189298" exitCode=0 Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.950310 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa0dc22e-42cc-4916-a598-99d4e2c99ae7","Type":"ContainerDied","Data":"7694f5f4d951af8d19f92e41ac690e50b8ba8ec8718a7569743a01e1a7011bfc"} Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.950382 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa0dc22e-42cc-4916-a598-99d4e2c99ae7","Type":"ContainerDied","Data":"c9aa44481bb148c149e9f2ec055ff375c2817cad5ce4db45971c08d7612d5b6c"} Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.950396 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa0dc22e-42cc-4916-a598-99d4e2c99ae7","Type":"ContainerDied","Data":"e3ec0447b2418d52ffc1d0c8e34fae15b1ddc5bb33bceb5270c52ad589189298"} Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.951746 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d568b5b57-5x8cp" event={"ID":"2f7309c8-fdde-4e0e-9efa-ece286501ec5","Type":"ContainerStarted","Data":"2af25f0374ce1e133fbe93f8675fce9f93b93ca64beadd5edb78946500e29ec2"} Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.954654 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57899578c6-rh848" event={"ID":"e2ee43de-a608-4710-a58d-60d49845cb7c","Type":"ContainerStarted","Data":"67e112231f934b56db15e657ce1a37209569161399ad9d10818511a01e1a6fda"} Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.955243 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/barbican-api-57899578c6-rh848" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.955314 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57899578c6-rh848" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.961697 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67668d4bd9-69x7l" event={"ID":"6cf13bc8-7964-446c-8b42-f52e62da1ded","Type":"ContainerStarted","Data":"6496814d6c8758255405629491043096ea09af755f8e76c5cf49c3d71cc5926e"} Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.970713 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6d568b5b57-5x8cp" podStartSLOduration=7.977816002 podStartE2EDuration="19.970692345s" podCreationTimestamp="2026-04-06 12:18:53 +0000 UTC" firstStartedPulling="2026-04-06 12:18:55.19453171 +0000 UTC m=+1314.182274576" lastFinishedPulling="2026-04-06 12:19:07.187408043 +0000 UTC m=+1326.175150919" observedRunningTime="2026-04-06 12:19:12.970555461 +0000 UTC m=+1331.958298327" watchObservedRunningTime="2026-04-06 12:19:12.970692345 +0000 UTC m=+1331.958435211" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.975576 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bbf455d7c-c2ssf" event={"ID":"cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df","Type":"ContainerStarted","Data":"94e990fa67dbe4074ba35802a7fa93e09a0200c9d8d6e976b587134f3c496220"} Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:12.976449 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-bbf455d7c-c2ssf" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:13.013201 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-67668d4bd9-69x7l"] Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:13.017651 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-worker-67668d4bd9-69x7l" podStartSLOduration=8.515032943 podStartE2EDuration="21.01762915s" podCreationTimestamp="2026-04-06 12:18:52 +0000 UTC" firstStartedPulling="2026-04-06 12:18:54.614342127 +0000 UTC m=+1313.602084993" lastFinishedPulling="2026-04-06 12:19:07.116938334 +0000 UTC m=+1326.104681200" observedRunningTime="2026-04-06 12:19:13.003663683 +0000 UTC m=+1331.991406549" watchObservedRunningTime="2026-04-06 12:19:13.01762915 +0000 UTC m=+1332.005372016" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:13.049135 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" podStartSLOduration=8.435981002 podStartE2EDuration="21.049112588s" podCreationTimestamp="2026-04-06 12:18:52 +0000 UTC" firstStartedPulling="2026-04-06 12:18:54.535503782 +0000 UTC m=+1313.523246648" lastFinishedPulling="2026-04-06 12:19:07.148635368 +0000 UTC m=+1326.136378234" observedRunningTime="2026-04-06 12:19:13.027158767 +0000 UTC m=+1332.014901633" watchObservedRunningTime="2026-04-06 12:19:13.049112588 +0000 UTC m=+1332.036855454" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:13.080288 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-57899578c6-rh848" podStartSLOduration=13.080265468 podStartE2EDuration="13.080265468s" podCreationTimestamp="2026-04-06 12:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:19:13.075239672 +0000 UTC m=+1332.062982538" watchObservedRunningTime="2026-04-06 12:19:13.080265468 +0000 UTC m=+1332.068008334" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:13.587991 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:13.614987 4790 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/neutron-bbf455d7c-c2ssf" podStartSLOduration=16.614970527 podStartE2EDuration="16.614970527s" podCreationTimestamp="2026-04-06 12:18:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:19:13.106027502 +0000 UTC m=+1332.093770368" watchObservedRunningTime="2026-04-06 12:19:13.614970527 +0000 UTC m=+1332.602713383" Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:13.841004 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8f5f7879-l4wlk"] Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:13.858859 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Apr 06 12:19:13 crc kubenswrapper[4790]: I0406 12:19:13.877330 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.017036 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" event={"ID":"d431242a-f2f0-4780-85d0-9f2cfc8573ac","Type":"ContainerStarted","Data":"a35eaf59c2b422bc5500cc9cb09f51309fc1bb4f3b8a3e48d7f822f74d2d1862"} Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.023936 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b48866f46-rtgzj" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.192:9311/healthcheck\": dial tcp 10.217.0.192:9311: i/o timeout (Client.Timeout exceeded while awaiting headers)" Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.023940 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b48866f46-rtgzj" podUID="45e1b466-7e6f-40af-80ed-d3478421d0fe" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.192:9311/healthcheck\": dial tcp 
10.217.0.192:9311: i/o timeout (Client.Timeout exceeded while awaiting headers)" Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.030113 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" event={"ID":"26a2b37d-3798-4197-b112-822d6b7e3f5b","Type":"ContainerStarted","Data":"ececf77662fc6c033168d6bbc78e65b17f662b84c780e12d10ede5b325dcbc61"} Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.033848 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" event={"ID":"54b5a9a1-f344-4b6c-9001-87271da23bdc","Type":"ContainerStarted","Data":"d26bed5812c09c9a350077d55cf0de483779ac0c690af0e50b9abaf87889478d"} Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.035581 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7462838c-5785-4c22-8fb2-e8b9304d1769","Type":"ContainerStarted","Data":"ffdfdde3e64681b14fad6230b2c1fb8f5497f0a5bb65f561ff0e5fffad9b35e8"} Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.036632 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e618a3e1-5205-4db2-b17d-be3a2aef40b0","Type":"ContainerStarted","Data":"1b2285a3a80ee49fec32a8ad0133af357b09090b866746f21b3997cab05ef521"} Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.041419 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"e95ab128-e1ba-4110-8a41-cf5975e3655d","Type":"ContainerStarted","Data":"d645708a60606639082464824ac049745ed311edacd34096b43dbb1aeacada33"} Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.041547 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5467468dd7-fc25g" podUID="7a0f4edc-7cc8-4167-89fe-c9544592c705" containerName="dnsmasq-dns" containerID="cri-o://6286d7f9a98fe9e44a73b160fc60fe2e600b8fbe86fd38835e05d254ae9a7274" gracePeriod=10 Apr 06 
12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.041642 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.041661 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"e95ab128-e1ba-4110-8a41-cf5975e3655d","Type":"ContainerStarted","Data":"a4e7b78989ccf895b8d0eb92bad17ac9d8dfaa46bb07c48104540a7db8c71e8f"} Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.061781 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7cbdd57f54-rd6dk" podStartSLOduration=6.359512644 podStartE2EDuration="21.061756006s" podCreationTimestamp="2026-04-06 12:18:53 +0000 UTC" firstStartedPulling="2026-04-06 12:18:55.510655699 +0000 UTC m=+1314.498398565" lastFinishedPulling="2026-04-06 12:19:10.212899061 +0000 UTC m=+1329.200641927" observedRunningTime="2026-04-06 12:19:14.053160245 +0000 UTC m=+1333.040903111" watchObservedRunningTime="2026-04-06 12:19:14.061756006 +0000 UTC m=+1333.049498872" Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.102717 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-78f48bbdb8-86kwv"] Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.102847 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.102813863 podStartE2EDuration="3.102813863s" podCreationTimestamp="2026-04-06 12:19:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:19:14.096641806 +0000 UTC m=+1333.084384672" watchObservedRunningTime="2026-04-06 12:19:14.102813863 +0000 UTC m=+1333.090556729" Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.181335 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" 
podUID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.181:9322/\": dial tcp 10.217.0.181:9322: i/o timeout (Client.Timeout exceeded while awaiting headers)" Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.181784 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d8ea7168-ba0e-4800-8ddc-a1cfa16bfd60" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.181:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.307145 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.320002 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.320800 4790 scope.go:117] "RemoveContainer" containerID="05f826b4bcbbeefb4e6e03b3d3a348093c793ba2a4c816fcd856d15331df143f" Apr 06 12:19:14 crc kubenswrapper[4790]: I0406 12:19:14.320993 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.027536 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.099152 4790 generic.go:334] "Generic (PLEG): container finished" podID="7a0f4edc-7cc8-4167-89fe-c9544592c705" containerID="6286d7f9a98fe9e44a73b160fc60fe2e600b8fbe86fd38835e05d254ae9a7274" exitCode=0 Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.099233 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5467468dd7-fc25g" event={"ID":"7a0f4edc-7cc8-4167-89fe-c9544592c705","Type":"ContainerDied","Data":"6286d7f9a98fe9e44a73b160fc60fe2e600b8fbe86fd38835e05d254ae9a7274"} Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.099267 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5467468dd7-fc25g" event={"ID":"7a0f4edc-7cc8-4167-89fe-c9544592c705","Type":"ContainerDied","Data":"5e0f6efdbf7b8b5153db4330793efde2b398a769c815f5b2f015063c3532f2b7"} Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.099287 4790 scope.go:117] "RemoveContainer" containerID="6286d7f9a98fe9e44a73b160fc60fe2e600b8fbe86fd38835e05d254ae9a7274" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.099452 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5467468dd7-fc25g" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.104297 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-dns-svc\") pod \"7a0f4edc-7cc8-4167-89fe-c9544592c705\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.104411 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-ovsdbserver-sb\") pod \"7a0f4edc-7cc8-4167-89fe-c9544592c705\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.104457 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj9ln\" (UniqueName: \"kubernetes.io/projected/7a0f4edc-7cc8-4167-89fe-c9544592c705-kube-api-access-qj9ln\") pod \"7a0f4edc-7cc8-4167-89fe-c9544592c705\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.104479 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-ovsdbserver-nb\") pod \"7a0f4edc-7cc8-4167-89fe-c9544592c705\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.104513 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-config\") pod \"7a0f4edc-7cc8-4167-89fe-c9544592c705\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.104572 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-dns-swift-storage-0\") pod \"7a0f4edc-7cc8-4167-89fe-c9544592c705\" (UID: \"7a0f4edc-7cc8-4167-89fe-c9544592c705\") " Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.109991 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"bbabffd5-cc02-4c40-adec-25c11548ed44","Type":"ContainerStarted","Data":"625c69110057092c75a5b9f7a24594d2a63dfa0d5d8dd00096e932f7bcea9149"} Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.111618 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-67668d4bd9-69x7l" podUID="6cf13bc8-7964-446c-8b42-f52e62da1ded" containerName="barbican-worker-log" containerID="cri-o://03da9387e88a7918c3ac8e4fa6b38006695512afd8a8ff0033b81c2461d5234b" gracePeriod=30 Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.114245 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-67668d4bd9-69x7l" podUID="6cf13bc8-7964-446c-8b42-f52e62da1ded" containerName="barbican-worker" containerID="cri-o://6496814d6c8758255405629491043096ea09af755f8e76c5cf49c3d71cc5926e" gracePeriod=30 Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.117496 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" podUID="54b5a9a1-f344-4b6c-9001-87271da23bdc" containerName="barbican-keystone-listener-log" containerID="cri-o://02a0112f07904cd9f6767f4f9bc3aa190cdca19a0241653b6e970c64ef219fdb" gracePeriod=30 Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.117622 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" podUID="54b5a9a1-f344-4b6c-9001-87271da23bdc" containerName="barbican-keystone-listener" 
containerID="cri-o://d26bed5812c09c9a350077d55cf0de483779ac0c690af0e50b9abaf87889478d" gracePeriod=30 Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.129053 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0f4edc-7cc8-4167-89fe-c9544592c705-kube-api-access-qj9ln" (OuterVolumeSpecName: "kube-api-access-qj9ln") pod "7a0f4edc-7cc8-4167-89fe-c9544592c705" (UID: "7a0f4edc-7cc8-4167-89fe-c9544592c705"). InnerVolumeSpecName "kube-api-access-qj9ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.204092 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7a0f4edc-7cc8-4167-89fe-c9544592c705" (UID: "7a0f4edc-7cc8-4167-89fe-c9544592c705"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.208626 4790 scope.go:117] "RemoveContainer" containerID="14b0f24c7dd364f24a3f3bbec6126de30de8d35f083f68ccfe5c3ef8fa134e01" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.209015 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-config" (OuterVolumeSpecName: "config") pod "7a0f4edc-7cc8-4167-89fe-c9544592c705" (UID: "7a0f4edc-7cc8-4167-89fe-c9544592c705"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.225031 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj9ln\" (UniqueName: \"kubernetes.io/projected/7a0f4edc-7cc8-4167-89fe-c9544592c705-kube-api-access-qj9ln\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.225079 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.225091 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.250111 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a0f4edc-7cc8-4167-89fe-c9544592c705" (UID: "7a0f4edc-7cc8-4167-89fe-c9544592c705"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.260952 4790 scope.go:117] "RemoveContainer" containerID="6286d7f9a98fe9e44a73b160fc60fe2e600b8fbe86fd38835e05d254ae9a7274" Apr 06 12:19:15 crc kubenswrapper[4790]: E0406 12:19:15.268932 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6286d7f9a98fe9e44a73b160fc60fe2e600b8fbe86fd38835e05d254ae9a7274\": container with ID starting with 6286d7f9a98fe9e44a73b160fc60fe2e600b8fbe86fd38835e05d254ae9a7274 not found: ID does not exist" containerID="6286d7f9a98fe9e44a73b160fc60fe2e600b8fbe86fd38835e05d254ae9a7274" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.268987 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6286d7f9a98fe9e44a73b160fc60fe2e600b8fbe86fd38835e05d254ae9a7274"} err="failed to get container status \"6286d7f9a98fe9e44a73b160fc60fe2e600b8fbe86fd38835e05d254ae9a7274\": rpc error: code = NotFound desc = could not find container \"6286d7f9a98fe9e44a73b160fc60fe2e600b8fbe86fd38835e05d254ae9a7274\": container with ID starting with 6286d7f9a98fe9e44a73b160fc60fe2e600b8fbe86fd38835e05d254ae9a7274 not found: ID does not exist" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.269019 4790 scope.go:117] "RemoveContainer" containerID="14b0f24c7dd364f24a3f3bbec6126de30de8d35f083f68ccfe5c3ef8fa134e01" Apr 06 12:19:15 crc kubenswrapper[4790]: E0406 12:19:15.272657 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14b0f24c7dd364f24a3f3bbec6126de30de8d35f083f68ccfe5c3ef8fa134e01\": container with ID starting with 14b0f24c7dd364f24a3f3bbec6126de30de8d35f083f68ccfe5c3ef8fa134e01 not found: ID does not exist" containerID="14b0f24c7dd364f24a3f3bbec6126de30de8d35f083f68ccfe5c3ef8fa134e01" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.272712 
4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14b0f24c7dd364f24a3f3bbec6126de30de8d35f083f68ccfe5c3ef8fa134e01"} err="failed to get container status \"14b0f24c7dd364f24a3f3bbec6126de30de8d35f083f68ccfe5c3ef8fa134e01\": rpc error: code = NotFound desc = could not find container \"14b0f24c7dd364f24a3f3bbec6126de30de8d35f083f68ccfe5c3ef8fa134e01\": container with ID starting with 14b0f24c7dd364f24a3f3bbec6126de30de8d35f083f68ccfe5c3ef8fa134e01 not found: ID does not exist" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.276704 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a0f4edc-7cc8-4167-89fe-c9544592c705" (UID: "7a0f4edc-7cc8-4167-89fe-c9544592c705"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.323423 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a0f4edc-7cc8-4167-89fe-c9544592c705" (UID: "7a0f4edc-7cc8-4167-89fe-c9544592c705"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.326333 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.326363 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.326372 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a0f4edc-7cc8-4167-89fe-c9544592c705-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.522474 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5467468dd7-fc25g"] Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.534198 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5467468dd7-fc25g"] Apr 06 12:19:15 crc kubenswrapper[4790]: I0406 12:19:15.691040 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a0f4edc-7cc8-4167-89fe-c9544592c705" path="/var/lib/kubelet/pods/7a0f4edc-7cc8-4167-89fe-c9544592c705/volumes" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.134480 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7462838c-5785-4c22-8fb2-e8b9304d1769","Type":"ContainerStarted","Data":"9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507"} Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.143442 4790 generic.go:334] "Generic (PLEG): container finished" podID="6cf13bc8-7964-446c-8b42-f52e62da1ded" containerID="03da9387e88a7918c3ac8e4fa6b38006695512afd8a8ff0033b81c2461d5234b" exitCode=143 Apr 06 
12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.143569 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67668d4bd9-69x7l" event={"ID":"6cf13bc8-7964-446c-8b42-f52e62da1ded","Type":"ContainerDied","Data":"03da9387e88a7918c3ac8e4fa6b38006695512afd8a8ff0033b81c2461d5234b"} Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.159175 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e618a3e1-5205-4db2-b17d-be3a2aef40b0","Type":"ContainerStarted","Data":"9e89a517fad6a1ad63c7dbd097971d804ed449a2eca199b523901869b192e05a"} Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.167893 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerID="eb3ed014da04a04e2d7acb6c57264f86b56aac0d3e2c77e1a3d9f84516c80a34" exitCode=0 Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.167964 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa0dc22e-42cc-4916-a598-99d4e2c99ae7","Type":"ContainerDied","Data":"eb3ed014da04a04e2d7acb6c57264f86b56aac0d3e2c77e1a3d9f84516c80a34"} Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.173041 4790 generic.go:334] "Generic (PLEG): container finished" podID="26a2b37d-3798-4197-b112-822d6b7e3f5b" containerID="c1290455893fd32e8bad27fb8c94a1dbde6e53b2f454831fce3d068048166ded" exitCode=0 Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.173106 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" event={"ID":"26a2b37d-3798-4197-b112-822d6b7e3f5b","Type":"ContainerDied","Data":"c1290455893fd32e8bad27fb8c94a1dbde6e53b2f454831fce3d068048166ded"} Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.226291 4790 generic.go:334] "Generic (PLEG): container finished" podID="54b5a9a1-f344-4b6c-9001-87271da23bdc" containerID="02a0112f07904cd9f6767f4f9bc3aa190cdca19a0241653b6e970c64ef219fdb" exitCode=143 Apr 06 
12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.226410 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" event={"ID":"54b5a9a1-f344-4b6c-9001-87271da23bdc","Type":"ContainerDied","Data":"02a0112f07904cd9f6767f4f9bc3aa190cdca19a0241653b6e970c64ef219fdb"} Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.252046 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.465140 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.572515 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.690456 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-run-httpd\") pod \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.690547 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-config-data\") pod \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.690610 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-combined-ca-bundle\") pod \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.690685 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-log-httpd\") pod \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.690711 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-sg-core-conf-yaml\") pod \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.690747 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftx5g\" (UniqueName: \"kubernetes.io/projected/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-kube-api-access-ftx5g\") pod \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.690874 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fa0dc22e-42cc-4916-a598-99d4e2c99ae7" (UID: "fa0dc22e-42cc-4916-a598-99d4e2c99ae7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.690955 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-scripts\") pod \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\" (UID: \"fa0dc22e-42cc-4916-a598-99d4e2c99ae7\") " Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.691128 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fa0dc22e-42cc-4916-a598-99d4e2c99ae7" (UID: "fa0dc22e-42cc-4916-a598-99d4e2c99ae7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.692035 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-run-httpd\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.692064 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-log-httpd\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.700123 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-scripts" (OuterVolumeSpecName: "scripts") pod "fa0dc22e-42cc-4916-a598-99d4e2c99ae7" (UID: "fa0dc22e-42cc-4916-a598-99d4e2c99ae7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.716746 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7bffc46f4d-tqbdl" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.719187 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-kube-api-access-ftx5g" (OuterVolumeSpecName: "kube-api-access-ftx5g") pod "fa0dc22e-42cc-4916-a598-99d4e2c99ae7" (UID: "fa0dc22e-42cc-4916-a598-99d4e2c99ae7"). InnerVolumeSpecName "kube-api-access-ftx5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.805323 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.805361 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftx5g\" (UniqueName: \"kubernetes.io/projected/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-kube-api-access-ftx5g\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.841714 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5f66dc7c8d-wd7td"] Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.842212 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5f66dc7c8d-wd7td" podUID="c7dbca29-a7f8-4547-9deb-75ee69b773f9" containerName="placement-log" containerID="cri-o://71cf26499e4795975d1c826c473f3bed994cf84a0fb36371f111d2f1968b2a53" gracePeriod=30 Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.842911 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5f66dc7c8d-wd7td" podUID="c7dbca29-a7f8-4547-9deb-75ee69b773f9" 
containerName="placement-api" containerID="cri-o://7ace91d4c78e3cc9f5a933f0cec9481841467304fd8f1ec7a052d9065229b960" gracePeriod=30 Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.844523 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fa0dc22e-42cc-4916-a598-99d4e2c99ae7" (UID: "fa0dc22e-42cc-4916-a598-99d4e2c99ae7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.850200 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-5f66dc7c8d-wd7td" podUID="c7dbca29-a7f8-4547-9deb-75ee69b773f9" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.189:8778/\": read tcp 10.217.0.2:57016->10.217.0.189:8778: read: connection reset by peer" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.931719 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:16 crc kubenswrapper[4790]: I0406 12:19:16.939299 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa0dc22e-42cc-4916-a598-99d4e2c99ae7" (UID: "fa0dc22e-42cc-4916-a598-99d4e2c99ae7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.008380 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-config-data" (OuterVolumeSpecName: "config-data") pod "fa0dc22e-42cc-4916-a598-99d4e2c99ae7" (UID: "fa0dc22e-42cc-4916-a598-99d4e2c99ae7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.034223 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.034250 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa0dc22e-42cc-4916-a598-99d4e2c99ae7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.262235 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa0dc22e-42cc-4916-a598-99d4e2c99ae7","Type":"ContainerDied","Data":"e0a118a29d6048e9af5f3a073b228080ade80a4fd81a341f1422aba7c748a02a"} Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.262544 4790 scope.go:117] "RemoveContainer" containerID="7694f5f4d951af8d19f92e41ac690e50b8ba8ec8718a7569743a01e1a7011bfc" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.262689 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.274351 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" event={"ID":"26a2b37d-3798-4197-b112-822d6b7e3f5b","Type":"ContainerStarted","Data":"b14a9d2e3ec91ebd1630ec0c769361b2e4018a864b3249c674142b8785d39eeb"} Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.274468 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.284212 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7462838c-5785-4c22-8fb2-e8b9304d1769","Type":"ContainerStarted","Data":"99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474"} Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.296490 4790 generic.go:334] "Generic (PLEG): container finished" podID="c7dbca29-a7f8-4547-9deb-75ee69b773f9" containerID="71cf26499e4795975d1c826c473f3bed994cf84a0fb36371f111d2f1968b2a53" exitCode=143 Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.296605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f66dc7c8d-wd7td" event={"ID":"c7dbca29-a7f8-4547-9deb-75ee69b773f9","Type":"ContainerDied","Data":"71cf26499e4795975d1c826c473f3bed994cf84a0fb36371f111d2f1968b2a53"} Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.300040 4790 scope.go:117] "RemoveContainer" containerID="c9aa44481bb148c149e9f2ec055ff375c2817cad5ce4db45971c08d7612d5b6c" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.329689 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.330299 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e618a3e1-5205-4db2-b17d-be3a2aef40b0" containerName="cinder-api-log" 
containerID="cri-o://9e89a517fad6a1ad63c7dbd097971d804ed449a2eca199b523901869b192e05a" gracePeriod=30 Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.330518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e618a3e1-5205-4db2-b17d-be3a2aef40b0","Type":"ContainerStarted","Data":"f497981f5f78f4487377b1be69218abab9886d404beac3e406a239f2fbb71703"} Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.330553 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.330581 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e618a3e1-5205-4db2-b17d-be3a2aef40b0" containerName="cinder-api" containerID="cri-o://f497981f5f78f4487377b1be69218abab9886d404beac3e406a239f2fbb71703" gracePeriod=30 Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.339290 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" podStartSLOduration=6.339272986 podStartE2EDuration="6.339272986s" podCreationTimestamp="2026-04-06 12:19:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:19:17.315108645 +0000 UTC m=+1336.302851511" watchObservedRunningTime="2026-04-06 12:19:17.339272986 +0000 UTC m=+1336.327015852" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.379490 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.401809 4790 scope.go:117] "RemoveContainer" containerID="eb3ed014da04a04e2d7acb6c57264f86b56aac0d3e2c77e1a3d9f84516c80a34" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.411479 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 
12:19:17.435329 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.996954612 podStartE2EDuration="6.435308044s" podCreationTimestamp="2026-04-06 12:19:11 +0000 UTC" firstStartedPulling="2026-04-06 12:19:13.884534131 +0000 UTC m=+1332.872276997" lastFinishedPulling="2026-04-06 12:19:14.322887563 +0000 UTC m=+1333.310630429" observedRunningTime="2026-04-06 12:19:17.387421503 +0000 UTC m=+1336.375164379" watchObservedRunningTime="2026-04-06 12:19:17.435308044 +0000 UTC m=+1336.423050910" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.437387 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:17 crc kubenswrapper[4790]: E0406 12:19:17.437884 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="sg-core" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.437897 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="sg-core" Apr 06 12:19:17 crc kubenswrapper[4790]: E0406 12:19:17.437906 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0f4edc-7cc8-4167-89fe-c9544592c705" containerName="dnsmasq-dns" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.437914 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0f4edc-7cc8-4167-89fe-c9544592c705" containerName="dnsmasq-dns" Apr 06 12:19:17 crc kubenswrapper[4790]: E0406 12:19:17.437926 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="proxy-httpd" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.437933 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="proxy-httpd" Apr 06 12:19:17 crc kubenswrapper[4790]: E0406 12:19:17.437960 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="ceilometer-notification-agent" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.437966 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="ceilometer-notification-agent" Apr 06 12:19:17 crc kubenswrapper[4790]: E0406 12:19:17.437978 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="ceilometer-central-agent" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.437983 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="ceilometer-central-agent" Apr 06 12:19:17 crc kubenswrapper[4790]: E0406 12:19:17.437994 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0f4edc-7cc8-4167-89fe-c9544592c705" containerName="init" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.438000 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0f4edc-7cc8-4167-89fe-c9544592c705" containerName="init" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.438181 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="proxy-httpd" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.438210 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="ceilometer-central-agent" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.438220 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="ceilometer-notification-agent" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.438230 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" containerName="sg-core" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.438246 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0f4edc-7cc8-4167-89fe-c9544592c705" containerName="dnsmasq-dns" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.440231 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.444341 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.445869 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.470936 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.470917154 podStartE2EDuration="5.470917154s" podCreationTimestamp="2026-04-06 12:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:19:17.455000275 +0000 UTC m=+1336.442743141" watchObservedRunningTime="2026-04-06 12:19:17.470917154 +0000 UTC m=+1336.458660020" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.471971 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.545900 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.545982 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fbbdadb-2035-4420-945a-a18b0294a8da-run-httpd\") pod \"ceilometer-0\" 
(UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.546048 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-config-data\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.546070 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fbbdadb-2035-4420-945a-a18b0294a8da-log-httpd\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.546097 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.546119 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tx2f\" (UniqueName: \"kubernetes.io/projected/5fbbdadb-2035-4420-945a-a18b0294a8da-kube-api-access-5tx2f\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.546135 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-scripts\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 
12:19:17.566563 4790 scope.go:117] "RemoveContainer" containerID="e3ec0447b2418d52ffc1d0c8e34fae15b1ddc5bb33bceb5270c52ad589189298" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.649276 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.649344 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tx2f\" (UniqueName: \"kubernetes.io/projected/5fbbdadb-2035-4420-945a-a18b0294a8da-kube-api-access-5tx2f\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.649379 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-scripts\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.649443 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.649520 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fbbdadb-2035-4420-945a-a18b0294a8da-run-httpd\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.650338 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fbbdadb-2035-4420-945a-a18b0294a8da-run-httpd\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.650383 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-config-data\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.650439 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fbbdadb-2035-4420-945a-a18b0294a8da-log-httpd\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.650725 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fbbdadb-2035-4420-945a-a18b0294a8da-log-httpd\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.659223 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-scripts\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.659850 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 
12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.676199 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-config-data\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.676372 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.729282 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tx2f\" (UniqueName: \"kubernetes.io/projected/5fbbdadb-2035-4420-945a-a18b0294a8da-kube-api-access-5tx2f\") pod \"ceilometer-0\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " pod="openstack/ceilometer-0" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.760247 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa0dc22e-42cc-4916-a598-99d4e2c99ae7" path="/var/lib/kubelet/pods/fa0dc22e-42cc-4916-a598-99d4e2c99ae7/volumes" Apr 06 12:19:17 crc kubenswrapper[4790]: I0406 12:19:17.864819 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:19:18 crc kubenswrapper[4790]: I0406 12:19:18.377867 4790 generic.go:334] "Generic (PLEG): container finished" podID="e618a3e1-5205-4db2-b17d-be3a2aef40b0" containerID="9e89a517fad6a1ad63c7dbd097971d804ed449a2eca199b523901869b192e05a" exitCode=143 Apr 06 12:19:18 crc kubenswrapper[4790]: I0406 12:19:18.378915 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e618a3e1-5205-4db2-b17d-be3a2aef40b0","Type":"ContainerDied","Data":"9e89a517fad6a1ad63c7dbd097971d804ed449a2eca199b523901869b192e05a"} Apr 06 12:19:18 crc kubenswrapper[4790]: I0406 12:19:18.463598 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:18 crc kubenswrapper[4790]: I0406 12:19:18.592807 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.326914 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.401106 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-scripts\") pod \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.401171 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7dbca29-a7f8-4547-9deb-75ee69b773f9-logs\") pod \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.401197 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjrl2\" (UniqueName: \"kubernetes.io/projected/c7dbca29-a7f8-4547-9deb-75ee69b773f9-kube-api-access-jjrl2\") pod \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.401228 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-public-tls-certs\") pod \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.401241 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-config-data\") pod \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.401258 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-combined-ca-bundle\") pod \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.401342 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-internal-tls-certs\") pod \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\" (UID: \"c7dbca29-a7f8-4547-9deb-75ee69b773f9\") " Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.417906 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7dbca29-a7f8-4547-9deb-75ee69b773f9-logs" (OuterVolumeSpecName: "logs") pod "c7dbca29-a7f8-4547-9deb-75ee69b773f9" (UID: "c7dbca29-a7f8-4547-9deb-75ee69b773f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.417971 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-scripts" (OuterVolumeSpecName: "scripts") pod "c7dbca29-a7f8-4547-9deb-75ee69b773f9" (UID: "c7dbca29-a7f8-4547-9deb-75ee69b773f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.438140 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7dbca29-a7f8-4547-9deb-75ee69b773f9-kube-api-access-jjrl2" (OuterVolumeSpecName: "kube-api-access-jjrl2") pod "c7dbca29-a7f8-4547-9deb-75ee69b773f9" (UID: "c7dbca29-a7f8-4547-9deb-75ee69b773f9"). InnerVolumeSpecName "kube-api-access-jjrl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.464933 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fbbdadb-2035-4420-945a-a18b0294a8da","Type":"ContainerStarted","Data":"dc4cb891f5404c0701f7a786903b19e4e32c7132230af02cec5cb9bce3d0f065"} Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.465163 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fbbdadb-2035-4420-945a-a18b0294a8da","Type":"ContainerStarted","Data":"bafbdff85bdf83f589e0c1119e325977fcc91016504b0304123a33baafdd456a"} Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.465220 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fbbdadb-2035-4420-945a-a18b0294a8da","Type":"ContainerStarted","Data":"c0127ac2e84b8a5b3ba089a42a493470e03b861e20df392ae8f70655d5fa7a18"} Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.479112 4790 generic.go:334] "Generic (PLEG): container finished" podID="c7dbca29-a7f8-4547-9deb-75ee69b773f9" containerID="7ace91d4c78e3cc9f5a933f0cec9481841467304fd8f1ec7a052d9065229b960" exitCode=0 Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.479159 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f66dc7c8d-wd7td" event={"ID":"c7dbca29-a7f8-4547-9deb-75ee69b773f9","Type":"ContainerDied","Data":"7ace91d4c78e3cc9f5a933f0cec9481841467304fd8f1ec7a052d9065229b960"} Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.479189 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f66dc7c8d-wd7td" event={"ID":"c7dbca29-a7f8-4547-9deb-75ee69b773f9","Type":"ContainerDied","Data":"8bee7bdfc53f332eb85abe56936d04fee68c34bcd67a1ed24cbb2f8ea6549835"} Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.479210 4790 scope.go:117] "RemoveContainer" 
containerID="7ace91d4c78e3cc9f5a933f0cec9481841467304fd8f1ec7a052d9065229b960" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.479433 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5f66dc7c8d-wd7td" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.505659 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.505711 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7dbca29-a7f8-4547-9deb-75ee69b773f9-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.505723 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjrl2\" (UniqueName: \"kubernetes.io/projected/c7dbca29-a7f8-4547-9deb-75ee69b773f9-kube-api-access-jjrl2\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.523961 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7dbca29-a7f8-4547-9deb-75ee69b773f9" (UID: "c7dbca29-a7f8-4547-9deb-75ee69b773f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.586411 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-config-data" (OuterVolumeSpecName: "config-data") pod "c7dbca29-a7f8-4547-9deb-75ee69b773f9" (UID: "c7dbca29-a7f8-4547-9deb-75ee69b773f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.610955 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.610982 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.667372 4790 scope.go:117] "RemoveContainer" containerID="71cf26499e4795975d1c826c473f3bed994cf84a0fb36371f111d2f1968b2a53" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.685038 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c7dbca29-a7f8-4547-9deb-75ee69b773f9" (UID: "c7dbca29-a7f8-4547-9deb-75ee69b773f9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.700954 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c7dbca29-a7f8-4547-9deb-75ee69b773f9" (UID: "c7dbca29-a7f8-4547-9deb-75ee69b773f9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.713491 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.713519 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7dbca29-a7f8-4547-9deb-75ee69b773f9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.715199 4790 scope.go:117] "RemoveContainer" containerID="7ace91d4c78e3cc9f5a933f0cec9481841467304fd8f1ec7a052d9065229b960" Apr 06 12:19:19 crc kubenswrapper[4790]: E0406 12:19:19.717273 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ace91d4c78e3cc9f5a933f0cec9481841467304fd8f1ec7a052d9065229b960\": container with ID starting with 7ace91d4c78e3cc9f5a933f0cec9481841467304fd8f1ec7a052d9065229b960 not found: ID does not exist" containerID="7ace91d4c78e3cc9f5a933f0cec9481841467304fd8f1ec7a052d9065229b960" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.717315 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ace91d4c78e3cc9f5a933f0cec9481841467304fd8f1ec7a052d9065229b960"} err="failed to get container status \"7ace91d4c78e3cc9f5a933f0cec9481841467304fd8f1ec7a052d9065229b960\": rpc error: code = NotFound desc = could not find container \"7ace91d4c78e3cc9f5a933f0cec9481841467304fd8f1ec7a052d9065229b960\": container with ID starting with 7ace91d4c78e3cc9f5a933f0cec9481841467304fd8f1ec7a052d9065229b960 not found: ID does not exist" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.717339 4790 scope.go:117] "RemoveContainer" 
containerID="71cf26499e4795975d1c826c473f3bed994cf84a0fb36371f111d2f1968b2a53" Apr 06 12:19:19 crc kubenswrapper[4790]: E0406 12:19:19.720560 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71cf26499e4795975d1c826c473f3bed994cf84a0fb36371f111d2f1968b2a53\": container with ID starting with 71cf26499e4795975d1c826c473f3bed994cf84a0fb36371f111d2f1968b2a53 not found: ID does not exist" containerID="71cf26499e4795975d1c826c473f3bed994cf84a0fb36371f111d2f1968b2a53" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.720612 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71cf26499e4795975d1c826c473f3bed994cf84a0fb36371f111d2f1968b2a53"} err="failed to get container status \"71cf26499e4795975d1c826c473f3bed994cf84a0fb36371f111d2f1968b2a53\": rpc error: code = NotFound desc = could not find container \"71cf26499e4795975d1c826c473f3bed994cf84a0fb36371f111d2f1968b2a53\": container with ID starting with 71cf26499e4795975d1c826c473f3bed994cf84a0fb36371f111d2f1968b2a53 not found: ID does not exist" Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.867907 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5f66dc7c8d-wd7td"] Apr 06 12:19:19 crc kubenswrapper[4790]: I0406 12:19:19.876972 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5f66dc7c8d-wd7td"] Apr 06 12:19:20 crc kubenswrapper[4790]: I0406 12:19:20.495335 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fbbdadb-2035-4420-945a-a18b0294a8da","Type":"ContainerStarted","Data":"84cc9fd78266aa9f4f1c674114170aa87e03e297e0dbaa9d11d16e5792698feb"} Apr 06 12:19:20 crc kubenswrapper[4790]: I0406 12:19:20.497813 4790 generic.go:334] "Generic (PLEG): container finished" podID="bbabffd5-cc02-4c40-adec-25c11548ed44" 
containerID="625c69110057092c75a5b9f7a24594d2a63dfa0d5d8dd00096e932f7bcea9149" exitCode=1 Apr 06 12:19:20 crc kubenswrapper[4790]: I0406 12:19:20.497868 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"bbabffd5-cc02-4c40-adec-25c11548ed44","Type":"ContainerDied","Data":"625c69110057092c75a5b9f7a24594d2a63dfa0d5d8dd00096e932f7bcea9149"} Apr 06 12:19:20 crc kubenswrapper[4790]: I0406 12:19:20.497932 4790 scope.go:117] "RemoveContainer" containerID="05f826b4bcbbeefb4e6e03b3d3a348093c793ba2a4c816fcd856d15331df143f" Apr 06 12:19:20 crc kubenswrapper[4790]: I0406 12:19:20.498541 4790 scope.go:117] "RemoveContainer" containerID="625c69110057092c75a5b9f7a24594d2a63dfa0d5d8dd00096e932f7bcea9149" Apr 06 12:19:20 crc kubenswrapper[4790]: E0406 12:19:20.498889 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(bbabffd5-cc02-4c40-adec-25c11548ed44)\"" pod="openstack/watcher-decision-engine-0" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" Apr 06 12:19:21 crc kubenswrapper[4790]: I0406 12:19:21.462610 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Apr 06 12:19:21 crc kubenswrapper[4790]: I0406 12:19:21.488371 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Apr 06 12:19:21 crc kubenswrapper[4790]: I0406 12:19:21.512882 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fbbdadb-2035-4420-945a-a18b0294a8da","Type":"ContainerStarted","Data":"8ca0bad48d347eac84e3eded3083ec835836e0de3466a1fd7129da202ab47c76"} Apr 06 12:19:21 crc kubenswrapper[4790]: I0406 12:19:21.513000 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" 
Apr 06 12:19:21 crc kubenswrapper[4790]: I0406 12:19:21.526447 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Apr 06 12:19:21 crc kubenswrapper[4790]: I0406 12:19:21.548351 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.756566448 podStartE2EDuration="4.548331028s" podCreationTimestamp="2026-04-06 12:19:17 +0000 UTC" firstStartedPulling="2026-04-06 12:19:18.431639712 +0000 UTC m=+1337.419382578" lastFinishedPulling="2026-04-06 12:19:21.223404292 +0000 UTC m=+1340.211147158" observedRunningTime="2026-04-06 12:19:21.537575398 +0000 UTC m=+1340.525318284" watchObservedRunningTime="2026-04-06 12:19:21.548331028 +0000 UTC m=+1340.536073894" Apr 06 12:19:21 crc kubenswrapper[4790]: I0406 12:19:21.688093 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7dbca29-a7f8-4547-9deb-75ee69b773f9" path="/var/lib/kubelet/pods/c7dbca29-a7f8-4547-9deb-75ee69b773f9/volumes" Apr 06 12:19:22 crc kubenswrapper[4790]: I0406 12:19:22.027878 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Apr 06 12:19:22 crc kubenswrapper[4790]: I0406 12:19:22.239268 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Apr 06 12:19:22 crc kubenswrapper[4790]: I0406 12:19:22.240836 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57899578c6-rh848" Apr 06 12:19:22 crc kubenswrapper[4790]: I0406 12:19:22.335087 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57899578c6-rh848" Apr 06 12:19:22 crc kubenswrapper[4790]: I0406 12:19:22.406071 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d77d6889d-9r858"] Apr 06 12:19:22 crc kubenswrapper[4790]: I0406 12:19:22.406745 4790 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d77d6889d-9r858" podUID="18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" containerName="barbican-api-log" containerID="cri-o://ee0ea6ad6f87c64ba5d5a579008d6c0716c049896b7bf2515b612af015cc3caf" gracePeriod=30 Apr 06 12:19:22 crc kubenswrapper[4790]: I0406 12:19:22.407101 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d77d6889d-9r858" podUID="18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" containerName="barbican-api" containerID="cri-o://418ba81591a495b60dc3cffe4c24110b3654891a4fe9eef9f99cb01c0ef88217" gracePeriod=30 Apr 06 12:19:22 crc kubenswrapper[4790]: I0406 12:19:22.541675 4790 generic.go:334] "Generic (PLEG): container finished" podID="18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" containerID="ee0ea6ad6f87c64ba5d5a579008d6c0716c049896b7bf2515b612af015cc3caf" exitCode=143 Apr 06 12:19:22 crc kubenswrapper[4790]: I0406 12:19:22.543812 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d77d6889d-9r858" event={"ID":"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd","Type":"ContainerDied","Data":"ee0ea6ad6f87c64ba5d5a579008d6c0716c049896b7bf2515b612af015cc3caf"} Apr 06 12:19:22 crc kubenswrapper[4790]: I0406 12:19:22.547979 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" Apr 06 12:19:22 crc kubenswrapper[4790]: I0406 12:19:22.644521 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Apr 06 12:19:22 crc kubenswrapper[4790]: I0406 12:19:22.688602 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66f4cf67b9-vprjw"] Apr 06 12:19:22 crc kubenswrapper[4790]: I0406 12:19:22.688893 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" podUID="056b7341-e602-414d-b17a-ae4a8419a741" containerName="dnsmasq-dns" 
containerID="cri-o://a102cd480c0baeda696864fbb2e3a18e1c45d17b8ec70ea7bd8760a3fdd65ef8" gracePeriod=10 Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.411868 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.506200 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-ovsdbserver-nb\") pod \"056b7341-e602-414d-b17a-ae4a8419a741\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.506292 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-config\") pod \"056b7341-e602-414d-b17a-ae4a8419a741\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.506320 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-dns-swift-storage-0\") pod \"056b7341-e602-414d-b17a-ae4a8419a741\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.506354 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-dns-svc\") pod \"056b7341-e602-414d-b17a-ae4a8419a741\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.506395 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4pdd\" (UniqueName: \"kubernetes.io/projected/056b7341-e602-414d-b17a-ae4a8419a741-kube-api-access-j4pdd\") pod 
\"056b7341-e602-414d-b17a-ae4a8419a741\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.506455 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-ovsdbserver-sb\") pod \"056b7341-e602-414d-b17a-ae4a8419a741\" (UID: \"056b7341-e602-414d-b17a-ae4a8419a741\") " Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.538711 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/056b7341-e602-414d-b17a-ae4a8419a741-kube-api-access-j4pdd" (OuterVolumeSpecName: "kube-api-access-j4pdd") pod "056b7341-e602-414d-b17a-ae4a8419a741" (UID: "056b7341-e602-414d-b17a-ae4a8419a741"). InnerVolumeSpecName "kube-api-access-j4pdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.563188 4790 generic.go:334] "Generic (PLEG): container finished" podID="056b7341-e602-414d-b17a-ae4a8419a741" containerID="a102cd480c0baeda696864fbb2e3a18e1c45d17b8ec70ea7bd8760a3fdd65ef8" exitCode=0 Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.563406 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7462838c-5785-4c22-8fb2-e8b9304d1769" containerName="cinder-scheduler" containerID="cri-o://9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507" gracePeriod=30 Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.563514 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.563944 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7462838c-5785-4c22-8fb2-e8b9304d1769" containerName="probe" containerID="cri-o://99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474" gracePeriod=30 Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.563947 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" event={"ID":"056b7341-e602-414d-b17a-ae4a8419a741","Type":"ContainerDied","Data":"a102cd480c0baeda696864fbb2e3a18e1c45d17b8ec70ea7bd8760a3fdd65ef8"} Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.564100 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66f4cf67b9-vprjw" event={"ID":"056b7341-e602-414d-b17a-ae4a8419a741","Type":"ContainerDied","Data":"c71e8f6a906b28d2886c6865f1828f29556ab0544a3e13fead81dbd8cfc9ea1a"} Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.564135 4790 scope.go:117] "RemoveContainer" containerID="a102cd480c0baeda696864fbb2e3a18e1c45d17b8ec70ea7bd8760a3fdd65ef8" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.589046 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "056b7341-e602-414d-b17a-ae4a8419a741" (UID: "056b7341-e602-414d-b17a-ae4a8419a741"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.592816 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "056b7341-e602-414d-b17a-ae4a8419a741" (UID: "056b7341-e602-414d-b17a-ae4a8419a741"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.598949 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "056b7341-e602-414d-b17a-ae4a8419a741" (UID: "056b7341-e602-414d-b17a-ae4a8419a741"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.606623 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "056b7341-e602-414d-b17a-ae4a8419a741" (UID: "056b7341-e602-414d-b17a-ae4a8419a741"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.614904 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4pdd\" (UniqueName: \"kubernetes.io/projected/056b7341-e602-414d-b17a-ae4a8419a741-kube-api-access-j4pdd\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.614953 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.614964 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.614985 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.614997 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.644653 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-config" (OuterVolumeSpecName: "config") pod "056b7341-e602-414d-b17a-ae4a8419a741" (UID: "056b7341-e602-414d-b17a-ae4a8419a741"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.716974 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/056b7341-e602-414d-b17a-ae4a8419a741-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.723057 4790 scope.go:117] "RemoveContainer" containerID="a3d80ae781b016b0c877bf60d21291fdabb888491e54b84e0b5204063bac67ee" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.767173 4790 scope.go:117] "RemoveContainer" containerID="a102cd480c0baeda696864fbb2e3a18e1c45d17b8ec70ea7bd8760a3fdd65ef8" Apr 06 12:19:23 crc kubenswrapper[4790]: E0406 12:19:23.767804 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a102cd480c0baeda696864fbb2e3a18e1c45d17b8ec70ea7bd8760a3fdd65ef8\": container with ID starting with a102cd480c0baeda696864fbb2e3a18e1c45d17b8ec70ea7bd8760a3fdd65ef8 not found: ID does not exist" containerID="a102cd480c0baeda696864fbb2e3a18e1c45d17b8ec70ea7bd8760a3fdd65ef8" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.767930 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a102cd480c0baeda696864fbb2e3a18e1c45d17b8ec70ea7bd8760a3fdd65ef8"} err="failed to get container status \"a102cd480c0baeda696864fbb2e3a18e1c45d17b8ec70ea7bd8760a3fdd65ef8\": rpc error: code = NotFound desc = could not find container \"a102cd480c0baeda696864fbb2e3a18e1c45d17b8ec70ea7bd8760a3fdd65ef8\": container with ID starting with a102cd480c0baeda696864fbb2e3a18e1c45d17b8ec70ea7bd8760a3fdd65ef8 not found: ID does not exist" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.767960 4790 scope.go:117] "RemoveContainer" containerID="a3d80ae781b016b0c877bf60d21291fdabb888491e54b84e0b5204063bac67ee" Apr 06 12:19:23 crc kubenswrapper[4790]: E0406 12:19:23.770384 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3d80ae781b016b0c877bf60d21291fdabb888491e54b84e0b5204063bac67ee\": container with ID starting with a3d80ae781b016b0c877bf60d21291fdabb888491e54b84e0b5204063bac67ee not found: ID does not exist" containerID="a3d80ae781b016b0c877bf60d21291fdabb888491e54b84e0b5204063bac67ee" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.770412 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3d80ae781b016b0c877bf60d21291fdabb888491e54b84e0b5204063bac67ee"} err="failed to get container status \"a3d80ae781b016b0c877bf60d21291fdabb888491e54b84e0b5204063bac67ee\": rpc error: code = NotFound desc = could not find container \"a3d80ae781b016b0c877bf60d21291fdabb888491e54b84e0b5204063bac67ee\": container with ID starting with a3d80ae781b016b0c877bf60d21291fdabb888491e54b84e0b5204063bac67ee not found: ID does not exist" Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.891057 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66f4cf67b9-vprjw"] Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.906882 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-66f4cf67b9-vprjw"] Apr 06 12:19:23 crc kubenswrapper[4790]: I0406 12:19:23.947739 4790 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pode752bf76-0eff-4559-b599-6ab462cea81e"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pode752bf76-0eff-4559-b599-6ab462cea81e] : Timed out while waiting for systemd to remove kubepods-besteffort-pode752bf76_0eff_4559_b599_6ab462cea81e.slice" Apr 06 12:19:23 crc kubenswrapper[4790]: E0406 12:19:23.948097 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pode752bf76-0eff-4559-b599-6ab462cea81e] : unable to destroy cgroup paths for cgroup [kubepods besteffort pode752bf76-0eff-4559-b599-6ab462cea81e] : Timed out while waiting for systemd to remove kubepods-besteffort-pode752bf76_0eff_4559_b599_6ab462cea81e.slice" pod="openstack/neutron-db-sync-wl5rj" podUID="e752bf76-0eff-4559-b599-6ab462cea81e" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.056331 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d77d6889d-9r858" podUID="18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.187:9311/healthcheck\": read tcp 10.217.0.2:38748->10.217.0.187:9311: read: connection reset by peer" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.056360 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d77d6889d-9r858" podUID="18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.187:9311/healthcheck\": read tcp 10.217.0.2:38758->10.217.0.187:9311: read: connection reset by peer" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.325497 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Apr 06 
12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.325804 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.330973 4790 scope.go:117] "RemoveContainer" containerID="625c69110057092c75a5b9f7a24594d2a63dfa0d5d8dd00096e932f7bcea9149" Apr 06 12:19:24 crc kubenswrapper[4790]: E0406 12:19:24.333204 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(bbabffd5-cc02-4c40-adec-25c11548ed44)\"" pod="openstack/watcher-decision-engine-0" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.468805 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.536445 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-logs\") pod \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.536590 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-config-data-custom\") pod \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.536628 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-combined-ca-bundle\") pod \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\" (UID: 
\"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.536742 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-config-data\") pod \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.536776 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzksp\" (UniqueName: \"kubernetes.io/projected/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-kube-api-access-zzksp\") pod \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\" (UID: \"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd\") " Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.537887 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-logs" (OuterVolumeSpecName: "logs") pod "18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" (UID: "18f1dd4f-19c2-4cdb-a40b-6e1295700cbd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.544733 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" (UID: "18f1dd4f-19c2-4cdb-a40b-6e1295700cbd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.555811 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-kube-api-access-zzksp" (OuterVolumeSpecName: "kube-api-access-zzksp") pod "18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" (UID: "18f1dd4f-19c2-4cdb-a40b-6e1295700cbd"). 
InnerVolumeSpecName "kube-api-access-zzksp". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.572731 4790 generic.go:334] "Generic (PLEG): container finished" podID="18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" containerID="418ba81591a495b60dc3cffe4c24110b3654891a4fe9eef9f99cb01c0ef88217" exitCode=0 Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.572784 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d77d6889d-9r858" event={"ID":"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd","Type":"ContainerDied","Data":"418ba81591a495b60dc3cffe4c24110b3654891a4fe9eef9f99cb01c0ef88217"} Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.572809 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d77d6889d-9r858" event={"ID":"18f1dd4f-19c2-4cdb-a40b-6e1295700cbd","Type":"ContainerDied","Data":"485574a5d1c8d548c1e386acfa5d4a1c641a2c2bbac3063926d71e3ac2acb7f0"} Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.572840 4790 scope.go:117] "RemoveContainer" containerID="418ba81591a495b60dc3cffe4c24110b3654891a4fe9eef9f99cb01c0ef88217" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.572931 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d77d6889d-9r858" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.574777 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wl5rj" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.595024 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" (UID: "18f1dd4f-19c2-4cdb-a40b-6e1295700cbd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.634511 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-config-data" (OuterVolumeSpecName: "config-data") pod "18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" (UID: "18f1dd4f-19c2-4cdb-a40b-6e1295700cbd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.649060 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-config-data-custom\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.649091 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.649100 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.649110 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzksp\" (UniqueName: \"kubernetes.io/projected/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-kube-api-access-zzksp\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.649119 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.651706 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 
12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.776388 4790 scope.go:117] "RemoveContainer" containerID="ee0ea6ad6f87c64ba5d5a579008d6c0716c049896b7bf2515b612af015cc3caf" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.834208 4790 scope.go:117] "RemoveContainer" containerID="418ba81591a495b60dc3cffe4c24110b3654891a4fe9eef9f99cb01c0ef88217" Apr 06 12:19:24 crc kubenswrapper[4790]: E0406 12:19:24.834569 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"418ba81591a495b60dc3cffe4c24110b3654891a4fe9eef9f99cb01c0ef88217\": container with ID starting with 418ba81591a495b60dc3cffe4c24110b3654891a4fe9eef9f99cb01c0ef88217 not found: ID does not exist" containerID="418ba81591a495b60dc3cffe4c24110b3654891a4fe9eef9f99cb01c0ef88217" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.834605 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"418ba81591a495b60dc3cffe4c24110b3654891a4fe9eef9f99cb01c0ef88217"} err="failed to get container status \"418ba81591a495b60dc3cffe4c24110b3654891a4fe9eef9f99cb01c0ef88217\": rpc error: code = NotFound desc = could not find container \"418ba81591a495b60dc3cffe4c24110b3654891a4fe9eef9f99cb01c0ef88217\": container with ID starting with 418ba81591a495b60dc3cffe4c24110b3654891a4fe9eef9f99cb01c0ef88217 not found: ID does not exist" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.834626 4790 scope.go:117] "RemoveContainer" containerID="ee0ea6ad6f87c64ba5d5a579008d6c0716c049896b7bf2515b612af015cc3caf" Apr 06 12:19:24 crc kubenswrapper[4790]: E0406 12:19:24.834856 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee0ea6ad6f87c64ba5d5a579008d6c0716c049896b7bf2515b612af015cc3caf\": container with ID starting with ee0ea6ad6f87c64ba5d5a579008d6c0716c049896b7bf2515b612af015cc3caf not found: ID does not exist" 
containerID="ee0ea6ad6f87c64ba5d5a579008d6c0716c049896b7bf2515b612af015cc3caf" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.834878 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee0ea6ad6f87c64ba5d5a579008d6c0716c049896b7bf2515b612af015cc3caf"} err="failed to get container status \"ee0ea6ad6f87c64ba5d5a579008d6c0716c049896b7bf2515b612af015cc3caf\": rpc error: code = NotFound desc = could not find container \"ee0ea6ad6f87c64ba5d5a579008d6c0716c049896b7bf2515b612af015cc3caf\": container with ID starting with ee0ea6ad6f87c64ba5d5a579008d6c0716c049896b7bf2515b612af015cc3caf not found: ID does not exist" Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.903373 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d77d6889d-9r858"] Apr 06 12:19:24 crc kubenswrapper[4790]: I0406 12:19:24.921009 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-d77d6889d-9r858"] Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.151696 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.157393 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-config-data-custom\") pod \"7462838c-5785-4c22-8fb2-e8b9304d1769\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.157437 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-combined-ca-bundle\") pod \"7462838c-5785-4c22-8fb2-e8b9304d1769\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.157520 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-config-data\") pod \"7462838c-5785-4c22-8fb2-e8b9304d1769\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.157547 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-scripts\") pod \"7462838c-5785-4c22-8fb2-e8b9304d1769\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.157650 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7462838c-5785-4c22-8fb2-e8b9304d1769-etc-machine-id\") pod \"7462838c-5785-4c22-8fb2-e8b9304d1769\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.157716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvsxk\" (UniqueName: 
\"kubernetes.io/projected/7462838c-5785-4c22-8fb2-e8b9304d1769-kube-api-access-vvsxk\") pod \"7462838c-5785-4c22-8fb2-e8b9304d1769\" (UID: \"7462838c-5785-4c22-8fb2-e8b9304d1769\") " Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.157814 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7462838c-5785-4c22-8fb2-e8b9304d1769-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7462838c-5785-4c22-8fb2-e8b9304d1769" (UID: "7462838c-5785-4c22-8fb2-e8b9304d1769"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.158095 4790 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7462838c-5785-4c22-8fb2-e8b9304d1769-etc-machine-id\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.165004 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7462838c-5785-4c22-8fb2-e8b9304d1769-kube-api-access-vvsxk" (OuterVolumeSpecName: "kube-api-access-vvsxk") pod "7462838c-5785-4c22-8fb2-e8b9304d1769" (UID: "7462838c-5785-4c22-8fb2-e8b9304d1769"). InnerVolumeSpecName "kube-api-access-vvsxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.165087 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-scripts" (OuterVolumeSpecName: "scripts") pod "7462838c-5785-4c22-8fb2-e8b9304d1769" (UID: "7462838c-5785-4c22-8fb2-e8b9304d1769"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.165154 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7462838c-5785-4c22-8fb2-e8b9304d1769" (UID: "7462838c-5785-4c22-8fb2-e8b9304d1769"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.224852 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7462838c-5785-4c22-8fb2-e8b9304d1769" (UID: "7462838c-5785-4c22-8fb2-e8b9304d1769"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.260066 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvsxk\" (UniqueName: \"kubernetes.io/projected/7462838c-5785-4c22-8fb2-e8b9304d1769-kube-api-access-vvsxk\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.260104 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-config-data-custom\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.260113 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.260123 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-scripts\") on 
node \"crc\" DevicePath \"\"" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.275016 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-config-data" (OuterVolumeSpecName: "config-data") pod "7462838c-5785-4c22-8fb2-e8b9304d1769" (UID: "7462838c-5785-4c22-8fb2-e8b9304d1769"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.361623 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7462838c-5785-4c22-8fb2-e8b9304d1769-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.588463 4790 generic.go:334] "Generic (PLEG): container finished" podID="7462838c-5785-4c22-8fb2-e8b9304d1769" containerID="99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474" exitCode=0 Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.588496 4790 generic.go:334] "Generic (PLEG): container finished" podID="7462838c-5785-4c22-8fb2-e8b9304d1769" containerID="9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507" exitCode=0 Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.588525 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7462838c-5785-4c22-8fb2-e8b9304d1769","Type":"ContainerDied","Data":"99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474"} Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.588554 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7462838c-5785-4c22-8fb2-e8b9304d1769","Type":"ContainerDied","Data":"9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507"} Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.588568 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"7462838c-5785-4c22-8fb2-e8b9304d1769","Type":"ContainerDied","Data":"ffdfdde3e64681b14fad6230b2c1fb8f5497f0a5bb65f561ff0e5fffad9b35e8"} Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.588587 4790 scope.go:117] "RemoveContainer" containerID="99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.588707 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.637944 4790 scope.go:117] "RemoveContainer" containerID="9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.654459 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.669073 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.675818 4790 scope.go:117] "RemoveContainer" containerID="99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474" Apr 06 12:19:25 crc kubenswrapper[4790]: E0406 12:19:25.678967 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474\": container with ID starting with 99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474 not found: ID does not exist" containerID="99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.679010 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474"} err="failed to get container status \"99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474\": rpc error: code = 
NotFound desc = could not find container \"99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474\": container with ID starting with 99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474 not found: ID does not exist" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.679036 4790 scope.go:117] "RemoveContainer" containerID="9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507" Apr 06 12:19:25 crc kubenswrapper[4790]: E0406 12:19:25.682470 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507\": container with ID starting with 9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507 not found: ID does not exist" containerID="9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.682512 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507"} err="failed to get container status \"9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507\": rpc error: code = NotFound desc = could not find container \"9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507\": container with ID starting with 9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507 not found: ID does not exist" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.682533 4790 scope.go:117] "RemoveContainer" containerID="99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.684953 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474"} err="failed to get container status \"99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474\": rpc 
error: code = NotFound desc = could not find container \"99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474\": container with ID starting with 99c55c55cd75b3c48557e6eee26c89c37d82563729cd0aff4f6108c3b0eac474 not found: ID does not exist" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.684988 4790 scope.go:117] "RemoveContainer" containerID="9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.687670 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507"} err="failed to get container status \"9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507\": rpc error: code = NotFound desc = could not find container \"9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507\": container with ID starting with 9d813b514a8df13c7a55cbd635c71fb5e6acc997e86e6c89d9d1664fba56c507 not found: ID does not exist" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.692570 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="056b7341-e602-414d-b17a-ae4a8419a741" path="/var/lib/kubelet/pods/056b7341-e602-414d-b17a-ae4a8419a741/volumes" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.693391 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" path="/var/lib/kubelet/pods/18f1dd4f-19c2-4cdb-a40b-6e1295700cbd/volumes" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.693940 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7462838c-5785-4c22-8fb2-e8b9304d1769" path="/var/lib/kubelet/pods/7462838c-5785-4c22-8fb2-e8b9304d1769/volumes" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695010 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Apr 06 12:19:25 crc kubenswrapper[4790]: E0406 12:19:25.695304 4790 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" containerName="barbican-api" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695318 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" containerName="barbican-api" Apr 06 12:19:25 crc kubenswrapper[4790]: E0406 12:19:25.695334 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dbca29-a7f8-4547-9deb-75ee69b773f9" containerName="placement-log" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695340 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dbca29-a7f8-4547-9deb-75ee69b773f9" containerName="placement-log" Apr 06 12:19:25 crc kubenswrapper[4790]: E0406 12:19:25.695352 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="056b7341-e602-414d-b17a-ae4a8419a741" containerName="init" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695358 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="056b7341-e602-414d-b17a-ae4a8419a741" containerName="init" Apr 06 12:19:25 crc kubenswrapper[4790]: E0406 12:19:25.695368 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dbca29-a7f8-4547-9deb-75ee69b773f9" containerName="placement-api" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695373 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dbca29-a7f8-4547-9deb-75ee69b773f9" containerName="placement-api" Apr 06 12:19:25 crc kubenswrapper[4790]: E0406 12:19:25.695391 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" containerName="barbican-api-log" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695397 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" containerName="barbican-api-log" Apr 06 12:19:25 crc kubenswrapper[4790]: E0406 12:19:25.695405 4790 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="056b7341-e602-414d-b17a-ae4a8419a741" containerName="dnsmasq-dns" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695410 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="056b7341-e602-414d-b17a-ae4a8419a741" containerName="dnsmasq-dns" Apr 06 12:19:25 crc kubenswrapper[4790]: E0406 12:19:25.695424 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7462838c-5785-4c22-8fb2-e8b9304d1769" containerName="probe" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695430 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7462838c-5785-4c22-8fb2-e8b9304d1769" containerName="probe" Apr 06 12:19:25 crc kubenswrapper[4790]: E0406 12:19:25.695447 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7462838c-5785-4c22-8fb2-e8b9304d1769" containerName="cinder-scheduler" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695452 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7462838c-5785-4c22-8fb2-e8b9304d1769" containerName="cinder-scheduler" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695623 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7462838c-5785-4c22-8fb2-e8b9304d1769" containerName="cinder-scheduler" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695631 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" containerName="barbican-api-log" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695646 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7dbca29-a7f8-4547-9deb-75ee69b773f9" containerName="placement-log" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695656 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7dbca29-a7f8-4547-9deb-75ee69b773f9" containerName="placement-api" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695673 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="056b7341-e602-414d-b17a-ae4a8419a741" containerName="dnsmasq-dns" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695682 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="18f1dd4f-19c2-4cdb-a40b-6e1295700cbd" containerName="barbican-api" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.695693 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7462838c-5785-4c22-8fb2-e8b9304d1769" containerName="probe" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.697742 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.697848 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.714168 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.767575 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207e3a4b-b763-47f7-b2f7-b25c8c929af5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.767647 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snfvq\" (UniqueName: \"kubernetes.io/projected/207e3a4b-b763-47f7-b2f7-b25c8c929af5-kube-api-access-snfvq\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.767680 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/207e3a4b-b763-47f7-b2f7-b25c8c929af5-config-data\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.767745 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/207e3a4b-b763-47f7-b2f7-b25c8c929af5-scripts\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.767771 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/207e3a4b-b763-47f7-b2f7-b25c8c929af5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.767805 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/207e3a4b-b763-47f7-b2f7-b25c8c929af5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.894247 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207e3a4b-b763-47f7-b2f7-b25c8c929af5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.894348 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snfvq\" (UniqueName: \"kubernetes.io/projected/207e3a4b-b763-47f7-b2f7-b25c8c929af5-kube-api-access-snfvq\") pod 
\"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.894386 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/207e3a4b-b763-47f7-b2f7-b25c8c929af5-config-data\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.894456 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/207e3a4b-b763-47f7-b2f7-b25c8c929af5-scripts\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.894492 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/207e3a4b-b763-47f7-b2f7-b25c8c929af5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.894538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/207e3a4b-b763-47f7-b2f7-b25c8c929af5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.894997 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/207e3a4b-b763-47f7-b2f7-b25c8c929af5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.898314 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/207e3a4b-b763-47f7-b2f7-b25c8c929af5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.898598 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207e3a4b-b763-47f7-b2f7-b25c8c929af5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.902083 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.906419 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/207e3a4b-b763-47f7-b2f7-b25c8c929af5-scripts\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.913673 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/207e3a4b-b763-47f7-b2f7-b25c8c929af5-config-data\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:25 crc kubenswrapper[4790]: I0406 12:19:25.913952 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snfvq\" (UniqueName: \"kubernetes.io/projected/207e3a4b-b763-47f7-b2f7-b25c8c929af5-kube-api-access-snfvq\") pod \"cinder-scheduler-0\" (UID: \"207e3a4b-b763-47f7-b2f7-b25c8c929af5\") " pod="openstack/cinder-scheduler-0" Apr 06 12:19:26 crc kubenswrapper[4790]: I0406 12:19:26.030465 4790 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Apr 06 12:19:26 crc kubenswrapper[4790]: I0406 12:19:26.665947 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Apr 06 12:19:26 crc kubenswrapper[4790]: I0406 12:19:26.939440 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-b9b7c8b58-lzkxg" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.247566 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.249132 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.250904 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.251461 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.252602 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-qt8vv" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.258332 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.323671 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1eea2e2a-d9c2-46e2-96a9-827fcf5a075f-openstack-config\") pod \"openstackclient\" (UID: \"1eea2e2a-d9c2-46e2-96a9-827fcf5a075f\") " pod="openstack/openstackclient" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.323771 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th778\" (UniqueName: 
\"kubernetes.io/projected/1eea2e2a-d9c2-46e2-96a9-827fcf5a075f-kube-api-access-th778\") pod \"openstackclient\" (UID: \"1eea2e2a-d9c2-46e2-96a9-827fcf5a075f\") " pod="openstack/openstackclient" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.323915 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1eea2e2a-d9c2-46e2-96a9-827fcf5a075f-openstack-config-secret\") pod \"openstackclient\" (UID: \"1eea2e2a-d9c2-46e2-96a9-827fcf5a075f\") " pod="openstack/openstackclient" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.324064 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eea2e2a-d9c2-46e2-96a9-827fcf5a075f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1eea2e2a-d9c2-46e2-96a9-827fcf5a075f\") " pod="openstack/openstackclient" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.425575 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1eea2e2a-d9c2-46e2-96a9-827fcf5a075f-openstack-config\") pod \"openstackclient\" (UID: \"1eea2e2a-d9c2-46e2-96a9-827fcf5a075f\") " pod="openstack/openstackclient" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.425637 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th778\" (UniqueName: \"kubernetes.io/projected/1eea2e2a-d9c2-46e2-96a9-827fcf5a075f-kube-api-access-th778\") pod \"openstackclient\" (UID: \"1eea2e2a-d9c2-46e2-96a9-827fcf5a075f\") " pod="openstack/openstackclient" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.425684 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1eea2e2a-d9c2-46e2-96a9-827fcf5a075f-openstack-config-secret\") pod 
\"openstackclient\" (UID: \"1eea2e2a-d9c2-46e2-96a9-827fcf5a075f\") " pod="openstack/openstackclient" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.425752 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eea2e2a-d9c2-46e2-96a9-827fcf5a075f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1eea2e2a-d9c2-46e2-96a9-827fcf5a075f\") " pod="openstack/openstackclient" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.426528 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1eea2e2a-d9c2-46e2-96a9-827fcf5a075f-openstack-config\") pod \"openstackclient\" (UID: \"1eea2e2a-d9c2-46e2-96a9-827fcf5a075f\") " pod="openstack/openstackclient" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.430320 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eea2e2a-d9c2-46e2-96a9-827fcf5a075f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1eea2e2a-d9c2-46e2-96a9-827fcf5a075f\") " pod="openstack/openstackclient" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.431443 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1eea2e2a-d9c2-46e2-96a9-827fcf5a075f-openstack-config-secret\") pod \"openstackclient\" (UID: \"1eea2e2a-d9c2-46e2-96a9-827fcf5a075f\") " pod="openstack/openstackclient" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.447482 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th778\" (UniqueName: \"kubernetes.io/projected/1eea2e2a-d9c2-46e2-96a9-827fcf5a075f-kube-api-access-th778\") pod \"openstackclient\" (UID: \"1eea2e2a-d9c2-46e2-96a9-827fcf5a075f\") " pod="openstack/openstackclient" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.584438 4790 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.622715 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"207e3a4b-b763-47f7-b2f7-b25c8c929af5","Type":"ContainerStarted","Data":"fc4abee05f355e2929412c691fc0033e76ea8dcddc3e83a6635d643e2c432b65"} Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.622759 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"207e3a4b-b763-47f7-b2f7-b25c8c929af5","Type":"ContainerStarted","Data":"9bf9deea67bf927af18b70be290bfc84fafaa8c016a2d9bc560cc234b4546a2a"} Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.845291 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-bbf455d7c-c2ssf" Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.922329 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5fb546dccb-rmb8d"] Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.922849 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5fb546dccb-rmb8d" podUID="aea04d0b-6f21-4671-9b77-bbbac755b073" containerName="neutron-api" containerID="cri-o://0a6b614aceab75ca4aa13e13b8603b058f587a20aa7333996f589a8a588baf69" gracePeriod=30 Apr 06 12:19:27 crc kubenswrapper[4790]: I0406 12:19:27.924323 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5fb546dccb-rmb8d" podUID="aea04d0b-6f21-4671-9b77-bbbac755b073" containerName="neutron-httpd" containerID="cri-o://cb530553644b13d5011b4a5a55ffce32c787f90749f35bd6a2daf24ac75b4554" gracePeriod=30 Apr 06 12:19:28 crc kubenswrapper[4790]: I0406 12:19:28.086942 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Apr 06 12:19:28 crc kubenswrapper[4790]: W0406 12:19:28.109703 4790 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1eea2e2a_d9c2_46e2_96a9_827fcf5a075f.slice/crio-782d91f3c63d96d7cb413efd592bcf860b483c074856ad9a591223532d3c96cf WatchSource:0}: Error finding container 782d91f3c63d96d7cb413efd592bcf860b483c074856ad9a591223532d3c96cf: Status 404 returned error can't find the container with id 782d91f3c63d96d7cb413efd592bcf860b483c074856ad9a591223532d3c96cf Apr 06 12:19:28 crc kubenswrapper[4790]: I0406 12:19:28.646010 4790 generic.go:334] "Generic (PLEG): container finished" podID="aea04d0b-6f21-4671-9b77-bbbac755b073" containerID="cb530553644b13d5011b4a5a55ffce32c787f90749f35bd6a2daf24ac75b4554" exitCode=0 Apr 06 12:19:28 crc kubenswrapper[4790]: I0406 12:19:28.646093 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fb546dccb-rmb8d" event={"ID":"aea04d0b-6f21-4671-9b77-bbbac755b073","Type":"ContainerDied","Data":"cb530553644b13d5011b4a5a55ffce32c787f90749f35bd6a2daf24ac75b4554"} Apr 06 12:19:28 crc kubenswrapper[4790]: I0406 12:19:28.650567 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"207e3a4b-b763-47f7-b2f7-b25c8c929af5","Type":"ContainerStarted","Data":"045a06362d60636212a4b1ced8c04a0f6c1ae7d142cb36beef8f583da5dd3523"} Apr 06 12:19:28 crc kubenswrapper[4790]: I0406 12:19:28.651873 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1eea2e2a-d9c2-46e2-96a9-827fcf5a075f","Type":"ContainerStarted","Data":"782d91f3c63d96d7cb413efd592bcf860b483c074856ad9a591223532d3c96cf"} Apr 06 12:19:28 crc kubenswrapper[4790]: I0406 12:19:28.683875 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.683857719 podStartE2EDuration="3.683857719s" podCreationTimestamp="2026-04-06 12:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:19:28.6749577 +0000 UTC m=+1347.662700566" watchObservedRunningTime="2026-04-06 12:19:28.683857719 +0000 UTC m=+1347.671600585" Apr 06 12:19:31 crc kubenswrapper[4790]: I0406 12:19:31.033948 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Apr 06 12:19:32 crc kubenswrapper[4790]: I0406 12:19:32.849443 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:32 crc kubenswrapper[4790]: I0406 12:19:32.849741 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="ceilometer-central-agent" containerID="cri-o://bafbdff85bdf83f589e0c1119e325977fcc91016504b0304123a33baafdd456a" gracePeriod=30 Apr 06 12:19:32 crc kubenswrapper[4790]: I0406 12:19:32.849855 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="ceilometer-notification-agent" containerID="cri-o://dc4cb891f5404c0701f7a786903b19e4e32c7132230af02cec5cb9bce3d0f065" gracePeriod=30 Apr 06 12:19:32 crc kubenswrapper[4790]: I0406 12:19:32.849884 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="proxy-httpd" containerID="cri-o://8ca0bad48d347eac84e3eded3083ec835836e0de3466a1fd7129da202ab47c76" gracePeriod=30 Apr 06 12:19:32 crc kubenswrapper[4790]: I0406 12:19:32.849888 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="sg-core" containerID="cri-o://84cc9fd78266aa9f4f1c674114170aa87e03e297e0dbaa9d11d16e5792698feb" gracePeriod=30 Apr 06 12:19:32 crc kubenswrapper[4790]: I0406 12:19:32.858180 
4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.202:3000/\": EOF" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.228199 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7695db8cdc-vs5bx"] Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.230276 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.234707 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.234963 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.235112 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.250714 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7695db8cdc-vs5bx"] Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.343929 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7922b939-a1e8-4c85-8eb0-fe3529f6469c-etc-swift\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.344000 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svnxj\" (UniqueName: \"kubernetes.io/projected/7922b939-a1e8-4c85-8eb0-fe3529f6469c-kube-api-access-svnxj\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: 
\"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.344046 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7922b939-a1e8-4c85-8eb0-fe3529f6469c-run-httpd\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.344091 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7922b939-a1e8-4c85-8eb0-fe3529f6469c-combined-ca-bundle\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.344145 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7922b939-a1e8-4c85-8eb0-fe3529f6469c-public-tls-certs\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.344207 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7922b939-a1e8-4c85-8eb0-fe3529f6469c-log-httpd\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.344255 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7922b939-a1e8-4c85-8eb0-fe3529f6469c-config-data\") pod 
\"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.344275 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7922b939-a1e8-4c85-8eb0-fe3529f6469c-internal-tls-certs\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.403108 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-rnrfj"] Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.404405 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rnrfj" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.419658 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4df9-account-create-update-wlsw6"] Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.447797 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4df9-account-create-update-wlsw6" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.449947 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7922b939-a1e8-4c85-8eb0-fe3529f6469c-public-tls-certs\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.450278 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.450331 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7922b939-a1e8-4c85-8eb0-fe3529f6469c-log-httpd\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.450553 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7922b939-a1e8-4c85-8eb0-fe3529f6469c-config-data\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.450581 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7922b939-a1e8-4c85-8eb0-fe3529f6469c-internal-tls-certs\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.450654 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/7922b939-a1e8-4c85-8eb0-fe3529f6469c-etc-swift\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.450706 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svnxj\" (UniqueName: \"kubernetes.io/projected/7922b939-a1e8-4c85-8eb0-fe3529f6469c-kube-api-access-svnxj\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.450761 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7922b939-a1e8-4c85-8eb0-fe3529f6469c-run-httpd\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.450818 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7922b939-a1e8-4c85-8eb0-fe3529f6469c-combined-ca-bundle\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.454124 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7922b939-a1e8-4c85-8eb0-fe3529f6469c-log-httpd\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.459320 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7922b939-a1e8-4c85-8eb0-fe3529f6469c-combined-ca-bundle\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.463591 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7922b939-a1e8-4c85-8eb0-fe3529f6469c-public-tls-certs\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.480563 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7922b939-a1e8-4c85-8eb0-fe3529f6469c-internal-tls-certs\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.486699 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7922b939-a1e8-4c85-8eb0-fe3529f6469c-config-data\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.488683 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7922b939-a1e8-4c85-8eb0-fe3529f6469c-etc-swift\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.502036 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svnxj\" (UniqueName: \"kubernetes.io/projected/7922b939-a1e8-4c85-8eb0-fe3529f6469c-kube-api-access-svnxj\") pod 
\"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.505297 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7922b939-a1e8-4c85-8eb0-fe3529f6469c-run-httpd\") pod \"swift-proxy-7695db8cdc-vs5bx\" (UID: \"7922b939-a1e8-4c85-8eb0-fe3529f6469c\") " pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.518458 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rnrfj"] Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.547848 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4df9-account-create-update-wlsw6"] Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.554985 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24cee348-21ee-454e-942e-c689c059effa-operator-scripts\") pod \"nova-api-4df9-account-create-update-wlsw6\" (UID: \"24cee348-21ee-454e-942e-c689c059effa\") " pod="openstack/nova-api-4df9-account-create-update-wlsw6" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.555039 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f82m\" (UniqueName: \"kubernetes.io/projected/df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8-kube-api-access-8f82m\") pod \"nova-api-db-create-rnrfj\" (UID: \"df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8\") " pod="openstack/nova-api-db-create-rnrfj" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.555063 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr85m\" (UniqueName: \"kubernetes.io/projected/24cee348-21ee-454e-942e-c689c059effa-kube-api-access-hr85m\") pod 
\"nova-api-4df9-account-create-update-wlsw6\" (UID: \"24cee348-21ee-454e-942e-c689c059effa\") " pod="openstack/nova-api-4df9-account-create-update-wlsw6" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.555079 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8-operator-scripts\") pod \"nova-api-db-create-rnrfj\" (UID: \"df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8\") " pod="openstack/nova-api-db-create-rnrfj" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.559941 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.602228 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-tvmxz"] Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.603616 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tvmxz" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.636687 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tvmxz"] Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.649390 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-v24cr"] Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.652226 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-v24cr" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.657219 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24cee348-21ee-454e-942e-c689c059effa-operator-scripts\") pod \"nova-api-4df9-account-create-update-wlsw6\" (UID: \"24cee348-21ee-454e-942e-c689c059effa\") " pod="openstack/nova-api-4df9-account-create-update-wlsw6" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.657284 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f82m\" (UniqueName: \"kubernetes.io/projected/df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8-kube-api-access-8f82m\") pod \"nova-api-db-create-rnrfj\" (UID: \"df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8\") " pod="openstack/nova-api-db-create-rnrfj" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.657322 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/982d3ca0-0055-4a1a-85ae-533ca695f992-operator-scripts\") pod \"nova-cell0-db-create-tvmxz\" (UID: \"982d3ca0-0055-4a1a-85ae-533ca695f992\") " pod="openstack/nova-cell0-db-create-tvmxz" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.657355 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr85m\" (UniqueName: \"kubernetes.io/projected/24cee348-21ee-454e-942e-c689c059effa-kube-api-access-hr85m\") pod \"nova-api-4df9-account-create-update-wlsw6\" (UID: \"24cee348-21ee-454e-942e-c689c059effa\") " pod="openstack/nova-api-4df9-account-create-update-wlsw6" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.657379 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8-operator-scripts\") pod 
\"nova-api-db-create-rnrfj\" (UID: \"df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8\") " pod="openstack/nova-api-db-create-rnrfj" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.657482 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wv7s\" (UniqueName: \"kubernetes.io/projected/982d3ca0-0055-4a1a-85ae-533ca695f992-kube-api-access-4wv7s\") pod \"nova-cell0-db-create-tvmxz\" (UID: \"982d3ca0-0055-4a1a-85ae-533ca695f992\") " pod="openstack/nova-cell0-db-create-tvmxz" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.658148 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24cee348-21ee-454e-942e-c689c059effa-operator-scripts\") pod \"nova-api-4df9-account-create-update-wlsw6\" (UID: \"24cee348-21ee-454e-942e-c689c059effa\") " pod="openstack/nova-api-4df9-account-create-update-wlsw6" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.658208 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-v24cr"] Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.658350 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8-operator-scripts\") pod \"nova-api-db-create-rnrfj\" (UID: \"df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8\") " pod="openstack/nova-api-db-create-rnrfj" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.673746 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9d65-account-create-update-dkjm7"] Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.676905 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.680150 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr85m\" (UniqueName: \"kubernetes.io/projected/24cee348-21ee-454e-942e-c689c059effa-kube-api-access-hr85m\") pod \"nova-api-4df9-account-create-update-wlsw6\" (UID: \"24cee348-21ee-454e-942e-c689c059effa\") " pod="openstack/nova-api-4df9-account-create-update-wlsw6" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.684199 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.684445 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f82m\" (UniqueName: \"kubernetes.io/projected/df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8-kube-api-access-8f82m\") pod \"nova-api-db-create-rnrfj\" (UID: \"df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8\") " pod="openstack/nova-api-db-create-rnrfj" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.700503 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9d65-account-create-update-dkjm7"] Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.734154 4790 generic.go:334] "Generic (PLEG): container finished" podID="aea04d0b-6f21-4671-9b77-bbbac755b073" containerID="0a6b614aceab75ca4aa13e13b8603b058f587a20aa7333996f589a8a588baf69" exitCode=0 Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.734215 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fb546dccb-rmb8d" event={"ID":"aea04d0b-6f21-4671-9b77-bbbac755b073","Type":"ContainerDied","Data":"0a6b614aceab75ca4aa13e13b8603b058f587a20aa7333996f589a8a588baf69"} Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.738776 4790 generic.go:334] "Generic (PLEG): container finished" podID="5fbbdadb-2035-4420-945a-a18b0294a8da" 
containerID="8ca0bad48d347eac84e3eded3083ec835836e0de3466a1fd7129da202ab47c76" exitCode=0 Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.738796 4790 generic.go:334] "Generic (PLEG): container finished" podID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerID="84cc9fd78266aa9f4f1c674114170aa87e03e297e0dbaa9d11d16e5792698feb" exitCode=2 Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.738802 4790 generic.go:334] "Generic (PLEG): container finished" podID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerID="bafbdff85bdf83f589e0c1119e325977fcc91016504b0304123a33baafdd456a" exitCode=0 Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.738816 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fbbdadb-2035-4420-945a-a18b0294a8da","Type":"ContainerDied","Data":"8ca0bad48d347eac84e3eded3083ec835836e0de3466a1fd7129da202ab47c76"} Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.738836 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fbbdadb-2035-4420-945a-a18b0294a8da","Type":"ContainerDied","Data":"84cc9fd78266aa9f4f1c674114170aa87e03e297e0dbaa9d11d16e5792698feb"} Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.738846 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fbbdadb-2035-4420-945a-a18b0294a8da","Type":"ContainerDied","Data":"bafbdff85bdf83f589e0c1119e325977fcc91016504b0304123a33baafdd456a"} Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.759888 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wv7s\" (UniqueName: \"kubernetes.io/projected/982d3ca0-0055-4a1a-85ae-533ca695f992-kube-api-access-4wv7s\") pod \"nova-cell0-db-create-tvmxz\" (UID: \"982d3ca0-0055-4a1a-85ae-533ca695f992\") " pod="openstack/nova-cell0-db-create-tvmxz" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.759952 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7f84efa-2483-4a30-9297-f60a74e88c75-operator-scripts\") pod \"nova-cell0-9d65-account-create-update-dkjm7\" (UID: \"f7f84efa-2483-4a30-9297-f60a74e88c75\") " pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.760015 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prfwl\" (UniqueName: \"kubernetes.io/projected/f7f84efa-2483-4a30-9297-f60a74e88c75-kube-api-access-prfwl\") pod \"nova-cell0-9d65-account-create-update-dkjm7\" (UID: \"f7f84efa-2483-4a30-9297-f60a74e88c75\") " pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.760046 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff8pz\" (UniqueName: \"kubernetes.io/projected/45658e80-85a4-4557-bc5d-85d86bb92f7f-kube-api-access-ff8pz\") pod \"nova-cell1-db-create-v24cr\" (UID: \"45658e80-85a4-4557-bc5d-85d86bb92f7f\") " pod="openstack/nova-cell1-db-create-v24cr" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.760100 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45658e80-85a4-4557-bc5d-85d86bb92f7f-operator-scripts\") pod \"nova-cell1-db-create-v24cr\" (UID: \"45658e80-85a4-4557-bc5d-85d86bb92f7f\") " pod="openstack/nova-cell1-db-create-v24cr" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.760182 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/982d3ca0-0055-4a1a-85ae-533ca695f992-operator-scripts\") pod \"nova-cell0-db-create-tvmxz\" (UID: \"982d3ca0-0055-4a1a-85ae-533ca695f992\") " 
pod="openstack/nova-cell0-db-create-tvmxz" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.760971 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/982d3ca0-0055-4a1a-85ae-533ca695f992-operator-scripts\") pod \"nova-cell0-db-create-tvmxz\" (UID: \"982d3ca0-0055-4a1a-85ae-533ca695f992\") " pod="openstack/nova-cell0-db-create-tvmxz" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.782363 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wv7s\" (UniqueName: \"kubernetes.io/projected/982d3ca0-0055-4a1a-85ae-533ca695f992-kube-api-access-4wv7s\") pod \"nova-cell0-db-create-tvmxz\" (UID: \"982d3ca0-0055-4a1a-85ae-533ca695f992\") " pod="openstack/nova-cell0-db-create-tvmxz" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.816893 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-eb17-account-create-update-64z8s"] Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.818311 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-eb17-account-create-update-64z8s" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.821131 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.854978 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-eb17-account-create-update-64z8s"] Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.862171 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7f84efa-2483-4a30-9297-f60a74e88c75-operator-scripts\") pod \"nova-cell0-9d65-account-create-update-dkjm7\" (UID: \"f7f84efa-2483-4a30-9297-f60a74e88c75\") " pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.862246 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prfwl\" (UniqueName: \"kubernetes.io/projected/f7f84efa-2483-4a30-9297-f60a74e88c75-kube-api-access-prfwl\") pod \"nova-cell0-9d65-account-create-update-dkjm7\" (UID: \"f7f84efa-2483-4a30-9297-f60a74e88c75\") " pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.862271 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff8pz\" (UniqueName: \"kubernetes.io/projected/45658e80-85a4-4557-bc5d-85d86bb92f7f-kube-api-access-ff8pz\") pod \"nova-cell1-db-create-v24cr\" (UID: \"45658e80-85a4-4557-bc5d-85d86bb92f7f\") " pod="openstack/nova-cell1-db-create-v24cr" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.862306 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45658e80-85a4-4557-bc5d-85d86bb92f7f-operator-scripts\") pod \"nova-cell1-db-create-v24cr\" (UID: 
\"45658e80-85a4-4557-bc5d-85d86bb92f7f\") " pod="openstack/nova-cell1-db-create-v24cr" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.862345 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a419b6-12dc-427a-9481-c48f7e602d54-operator-scripts\") pod \"nova-cell1-eb17-account-create-update-64z8s\" (UID: \"32a419b6-12dc-427a-9481-c48f7e602d54\") " pod="openstack/nova-cell1-eb17-account-create-update-64z8s" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.862419 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smgwl\" (UniqueName: \"kubernetes.io/projected/32a419b6-12dc-427a-9481-c48f7e602d54-kube-api-access-smgwl\") pod \"nova-cell1-eb17-account-create-update-64z8s\" (UID: \"32a419b6-12dc-427a-9481-c48f7e602d54\") " pod="openstack/nova-cell1-eb17-account-create-update-64z8s" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.863211 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7f84efa-2483-4a30-9297-f60a74e88c75-operator-scripts\") pod \"nova-cell0-9d65-account-create-update-dkjm7\" (UID: \"f7f84efa-2483-4a30-9297-f60a74e88c75\") " pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.867085 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45658e80-85a4-4557-bc5d-85d86bb92f7f-operator-scripts\") pod \"nova-cell1-db-create-v24cr\" (UID: \"45658e80-85a4-4557-bc5d-85d86bb92f7f\") " pod="openstack/nova-cell1-db-create-v24cr" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.868166 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rnrfj" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.883628 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff8pz\" (UniqueName: \"kubernetes.io/projected/45658e80-85a4-4557-bc5d-85d86bb92f7f-kube-api-access-ff8pz\") pod \"nova-cell1-db-create-v24cr\" (UID: \"45658e80-85a4-4557-bc5d-85d86bb92f7f\") " pod="openstack/nova-cell1-db-create-v24cr" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.884217 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4df9-account-create-update-wlsw6" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.887823 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prfwl\" (UniqueName: \"kubernetes.io/projected/f7f84efa-2483-4a30-9297-f60a74e88c75-kube-api-access-prfwl\") pod \"nova-cell0-9d65-account-create-update-dkjm7\" (UID: \"f7f84efa-2483-4a30-9297-f60a74e88c75\") " pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.946079 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tvmxz" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.964189 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a419b6-12dc-427a-9481-c48f7e602d54-operator-scripts\") pod \"nova-cell1-eb17-account-create-update-64z8s\" (UID: \"32a419b6-12dc-427a-9481-c48f7e602d54\") " pod="openstack/nova-cell1-eb17-account-create-update-64z8s" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.964276 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smgwl\" (UniqueName: \"kubernetes.io/projected/32a419b6-12dc-427a-9481-c48f7e602d54-kube-api-access-smgwl\") pod \"nova-cell1-eb17-account-create-update-64z8s\" (UID: \"32a419b6-12dc-427a-9481-c48f7e602d54\") " pod="openstack/nova-cell1-eb17-account-create-update-64z8s" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.964870 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a419b6-12dc-427a-9481-c48f7e602d54-operator-scripts\") pod \"nova-cell1-eb17-account-create-update-64z8s\" (UID: \"32a419b6-12dc-427a-9481-c48f7e602d54\") " pod="openstack/nova-cell1-eb17-account-create-update-64z8s" Apr 06 12:19:33 crc kubenswrapper[4790]: I0406 12:19:33.971797 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-v24cr" Apr 06 12:19:34 crc kubenswrapper[4790]: I0406 12:19:34.008400 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smgwl\" (UniqueName: \"kubernetes.io/projected/32a419b6-12dc-427a-9481-c48f7e602d54-kube-api-access-smgwl\") pod \"nova-cell1-eb17-account-create-update-64z8s\" (UID: \"32a419b6-12dc-427a-9481-c48f7e602d54\") " pod="openstack/nova-cell1-eb17-account-create-update-64z8s" Apr 06 12:19:34 crc kubenswrapper[4790]: I0406 12:19:34.169395 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" Apr 06 12:19:34 crc kubenswrapper[4790]: I0406 12:19:34.198135 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-eb17-account-create-update-64z8s" Apr 06 12:19:34 crc kubenswrapper[4790]: I0406 12:19:34.761606 4790 generic.go:334] "Generic (PLEG): container finished" podID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerID="dc4cb891f5404c0701f7a786903b19e4e32c7132230af02cec5cb9bce3d0f065" exitCode=0 Apr 06 12:19:34 crc kubenswrapper[4790]: I0406 12:19:34.761650 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fbbdadb-2035-4420-945a-a18b0294a8da","Type":"ContainerDied","Data":"dc4cb891f5404c0701f7a786903b19e4e32c7132230af02cec5cb9bce3d0f065"} Apr 06 12:19:36 crc kubenswrapper[4790]: I0406 12:19:36.238075 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.676340 4790 scope.go:117] "RemoveContainer" containerID="625c69110057092c75a5b9f7a24594d2a63dfa0d5d8dd00096e932f7bcea9149" Apr 06 12:19:38 crc kubenswrapper[4790]: E0406 12:19:38.677206 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with 
CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(bbabffd5-cc02-4c40-adec-25c11548ed44)\"" pod="openstack/watcher-decision-engine-0" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.850464 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.862190 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-scripts\") pod \"5fbbdadb-2035-4420-945a-a18b0294a8da\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.862257 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tx2f\" (UniqueName: \"kubernetes.io/projected/5fbbdadb-2035-4420-945a-a18b0294a8da-kube-api-access-5tx2f\") pod \"5fbbdadb-2035-4420-945a-a18b0294a8da\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.862282 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fbbdadb-2035-4420-945a-a18b0294a8da-run-httpd\") pod \"5fbbdadb-2035-4420-945a-a18b0294a8da\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.862300 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fbbdadb-2035-4420-945a-a18b0294a8da-log-httpd\") pod \"5fbbdadb-2035-4420-945a-a18b0294a8da\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.862379 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-config-data\") pod \"5fbbdadb-2035-4420-945a-a18b0294a8da\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.862446 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-combined-ca-bundle\") pod \"5fbbdadb-2035-4420-945a-a18b0294a8da\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.862564 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-sg-core-conf-yaml\") pod \"5fbbdadb-2035-4420-945a-a18b0294a8da\" (UID: \"5fbbdadb-2035-4420-945a-a18b0294a8da\") " Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.874212 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fbbdadb-2035-4420-945a-a18b0294a8da-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5fbbdadb-2035-4420-945a-a18b0294a8da" (UID: "5fbbdadb-2035-4420-945a-a18b0294a8da"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.878592 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-scripts" (OuterVolumeSpecName: "scripts") pod "5fbbdadb-2035-4420-945a-a18b0294a8da" (UID: "5fbbdadb-2035-4420-945a-a18b0294a8da"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.879077 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fbbdadb-2035-4420-945a-a18b0294a8da-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5fbbdadb-2035-4420-945a-a18b0294a8da" (UID: "5fbbdadb-2035-4420-945a-a18b0294a8da"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.914148 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5fbbdadb-2035-4420-945a-a18b0294a8da" (UID: "5fbbdadb-2035-4420-945a-a18b0294a8da"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.915327 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fbbdadb-2035-4420-945a-a18b0294a8da-kube-api-access-5tx2f" (OuterVolumeSpecName: "kube-api-access-5tx2f") pod "5fbbdadb-2035-4420-945a-a18b0294a8da" (UID: "5fbbdadb-2035-4420-945a-a18b0294a8da"). InnerVolumeSpecName "kube-api-access-5tx2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.979635 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tx2f\" (UniqueName: \"kubernetes.io/projected/5fbbdadb-2035-4420-945a-a18b0294a8da-kube-api-access-5tx2f\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.979684 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fbbdadb-2035-4420-945a-a18b0294a8da-run-httpd\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.979695 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fbbdadb-2035-4420-945a-a18b0294a8da-log-httpd\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.979704 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.979712 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:38 crc kubenswrapper[4790]: W0406 12:19:38.993902 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf1c4265_b7eb_49d1_a71f_c4e5f0bcbfe8.slice/crio-c6b513cb9833f40eb6f4d49776b06e084d75f18a3946d5aa5fd6d8fe8cea65db WatchSource:0}: Error finding container c6b513cb9833f40eb6f4d49776b06e084d75f18a3946d5aa5fd6d8fe8cea65db: Status 404 returned error can't find the container with id c6b513cb9833f40eb6f4d49776b06e084d75f18a3946d5aa5fd6d8fe8cea65db Apr 06 12:19:38 crc kubenswrapper[4790]: I0406 12:19:38.994581 4790 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.008599 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rnrfj"] Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.029963 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tvmxz"] Apr 06 12:19:39 crc kubenswrapper[4790]: W0406 12:19:39.036473 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod982d3ca0_0055_4a1a_85ae_533ca695f992.slice/crio-37fc90dd43b9f6c989ede21be214b982746861ba84ad5288d5dc9d310938372e WatchSource:0}: Error finding container 37fc90dd43b9f6c989ede21be214b982746861ba84ad5288d5dc9d310938372e: Status 404 returned error can't find the container with id 37fc90dd43b9f6c989ede21be214b982746861ba84ad5288d5dc9d310938372e Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.063155 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-config-data" (OuterVolumeSpecName: "config-data") pod "5fbbdadb-2035-4420-945a-a18b0294a8da" (UID: "5fbbdadb-2035-4420-945a-a18b0294a8da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.080032 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fbbdadb-2035-4420-945a-a18b0294a8da" (UID: "5fbbdadb-2035-4420-945a-a18b0294a8da"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.083978 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-config\") pod \"aea04d0b-6f21-4671-9b77-bbbac755b073\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.084233 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pdjs\" (UniqueName: \"kubernetes.io/projected/aea04d0b-6f21-4671-9b77-bbbac755b073-kube-api-access-8pdjs\") pod \"aea04d0b-6f21-4671-9b77-bbbac755b073\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.084298 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-ovndb-tls-certs\") pod \"aea04d0b-6f21-4671-9b77-bbbac755b073\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.084335 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-combined-ca-bundle\") pod \"aea04d0b-6f21-4671-9b77-bbbac755b073\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.084381 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-httpd-config\") pod \"aea04d0b-6f21-4671-9b77-bbbac755b073\" (UID: \"aea04d0b-6f21-4671-9b77-bbbac755b073\") " Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.085142 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.085161 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbbdadb-2035-4420-945a-a18b0294a8da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.093223 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "aea04d0b-6f21-4671-9b77-bbbac755b073" (UID: "aea04d0b-6f21-4671-9b77-bbbac755b073"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.094223 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea04d0b-6f21-4671-9b77-bbbac755b073-kube-api-access-8pdjs" (OuterVolumeSpecName: "kube-api-access-8pdjs") pod "aea04d0b-6f21-4671-9b77-bbbac755b073" (UID: "aea04d0b-6f21-4671-9b77-bbbac755b073"). InnerVolumeSpecName "kube-api-access-8pdjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.176739 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-config" (OuterVolumeSpecName: "config") pod "aea04d0b-6f21-4671-9b77-bbbac755b073" (UID: "aea04d0b-6f21-4671-9b77-bbbac755b073"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.178465 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "aea04d0b-6f21-4671-9b77-bbbac755b073" (UID: "aea04d0b-6f21-4671-9b77-bbbac755b073"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.186752 4790 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.186919 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-httpd-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.186979 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.187033 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pdjs\" (UniqueName: \"kubernetes.io/projected/aea04d0b-6f21-4671-9b77-bbbac755b073-kube-api-access-8pdjs\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.187925 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aea04d0b-6f21-4671-9b77-bbbac755b073" (UID: "aea04d0b-6f21-4671-9b77-bbbac755b073"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:39 crc kubenswrapper[4790]: W0406 12:19:39.226702 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7f84efa_2483_4a30_9297_f60a74e88c75.slice/crio-057dca104bdc64201091cfd8450926109c34a5d22f78f9d16bb0c8bc328b8166 WatchSource:0}: Error finding container 057dca104bdc64201091cfd8450926109c34a5d22f78f9d16bb0c8bc328b8166: Status 404 returned error can't find the container with id 057dca104bdc64201091cfd8450926109c34a5d22f78f9d16bb0c8bc328b8166 Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.239379 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9d65-account-create-update-dkjm7"] Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.255513 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4df9-account-create-update-wlsw6"] Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.290162 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea04d0b-6f21-4671-9b77-bbbac755b073-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:39 crc kubenswrapper[4790]: W0406 12:19:39.312633 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24cee348_21ee_454e_942e_c689c059effa.slice/crio-659813a2a64e5d0e94afd2ec4ad7069d6c76aad45c24f56318f4170258a579f7 WatchSource:0}: Error finding container 659813a2a64e5d0e94afd2ec4ad7069d6c76aad45c24f56318f4170258a579f7: Status 404 returned error can't find the container with id 659813a2a64e5d0e94afd2ec4ad7069d6c76aad45c24f56318f4170258a579f7 Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.453909 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-v24cr"] Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.521540 4790 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-eb17-account-create-update-64z8s"] Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.604270 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7695db8cdc-vs5bx"] Apr 06 12:19:39 crc kubenswrapper[4790]: W0406 12:19:39.649571 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7922b939_a1e8_4c85_8eb0_fe3529f6469c.slice/crio-b7c7da44b680f1b6b00c7ad23add7eb25d175f8e46534bc74922d45978d3b93f WatchSource:0}: Error finding container b7c7da44b680f1b6b00c7ad23add7eb25d175f8e46534bc74922d45978d3b93f: Status 404 returned error can't find the container with id b7c7da44b680f1b6b00c7ad23add7eb25d175f8e46534bc74922d45978d3b93f Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.845628 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" event={"ID":"f7f84efa-2483-4a30-9297-f60a74e88c75","Type":"ContainerStarted","Data":"f5b3c2b9fa81166916164d1a6fea9db181b0d127ff422b62d263430737aef504"} Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.845669 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" event={"ID":"f7f84efa-2483-4a30-9297-f60a74e88c75","Type":"ContainerStarted","Data":"057dca104bdc64201091cfd8450926109c34a5d22f78f9d16bb0c8bc328b8166"} Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.854549 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4df9-account-create-update-wlsw6" event={"ID":"24cee348-21ee-454e-942e-c689c059effa","Type":"ContainerStarted","Data":"f41db75fa7be42824ce00db32c51b3b2bdf602d880b510f2a59f074ff7a391ec"} Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.854597 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4df9-account-create-update-wlsw6" 
event={"ID":"24cee348-21ee-454e-942e-c689c059effa","Type":"ContainerStarted","Data":"659813a2a64e5d0e94afd2ec4ad7069d6c76aad45c24f56318f4170258a579f7"} Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.857700 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7695db8cdc-vs5bx" event={"ID":"7922b939-a1e8-4c85-8eb0-fe3529f6469c","Type":"ContainerStarted","Data":"b7c7da44b680f1b6b00c7ad23add7eb25d175f8e46534bc74922d45978d3b93f"} Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.867836 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-v24cr" event={"ID":"45658e80-85a4-4557-bc5d-85d86bb92f7f","Type":"ContainerStarted","Data":"116a93fe41c128e27874406ecd64911ee9581afaa4bad4c9d03b36749f23efbc"} Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.881570 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" podStartSLOduration=6.8815515640000005 podStartE2EDuration="6.881551564s" podCreationTimestamp="2026-04-06 12:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:19:39.880574878 +0000 UTC m=+1358.868317744" watchObservedRunningTime="2026-04-06 12:19:39.881551564 +0000 UTC m=+1358.869294430" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.887191 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fb546dccb-rmb8d" event={"ID":"aea04d0b-6f21-4671-9b77-bbbac755b073","Type":"ContainerDied","Data":"1dc5444a9fd1d169a5cc3dd40949da1cf74aab60c15673b0c176c7a0527dc0e7"} Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.887252 4790 scope.go:117] "RemoveContainer" containerID="cb530553644b13d5011b4a5a55ffce32c787f90749f35bd6a2daf24ac75b4554" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.887412 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5fb546dccb-rmb8d" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.905706 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eb17-account-create-update-64z8s" event={"ID":"32a419b6-12dc-427a-9481-c48f7e602d54","Type":"ContainerStarted","Data":"2d5bd5c2fe9ba106f422d96ea20ad1c41bed2262351c7570053a75dba6083dbd"} Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.929458 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-4df9-account-create-update-wlsw6" podStartSLOduration=6.929435284 podStartE2EDuration="6.929435284s" podCreationTimestamp="2026-04-06 12:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:19:39.905745426 +0000 UTC m=+1358.893488292" watchObservedRunningTime="2026-04-06 12:19:39.929435284 +0000 UTC m=+1358.917178150" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.936120 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fbbdadb-2035-4420-945a-a18b0294a8da","Type":"ContainerDied","Data":"c0127ac2e84b8a5b3ba089a42a493470e03b861e20df392ae8f70655d5fa7a18"} Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.936277 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.952610 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1eea2e2a-d9c2-46e2-96a9-827fcf5a075f","Type":"ContainerStarted","Data":"82e2fa0918cf65bf15ae7afae45b1a32fadd46705bd0afc4380b06574fea3a7d"} Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.973256 4790 scope.go:117] "RemoveContainer" containerID="0a6b614aceab75ca4aa13e13b8603b058f587a20aa7333996f589a8a588baf69" Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.974560 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rnrfj" event={"ID":"df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8","Type":"ContainerStarted","Data":"c6b513cb9833f40eb6f4d49776b06e084d75f18a3946d5aa5fd6d8fe8cea65db"} Apr 06 12:19:39 crc kubenswrapper[4790]: I0406 12:19:39.979207 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tvmxz" event={"ID":"982d3ca0-0055-4a1a-85ae-533ca695f992","Type":"ContainerStarted","Data":"37fc90dd43b9f6c989ede21be214b982746861ba84ad5288d5dc9d310938372e"} Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.002878 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5fb546dccb-rmb8d"] Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.011191 4790 scope.go:117] "RemoveContainer" containerID="8ca0bad48d347eac84e3eded3083ec835836e0de3466a1fd7129da202ab47c76" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.020721 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5fb546dccb-rmb8d"] Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.033458 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.708882291 podStartE2EDuration="13.033441757s" podCreationTimestamp="2026-04-06 12:19:27 +0000 UTC" firstStartedPulling="2026-04-06 12:19:28.111991279 
+0000 UTC m=+1347.099734145" lastFinishedPulling="2026-04-06 12:19:38.436550745 +0000 UTC m=+1357.424293611" observedRunningTime="2026-04-06 12:19:39.970055379 +0000 UTC m=+1358.957798255" watchObservedRunningTime="2026-04-06 12:19:40.033441757 +0000 UTC m=+1359.021184623" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.042028 4790 scope.go:117] "RemoveContainer" containerID="84cc9fd78266aa9f4f1c674114170aa87e03e297e0dbaa9d11d16e5792698feb" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.049108 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.065950 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.070283 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-tvmxz" podStartSLOduration=7.070266329 podStartE2EDuration="7.070266329s" podCreationTimestamp="2026-04-06 12:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:19:40.002853033 +0000 UTC m=+1358.990595899" watchObservedRunningTime="2026-04-06 12:19:40.070266329 +0000 UTC m=+1359.058009195" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.076340 4790 scope.go:117] "RemoveContainer" containerID="dc4cb891f5404c0701f7a786903b19e4e32c7132230af02cec5cb9bce3d0f065" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.146626 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:40 crc kubenswrapper[4790]: E0406 12:19:40.147323 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="proxy-httpd" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.147336 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="proxy-httpd" Apr 06 12:19:40 crc kubenswrapper[4790]: E0406 12:19:40.147351 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea04d0b-6f21-4671-9b77-bbbac755b073" containerName="neutron-api" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.147359 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea04d0b-6f21-4671-9b77-bbbac755b073" containerName="neutron-api" Apr 06 12:19:40 crc kubenswrapper[4790]: E0406 12:19:40.147386 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="ceilometer-central-agent" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.147392 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="ceilometer-central-agent" Apr 06 12:19:40 crc kubenswrapper[4790]: E0406 12:19:40.147399 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="sg-core" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.147405 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="sg-core" Apr 06 12:19:40 crc kubenswrapper[4790]: E0406 12:19:40.147420 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="ceilometer-notification-agent" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.147426 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="ceilometer-notification-agent" Apr 06 12:19:40 crc kubenswrapper[4790]: E0406 12:19:40.147437 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea04d0b-6f21-4671-9b77-bbbac755b073" containerName="neutron-httpd" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.147443 4790 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="aea04d0b-6f21-4671-9b77-bbbac755b073" containerName="neutron-httpd" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.147600 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="sg-core" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.147612 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea04d0b-6f21-4671-9b77-bbbac755b073" containerName="neutron-httpd" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.147633 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea04d0b-6f21-4671-9b77-bbbac755b073" containerName="neutron-api" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.147640 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="ceilometer-central-agent" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.147649 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="proxy-httpd" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.147661 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" containerName="ceilometer-notification-agent" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.149593 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.162695 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.162744 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.195042 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.229934 4790 scope.go:117] "RemoveContainer" containerID="bafbdff85bdf83f589e0c1119e325977fcc91016504b0304123a33baafdd456a" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.332484 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-scripts\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.332552 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w866p\" (UniqueName: \"kubernetes.io/projected/547e5a08-1521-4550-a258-5c36b3d61fd6-kube-api-access-w866p\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.332582 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/547e5a08-1521-4550-a258-5c36b3d61fd6-log-httpd\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.332604 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-config-data\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.332621 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.332687 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.332747 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/547e5a08-1521-4550-a258-5c36b3d61fd6-run-httpd\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.441224 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-scripts\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.441305 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w866p\" (UniqueName: \"kubernetes.io/projected/547e5a08-1521-4550-a258-5c36b3d61fd6-kube-api-access-w866p\") pod \"ceilometer-0\" (UID: 
\"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.441336 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-config-data\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.441355 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/547e5a08-1521-4550-a258-5c36b3d61fd6-log-httpd\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.441376 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.441512 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.441579 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/547e5a08-1521-4550-a258-5c36b3d61fd6-run-httpd\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.442089 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/547e5a08-1521-4550-a258-5c36b3d61fd6-run-httpd\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.443354 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/547e5a08-1521-4550-a258-5c36b3d61fd6-log-httpd\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.451478 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-scripts\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.452055 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.454378 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-config-data\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.455694 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.469115 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w866p\" (UniqueName: \"kubernetes.io/projected/547e5a08-1521-4550-a258-5c36b3d61fd6-kube-api-access-w866p\") pod \"ceilometer-0\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.562337 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.989943 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7695db8cdc-vs5bx" event={"ID":"7922b939-a1e8-4c85-8eb0-fe3529f6469c","Type":"ContainerStarted","Data":"a26f23df24d9d34318f6398a5af2e814211ccaff3081736898d8e0de9861cfa6"} Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.990181 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7695db8cdc-vs5bx" event={"ID":"7922b939-a1e8-4c85-8eb0-fe3529f6469c","Type":"ContainerStarted","Data":"a19a900d957de7f2450ee698886892267b9153080fd28b9c42719bd5318c3cbe"} Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.991247 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.991277 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.992912 4790 generic.go:334] "Generic (PLEG): container finished" podID="32a419b6-12dc-427a-9481-c48f7e602d54" containerID="a6abd7c3e971f649178b72d283380648567082547a463a3d70656f319c38aa1e" exitCode=0 Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.993052 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eb17-account-create-update-64z8s" 
event={"ID":"32a419b6-12dc-427a-9481-c48f7e602d54","Type":"ContainerDied","Data":"a6abd7c3e971f649178b72d283380648567082547a463a3d70656f319c38aa1e"} Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.994462 4790 generic.go:334] "Generic (PLEG): container finished" podID="f7f84efa-2483-4a30-9297-f60a74e88c75" containerID="f5b3c2b9fa81166916164d1a6fea9db181b0d127ff422b62d263430737aef504" exitCode=0 Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.994529 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" event={"ID":"f7f84efa-2483-4a30-9297-f60a74e88c75","Type":"ContainerDied","Data":"f5b3c2b9fa81166916164d1a6fea9db181b0d127ff422b62d263430737aef504"} Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.996593 4790 generic.go:334] "Generic (PLEG): container finished" podID="45658e80-85a4-4557-bc5d-85d86bb92f7f" containerID="4fa0f11e13db269fea3ae81a854b65c857be89df17a32203cfac39908bcd84a5" exitCode=0 Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.996657 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-v24cr" event={"ID":"45658e80-85a4-4557-bc5d-85d86bb92f7f","Type":"ContainerDied","Data":"4fa0f11e13db269fea3ae81a854b65c857be89df17a32203cfac39908bcd84a5"} Apr 06 12:19:40 crc kubenswrapper[4790]: I0406 12:19:40.999951 4790 generic.go:334] "Generic (PLEG): container finished" podID="982d3ca0-0055-4a1a-85ae-533ca695f992" containerID="e71495423c8cb23700b72d79b089de3f471d5881c90fb89a79a65531941fe0af" exitCode=0 Apr 06 12:19:41 crc kubenswrapper[4790]: I0406 12:19:41.000032 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tvmxz" event={"ID":"982d3ca0-0055-4a1a-85ae-533ca695f992","Type":"ContainerDied","Data":"e71495423c8cb23700b72d79b089de3f471d5881c90fb89a79a65531941fe0af"} Apr 06 12:19:41 crc kubenswrapper[4790]: I0406 12:19:41.001554 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8" containerID="88eea0ac20de5a38e6001f6fb2b520abedf4aa0cd8cbce5aff3c09912ede7814" exitCode=0 Apr 06 12:19:41 crc kubenswrapper[4790]: I0406 12:19:41.001633 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rnrfj" event={"ID":"df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8","Type":"ContainerDied","Data":"88eea0ac20de5a38e6001f6fb2b520abedf4aa0cd8cbce5aff3c09912ede7814"} Apr 06 12:19:41 crc kubenswrapper[4790]: I0406 12:19:41.003161 4790 generic.go:334] "Generic (PLEG): container finished" podID="24cee348-21ee-454e-942e-c689c059effa" containerID="f41db75fa7be42824ce00db32c51b3b2bdf602d880b510f2a59f074ff7a391ec" exitCode=0 Apr 06 12:19:41 crc kubenswrapper[4790]: I0406 12:19:41.003213 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4df9-account-create-update-wlsw6" event={"ID":"24cee348-21ee-454e-942e-c689c059effa","Type":"ContainerDied","Data":"f41db75fa7be42824ce00db32c51b3b2bdf602d880b510f2a59f074ff7a391ec"} Apr 06 12:19:41 crc kubenswrapper[4790]: I0406 12:19:41.023167 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7695db8cdc-vs5bx" podStartSLOduration=8.023135642 podStartE2EDuration="8.023135642s" podCreationTimestamp="2026-04-06 12:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:19:41.017587324 +0000 UTC m=+1360.005330180" watchObservedRunningTime="2026-04-06 12:19:41.023135642 +0000 UTC m=+1360.010878528" Apr 06 12:19:41 crc kubenswrapper[4790]: I0406 12:19:41.172870 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:41 crc kubenswrapper[4790]: I0406 12:19:41.689974 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fbbdadb-2035-4420-945a-a18b0294a8da" path="/var/lib/kubelet/pods/5fbbdadb-2035-4420-945a-a18b0294a8da/volumes" Apr 06 
12:19:41 crc kubenswrapper[4790]: I0406 12:19:41.691252 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea04d0b-6f21-4671-9b77-bbbac755b073" path="/var/lib/kubelet/pods/aea04d0b-6f21-4671-9b77-bbbac755b073/volumes" Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.020734 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"547e5a08-1521-4550-a258-5c36b3d61fd6","Type":"ContainerStarted","Data":"fa3c873addaad0b7c61cbf1b2dd256f342c9214f2ef045919d1fe60da7e914db"} Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.020786 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"547e5a08-1521-4550-a258-5c36b3d61fd6","Type":"ContainerStarted","Data":"743e3c525c572f27fdc61a2bf133e4de2c78b45f482cd503dd24775b85432d90"} Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.020801 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"547e5a08-1521-4550-a258-5c36b3d61fd6","Type":"ContainerStarted","Data":"4b2d037d32f29f66cf3fa63746a9255f8d71a6d2d6a41416770039eadaf3e8e1"} Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.696926 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rnrfj" Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.797790 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f82m\" (UniqueName: \"kubernetes.io/projected/df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8-kube-api-access-8f82m\") pod \"df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8\" (UID: \"df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8\") " Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.798182 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8-operator-scripts\") pod \"df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8\" (UID: \"df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8\") " Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.801556 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8" (UID: "df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.810080 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8-kube-api-access-8f82m" (OuterVolumeSpecName: "kube-api-access-8f82m") pod "df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8" (UID: "df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8"). InnerVolumeSpecName "kube-api-access-8f82m". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.873348 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-eb17-account-create-update-64z8s" Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.890488 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.899516 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4df9-account-create-update-wlsw6" Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.900280 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.900322 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f82m\" (UniqueName: \"kubernetes.io/projected/df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8-kube-api-access-8f82m\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.917580 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-v24cr" Apr 06 12:19:42 crc kubenswrapper[4790]: I0406 12:19:42.921566 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tvmxz" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.002180 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff8pz\" (UniqueName: \"kubernetes.io/projected/45658e80-85a4-4557-bc5d-85d86bb92f7f-kube-api-access-ff8pz\") pod \"45658e80-85a4-4557-bc5d-85d86bb92f7f\" (UID: \"45658e80-85a4-4557-bc5d-85d86bb92f7f\") " Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.002247 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24cee348-21ee-454e-942e-c689c059effa-operator-scripts\") pod \"24cee348-21ee-454e-942e-c689c059effa\" (UID: \"24cee348-21ee-454e-942e-c689c059effa\") " Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.002274 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr85m\" (UniqueName: \"kubernetes.io/projected/24cee348-21ee-454e-942e-c689c059effa-kube-api-access-hr85m\") pod \"24cee348-21ee-454e-942e-c689c059effa\" (UID: \"24cee348-21ee-454e-942e-c689c059effa\") " Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.002308 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/982d3ca0-0055-4a1a-85ae-533ca695f992-operator-scripts\") pod \"982d3ca0-0055-4a1a-85ae-533ca695f992\" (UID: \"982d3ca0-0055-4a1a-85ae-533ca695f992\") " Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.002365 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a419b6-12dc-427a-9481-c48f7e602d54-operator-scripts\") pod \"32a419b6-12dc-427a-9481-c48f7e602d54\" (UID: \"32a419b6-12dc-427a-9481-c48f7e602d54\") " Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.002401 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7f84efa-2483-4a30-9297-f60a74e88c75-operator-scripts\") pod \"f7f84efa-2483-4a30-9297-f60a74e88c75\" (UID: \"f7f84efa-2483-4a30-9297-f60a74e88c75\") " Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.002453 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smgwl\" (UniqueName: \"kubernetes.io/projected/32a419b6-12dc-427a-9481-c48f7e602d54-kube-api-access-smgwl\") pod \"32a419b6-12dc-427a-9481-c48f7e602d54\" (UID: \"32a419b6-12dc-427a-9481-c48f7e602d54\") " Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.002508 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prfwl\" (UniqueName: \"kubernetes.io/projected/f7f84efa-2483-4a30-9297-f60a74e88c75-kube-api-access-prfwl\") pod \"f7f84efa-2483-4a30-9297-f60a74e88c75\" (UID: \"f7f84efa-2483-4a30-9297-f60a74e88c75\") " Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.002576 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wv7s\" (UniqueName: \"kubernetes.io/projected/982d3ca0-0055-4a1a-85ae-533ca695f992-kube-api-access-4wv7s\") pod \"982d3ca0-0055-4a1a-85ae-533ca695f992\" (UID: \"982d3ca0-0055-4a1a-85ae-533ca695f992\") " Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.002653 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45658e80-85a4-4557-bc5d-85d86bb92f7f-operator-scripts\") pod \"45658e80-85a4-4557-bc5d-85d86bb92f7f\" (UID: \"45658e80-85a4-4557-bc5d-85d86bb92f7f\") " Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.003659 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45658e80-85a4-4557-bc5d-85d86bb92f7f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"45658e80-85a4-4557-bc5d-85d86bb92f7f" (UID: "45658e80-85a4-4557-bc5d-85d86bb92f7f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.004433 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a419b6-12dc-427a-9481-c48f7e602d54-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32a419b6-12dc-427a-9481-c48f7e602d54" (UID: "32a419b6-12dc-427a-9481-c48f7e602d54"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.004705 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/982d3ca0-0055-4a1a-85ae-533ca695f992-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "982d3ca0-0055-4a1a-85ae-533ca695f992" (UID: "982d3ca0-0055-4a1a-85ae-533ca695f992"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.004945 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24cee348-21ee-454e-942e-c689c059effa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24cee348-21ee-454e-942e-c689c059effa" (UID: "24cee348-21ee-454e-942e-c689c059effa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.005169 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7f84efa-2483-4a30-9297-f60a74e88c75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7f84efa-2483-4a30-9297-f60a74e88c75" (UID: "f7f84efa-2483-4a30-9297-f60a74e88c75"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.007388 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24cee348-21ee-454e-942e-c689c059effa-kube-api-access-hr85m" (OuterVolumeSpecName: "kube-api-access-hr85m") pod "24cee348-21ee-454e-942e-c689c059effa" (UID: "24cee348-21ee-454e-942e-c689c059effa"). InnerVolumeSpecName "kube-api-access-hr85m". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.011617 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45658e80-85a4-4557-bc5d-85d86bb92f7f-kube-api-access-ff8pz" (OuterVolumeSpecName: "kube-api-access-ff8pz") pod "45658e80-85a4-4557-bc5d-85d86bb92f7f" (UID: "45658e80-85a4-4557-bc5d-85d86bb92f7f"). InnerVolumeSpecName "kube-api-access-ff8pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.011751 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7f84efa-2483-4a30-9297-f60a74e88c75-kube-api-access-prfwl" (OuterVolumeSpecName: "kube-api-access-prfwl") pod "f7f84efa-2483-4a30-9297-f60a74e88c75" (UID: "f7f84efa-2483-4a30-9297-f60a74e88c75"). InnerVolumeSpecName "kube-api-access-prfwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.012250 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a419b6-12dc-427a-9481-c48f7e602d54-kube-api-access-smgwl" (OuterVolumeSpecName: "kube-api-access-smgwl") pod "32a419b6-12dc-427a-9481-c48f7e602d54" (UID: "32a419b6-12dc-427a-9481-c48f7e602d54"). InnerVolumeSpecName "kube-api-access-smgwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.029002 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/982d3ca0-0055-4a1a-85ae-533ca695f992-kube-api-access-4wv7s" (OuterVolumeSpecName: "kube-api-access-4wv7s") pod "982d3ca0-0055-4a1a-85ae-533ca695f992" (UID: "982d3ca0-0055-4a1a-85ae-533ca695f992"). InnerVolumeSpecName "kube-api-access-4wv7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.048531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rnrfj" event={"ID":"df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8","Type":"ContainerDied","Data":"c6b513cb9833f40eb6f4d49776b06e084d75f18a3946d5aa5fd6d8fe8cea65db"} Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.048580 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6b513cb9833f40eb6f4d49776b06e084d75f18a3946d5aa5fd6d8fe8cea65db" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.048653 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rnrfj" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.058375 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eb17-account-create-update-64z8s" event={"ID":"32a419b6-12dc-427a-9481-c48f7e602d54","Type":"ContainerDied","Data":"2d5bd5c2fe9ba106f422d96ea20ad1c41bed2262351c7570053a75dba6083dbd"} Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.058413 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d5bd5c2fe9ba106f422d96ea20ad1c41bed2262351c7570053a75dba6083dbd" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.058467 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-eb17-account-create-update-64z8s" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.064335 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" event={"ID":"f7f84efa-2483-4a30-9297-f60a74e88c75","Type":"ContainerDied","Data":"057dca104bdc64201091cfd8450926109c34a5d22f78f9d16bb0c8bc328b8166"} Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.064351 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9d65-account-create-update-dkjm7" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.064365 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="057dca104bdc64201091cfd8450926109c34a5d22f78f9d16bb0c8bc328b8166" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.066609 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4df9-account-create-update-wlsw6" event={"ID":"24cee348-21ee-454e-942e-c689c059effa","Type":"ContainerDied","Data":"659813a2a64e5d0e94afd2ec4ad7069d6c76aad45c24f56318f4170258a579f7"} Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.066631 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="659813a2a64e5d0e94afd2ec4ad7069d6c76aad45c24f56318f4170258a579f7" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.066668 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4df9-account-create-update-wlsw6" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.072498 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"547e5a08-1521-4550-a258-5c36b3d61fd6","Type":"ContainerStarted","Data":"a6323a547ffb77b8ebf65b4cf0d3d0f33c8de8f2b9cb25da5ea0b8e7df63daf9"} Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.074299 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-v24cr" event={"ID":"45658e80-85a4-4557-bc5d-85d86bb92f7f","Type":"ContainerDied","Data":"116a93fe41c128e27874406ecd64911ee9581afaa4bad4c9d03b36749f23efbc"} Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.074327 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="116a93fe41c128e27874406ecd64911ee9581afaa4bad4c9d03b36749f23efbc" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.074374 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-v24cr" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.077572 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tvmxz" event={"ID":"982d3ca0-0055-4a1a-85ae-533ca695f992","Type":"ContainerDied","Data":"37fc90dd43b9f6c989ede21be214b982746861ba84ad5288d5dc9d310938372e"} Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.077598 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37fc90dd43b9f6c989ede21be214b982746861ba84ad5288d5dc9d310938372e" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.077721 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-tvmxz" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.105193 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45658e80-85a4-4557-bc5d-85d86bb92f7f-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.105228 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff8pz\" (UniqueName: \"kubernetes.io/projected/45658e80-85a4-4557-bc5d-85d86bb92f7f-kube-api-access-ff8pz\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.105243 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24cee348-21ee-454e-942e-c689c059effa-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.105255 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr85m\" (UniqueName: \"kubernetes.io/projected/24cee348-21ee-454e-942e-c689c059effa-kube-api-access-hr85m\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.105269 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/982d3ca0-0055-4a1a-85ae-533ca695f992-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.105279 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a419b6-12dc-427a-9481-c48f7e602d54-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.105288 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7f84efa-2483-4a30-9297-f60a74e88c75-operator-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 
12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.105297 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smgwl\" (UniqueName: \"kubernetes.io/projected/32a419b6-12dc-427a-9481-c48f7e602d54-kube-api-access-smgwl\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.105305 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prfwl\" (UniqueName: \"kubernetes.io/projected/f7f84efa-2483-4a30-9297-f60a74e88c75-kube-api-access-prfwl\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.105314 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wv7s\" (UniqueName: \"kubernetes.io/projected/982d3ca0-0055-4a1a-85ae-533ca695f992-kube-api-access-4wv7s\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:43 crc kubenswrapper[4790]: I0406 12:19:43.939385 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:44 crc kubenswrapper[4790]: I0406 12:19:44.317528 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Apr 06 12:19:44 crc kubenswrapper[4790]: I0406 12:19:44.317646 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Apr 06 12:19:44 crc kubenswrapper[4790]: I0406 12:19:44.318453 4790 scope.go:117] "RemoveContainer" containerID="625c69110057092c75a5b9f7a24594d2a63dfa0d5d8dd00096e932f7bcea9149" Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.749039 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.871577 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r224v\" (UniqueName: \"kubernetes.io/projected/6cf13bc8-7964-446c-8b42-f52e62da1ded-kube-api-access-r224v\") pod \"6cf13bc8-7964-446c-8b42-f52e62da1ded\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.871625 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-combined-ca-bundle\") pod \"6cf13bc8-7964-446c-8b42-f52e62da1ded\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.871737 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-config-data\") pod \"6cf13bc8-7964-446c-8b42-f52e62da1ded\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.871807 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-config-data-custom\") pod \"6cf13bc8-7964-446c-8b42-f52e62da1ded\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.871891 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cf13bc8-7964-446c-8b42-f52e62da1ded-logs\") pod \"6cf13bc8-7964-446c-8b42-f52e62da1ded\" (UID: \"6cf13bc8-7964-446c-8b42-f52e62da1ded\") " Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.873641 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6cf13bc8-7964-446c-8b42-f52e62da1ded-logs" (OuterVolumeSpecName: "logs") pod "6cf13bc8-7964-446c-8b42-f52e62da1ded" (UID: "6cf13bc8-7964-446c-8b42-f52e62da1ded"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.877972 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6cf13bc8-7964-446c-8b42-f52e62da1ded" (UID: "6cf13bc8-7964-446c-8b42-f52e62da1ded"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.913063 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf13bc8-7964-446c-8b42-f52e62da1ded-kube-api-access-r224v" (OuterVolumeSpecName: "kube-api-access-r224v") pod "6cf13bc8-7964-446c-8b42-f52e62da1ded" (UID: "6cf13bc8-7964-446c-8b42-f52e62da1ded"). InnerVolumeSpecName "kube-api-access-r224v". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.949653 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cf13bc8-7964-446c-8b42-f52e62da1ded" (UID: "6cf13bc8-7964-446c-8b42-f52e62da1ded"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.952549 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-config-data" (OuterVolumeSpecName: "config-data") pod "6cf13bc8-7964-446c-8b42-f52e62da1ded" (UID: "6cf13bc8-7964-446c-8b42-f52e62da1ded"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.974686 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r224v\" (UniqueName: \"kubernetes.io/projected/6cf13bc8-7964-446c-8b42-f52e62da1ded-kube-api-access-r224v\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.974704 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.974712 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.974720 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cf13bc8-7964-446c-8b42-f52e62da1ded-config-data-custom\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:45 crc kubenswrapper[4790]: I0406 12:19:45.974732 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cf13bc8-7964-446c-8b42-f52e62da1ded-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.059952 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.075808 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k4bb\" (UniqueName: \"kubernetes.io/projected/54b5a9a1-f344-4b6c-9001-87271da23bdc-kube-api-access-8k4bb\") pod \"54b5a9a1-f344-4b6c-9001-87271da23bdc\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.075873 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-combined-ca-bundle\") pod \"54b5a9a1-f344-4b6c-9001-87271da23bdc\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.075904 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54b5a9a1-f344-4b6c-9001-87271da23bdc-logs\") pod \"54b5a9a1-f344-4b6c-9001-87271da23bdc\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.075944 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-config-data-custom\") pod \"54b5a9a1-f344-4b6c-9001-87271da23bdc\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.075993 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-config-data\") pod \"54b5a9a1-f344-4b6c-9001-87271da23bdc\" (UID: \"54b5a9a1-f344-4b6c-9001-87271da23bdc\") " Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.077454 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/54b5a9a1-f344-4b6c-9001-87271da23bdc-logs" (OuterVolumeSpecName: "logs") pod "54b5a9a1-f344-4b6c-9001-87271da23bdc" (UID: "54b5a9a1-f344-4b6c-9001-87271da23bdc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.080777 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b5a9a1-f344-4b6c-9001-87271da23bdc-kube-api-access-8k4bb" (OuterVolumeSpecName: "kube-api-access-8k4bb") pod "54b5a9a1-f344-4b6c-9001-87271da23bdc" (UID: "54b5a9a1-f344-4b6c-9001-87271da23bdc"). InnerVolumeSpecName "kube-api-access-8k4bb". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.081616 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "54b5a9a1-f344-4b6c-9001-87271da23bdc" (UID: "54b5a9a1-f344-4b6c-9001-87271da23bdc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.147770 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-config-data" (OuterVolumeSpecName: "config-data") pod "54b5a9a1-f344-4b6c-9001-87271da23bdc" (UID: "54b5a9a1-f344-4b6c-9001-87271da23bdc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.158994 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54b5a9a1-f344-4b6c-9001-87271da23bdc" (UID: "54b5a9a1-f344-4b6c-9001-87271da23bdc"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.178334 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.178375 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k4bb\" (UniqueName: \"kubernetes.io/projected/54b5a9a1-f344-4b6c-9001-87271da23bdc-kube-api-access-8k4bb\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.178390 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.178423 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54b5a9a1-f344-4b6c-9001-87271da23bdc-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.178434 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54b5a9a1-f344-4b6c-9001-87271da23bdc-config-data-custom\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.179985 4790 generic.go:334] "Generic (PLEG): container finished" podID="6cf13bc8-7964-446c-8b42-f52e62da1ded" containerID="6496814d6c8758255405629491043096ea09af755f8e76c5cf49c3d71cc5926e" exitCode=137 Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.180064 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-67668d4bd9-69x7l" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.180076 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67668d4bd9-69x7l" event={"ID":"6cf13bc8-7964-446c-8b42-f52e62da1ded","Type":"ContainerDied","Data":"6496814d6c8758255405629491043096ea09af755f8e76c5cf49c3d71cc5926e"} Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.180110 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-67668d4bd9-69x7l" event={"ID":"6cf13bc8-7964-446c-8b42-f52e62da1ded","Type":"ContainerDied","Data":"806288bce679a35ae5d8dae90f91929bc13e97a8b71eb078b250a8768b1cb64d"} Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.180134 4790 scope.go:117] "RemoveContainer" containerID="6496814d6c8758255405629491043096ea09af755f8e76c5cf49c3d71cc5926e" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.190731 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"547e5a08-1521-4550-a258-5c36b3d61fd6","Type":"ContainerStarted","Data":"8375b7b663468c08c1cc2e457c6d471c495128f927f247e768d0a14d14bbd558"} Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.191012 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="ceilometer-central-agent" containerID="cri-o://743e3c525c572f27fdc61a2bf133e4de2c78b45f482cd503dd24775b85432d90" gracePeriod=30 Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.194087 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="sg-core" containerID="cri-o://a6323a547ffb77b8ebf65b4cf0d3d0f33c8de8f2b9cb25da5ea0b8e7df63daf9" gracePeriod=30 Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.194220 4790 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/ceilometer-0" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="proxy-httpd" containerID="cri-o://8375b7b663468c08c1cc2e457c6d471c495128f927f247e768d0a14d14bbd558" gracePeriod=30 Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.194288 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="ceilometer-notification-agent" containerID="cri-o://fa3c873addaad0b7c61cbf1b2dd256f342c9214f2ef045919d1fe60da7e914db" gracePeriod=30 Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.197399 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.204949 4790 generic.go:334] "Generic (PLEG): container finished" podID="54b5a9a1-f344-4b6c-9001-87271da23bdc" containerID="d26bed5812c09c9a350077d55cf0de483779ac0c690af0e50b9abaf87889478d" exitCode=137 Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.205048 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" event={"ID":"54b5a9a1-f344-4b6c-9001-87271da23bdc","Type":"ContainerDied","Data":"d26bed5812c09c9a350077d55cf0de483779ac0c690af0e50b9abaf87889478d"} Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.205080 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" event={"ID":"54b5a9a1-f344-4b6c-9001-87271da23bdc","Type":"ContainerDied","Data":"314ead17ba5edf2d8a905ff75d594dbb04b38d259150e7bdff190d26dcb2a9e2"} Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.205156 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-78f48bbdb8-86kwv" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.219536 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"bbabffd5-cc02-4c40-adec-25c11548ed44","Type":"ContainerStarted","Data":"c6f0a86d90b7a2f0c0b3e361d551993ba37e5efa12ce4db035f185d73e7fa791"} Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.223009 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.117091613 podStartE2EDuration="7.222973997s" podCreationTimestamp="2026-04-06 12:19:39 +0000 UTC" firstStartedPulling="2026-04-06 12:19:41.144202694 +0000 UTC m=+1360.131945560" lastFinishedPulling="2026-04-06 12:19:45.250085078 +0000 UTC m=+1364.237827944" observedRunningTime="2026-04-06 12:19:46.217423499 +0000 UTC m=+1365.205166365" watchObservedRunningTime="2026-04-06 12:19:46.222973997 +0000 UTC m=+1365.210716863" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.229983 4790 scope.go:117] "RemoveContainer" containerID="03da9387e88a7918c3ac8e4fa6b38006695512afd8a8ff0033b81c2461d5234b" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.253287 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-67668d4bd9-69x7l"] Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.263890 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-67668d4bd9-69x7l"] Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.280173 4790 scope.go:117] "RemoveContainer" containerID="6496814d6c8758255405629491043096ea09af755f8e76c5cf49c3d71cc5926e" Apr 06 12:19:46 crc kubenswrapper[4790]: E0406 12:19:46.280854 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6496814d6c8758255405629491043096ea09af755f8e76c5cf49c3d71cc5926e\": container with ID starting with 
6496814d6c8758255405629491043096ea09af755f8e76c5cf49c3d71cc5926e not found: ID does not exist" containerID="6496814d6c8758255405629491043096ea09af755f8e76c5cf49c3d71cc5926e" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.280902 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6496814d6c8758255405629491043096ea09af755f8e76c5cf49c3d71cc5926e"} err="failed to get container status \"6496814d6c8758255405629491043096ea09af755f8e76c5cf49c3d71cc5926e\": rpc error: code = NotFound desc = could not find container \"6496814d6c8758255405629491043096ea09af755f8e76c5cf49c3d71cc5926e\": container with ID starting with 6496814d6c8758255405629491043096ea09af755f8e76c5cf49c3d71cc5926e not found: ID does not exist" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.280930 4790 scope.go:117] "RemoveContainer" containerID="03da9387e88a7918c3ac8e4fa6b38006695512afd8a8ff0033b81c2461d5234b" Apr 06 12:19:46 crc kubenswrapper[4790]: E0406 12:19:46.283581 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03da9387e88a7918c3ac8e4fa6b38006695512afd8a8ff0033b81c2461d5234b\": container with ID starting with 03da9387e88a7918c3ac8e4fa6b38006695512afd8a8ff0033b81c2461d5234b not found: ID does not exist" containerID="03da9387e88a7918c3ac8e4fa6b38006695512afd8a8ff0033b81c2461d5234b" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.283759 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03da9387e88a7918c3ac8e4fa6b38006695512afd8a8ff0033b81c2461d5234b"} err="failed to get container status \"03da9387e88a7918c3ac8e4fa6b38006695512afd8a8ff0033b81c2461d5234b\": rpc error: code = NotFound desc = could not find container \"03da9387e88a7918c3ac8e4fa6b38006695512afd8a8ff0033b81c2461d5234b\": container with ID starting with 03da9387e88a7918c3ac8e4fa6b38006695512afd8a8ff0033b81c2461d5234b not found: ID does not 
exist" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.283799 4790 scope.go:117] "RemoveContainer" containerID="d26bed5812c09c9a350077d55cf0de483779ac0c690af0e50b9abaf87889478d" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.317510 4790 scope.go:117] "RemoveContainer" containerID="02a0112f07904cd9f6767f4f9bc3aa190cdca19a0241653b6e970c64ef219fdb" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.320074 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-78f48bbdb8-86kwv"] Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.331301 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-78f48bbdb8-86kwv"] Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.348059 4790 scope.go:117] "RemoveContainer" containerID="d26bed5812c09c9a350077d55cf0de483779ac0c690af0e50b9abaf87889478d" Apr 06 12:19:46 crc kubenswrapper[4790]: E0406 12:19:46.349980 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d26bed5812c09c9a350077d55cf0de483779ac0c690af0e50b9abaf87889478d\": container with ID starting with d26bed5812c09c9a350077d55cf0de483779ac0c690af0e50b9abaf87889478d not found: ID does not exist" containerID="d26bed5812c09c9a350077d55cf0de483779ac0c690af0e50b9abaf87889478d" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.350036 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d26bed5812c09c9a350077d55cf0de483779ac0c690af0e50b9abaf87889478d"} err="failed to get container status \"d26bed5812c09c9a350077d55cf0de483779ac0c690af0e50b9abaf87889478d\": rpc error: code = NotFound desc = could not find container \"d26bed5812c09c9a350077d55cf0de483779ac0c690af0e50b9abaf87889478d\": container with ID starting with d26bed5812c09c9a350077d55cf0de483779ac0c690af0e50b9abaf87889478d not found: ID does not exist" Apr 06 12:19:46 crc 
kubenswrapper[4790]: I0406 12:19:46.350066 4790 scope.go:117] "RemoveContainer" containerID="02a0112f07904cd9f6767f4f9bc3aa190cdca19a0241653b6e970c64ef219fdb" Apr 06 12:19:46 crc kubenswrapper[4790]: E0406 12:19:46.355008 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02a0112f07904cd9f6767f4f9bc3aa190cdca19a0241653b6e970c64ef219fdb\": container with ID starting with 02a0112f07904cd9f6767f4f9bc3aa190cdca19a0241653b6e970c64ef219fdb not found: ID does not exist" containerID="02a0112f07904cd9f6767f4f9bc3aa190cdca19a0241653b6e970c64ef219fdb" Apr 06 12:19:46 crc kubenswrapper[4790]: I0406 12:19:46.355062 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a0112f07904cd9f6767f4f9bc3aa190cdca19a0241653b6e970c64ef219fdb"} err="failed to get container status \"02a0112f07904cd9f6767f4f9bc3aa190cdca19a0241653b6e970c64ef219fdb\": rpc error: code = NotFound desc = could not find container \"02a0112f07904cd9f6767f4f9bc3aa190cdca19a0241653b6e970c64ef219fdb\": container with ID starting with 02a0112f07904cd9f6767f4f9bc3aa190cdca19a0241653b6e970c64ef219fdb not found: ID does not exist" Apr 06 12:19:47 crc kubenswrapper[4790]: I0406 12:19:47.247247 4790 generic.go:334] "Generic (PLEG): container finished" podID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerID="8375b7b663468c08c1cc2e457c6d471c495128f927f247e768d0a14d14bbd558" exitCode=0 Apr 06 12:19:47 crc kubenswrapper[4790]: I0406 12:19:47.247530 4790 generic.go:334] "Generic (PLEG): container finished" podID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerID="a6323a547ffb77b8ebf65b4cf0d3d0f33c8de8f2b9cb25da5ea0b8e7df63daf9" exitCode=2 Apr 06 12:19:47 crc kubenswrapper[4790]: I0406 12:19:47.247539 4790 generic.go:334] "Generic (PLEG): container finished" podID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerID="fa3c873addaad0b7c61cbf1b2dd256f342c9214f2ef045919d1fe60da7e914db" exitCode=0 Apr 06 
12:19:47 crc kubenswrapper[4790]: I0406 12:19:47.247639 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"547e5a08-1521-4550-a258-5c36b3d61fd6","Type":"ContainerDied","Data":"8375b7b663468c08c1cc2e457c6d471c495128f927f247e768d0a14d14bbd558"} Apr 06 12:19:47 crc kubenswrapper[4790]: I0406 12:19:47.247668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"547e5a08-1521-4550-a258-5c36b3d61fd6","Type":"ContainerDied","Data":"a6323a547ffb77b8ebf65b4cf0d3d0f33c8de8f2b9cb25da5ea0b8e7df63daf9"} Apr 06 12:19:47 crc kubenswrapper[4790]: I0406 12:19:47.247678 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"547e5a08-1521-4550-a258-5c36b3d61fd6","Type":"ContainerDied","Data":"fa3c873addaad0b7c61cbf1b2dd256f342c9214f2ef045919d1fe60da7e914db"} Apr 06 12:19:47 crc kubenswrapper[4790]: I0406 12:19:47.687623 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54b5a9a1-f344-4b6c-9001-87271da23bdc" path="/var/lib/kubelet/pods/54b5a9a1-f344-4b6c-9001-87271da23bdc/volumes" Apr 06 12:19:47 crc kubenswrapper[4790]: I0406 12:19:47.692083 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cf13bc8-7964-446c-8b42-f52e62da1ded" path="/var/lib/kubelet/pods/6cf13bc8-7964-446c-8b42-f52e62da1ded/volumes" Apr 06 12:19:47 crc kubenswrapper[4790]: I0406 12:19:47.864170 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.023012 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hxhm\" (UniqueName: \"kubernetes.io/projected/e618a3e1-5205-4db2-b17d-be3a2aef40b0-kube-api-access-5hxhm\") pod \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.024037 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-combined-ca-bundle\") pod \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.024493 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-config-data\") pod \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.024545 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e618a3e1-5205-4db2-b17d-be3a2aef40b0-etc-machine-id\") pod \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.024599 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-scripts\") pod \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.024684 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/e618a3e1-5205-4db2-b17d-be3a2aef40b0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e618a3e1-5205-4db2-b17d-be3a2aef40b0" (UID: "e618a3e1-5205-4db2-b17d-be3a2aef40b0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.025135 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-config-data-custom\") pod \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.025481 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e618a3e1-5205-4db2-b17d-be3a2aef40b0-logs\") pod \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\" (UID: \"e618a3e1-5205-4db2-b17d-be3a2aef40b0\") " Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.025774 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e618a3e1-5205-4db2-b17d-be3a2aef40b0-logs" (OuterVolumeSpecName: "logs") pod "e618a3e1-5205-4db2-b17d-be3a2aef40b0" (UID: "e618a3e1-5205-4db2-b17d-be3a2aef40b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.029022 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e618a3e1-5205-4db2-b17d-be3a2aef40b0-kube-api-access-5hxhm" (OuterVolumeSpecName: "kube-api-access-5hxhm") pod "e618a3e1-5205-4db2-b17d-be3a2aef40b0" (UID: "e618a3e1-5205-4db2-b17d-be3a2aef40b0"). InnerVolumeSpecName "kube-api-access-5hxhm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.029170 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e618a3e1-5205-4db2-b17d-be3a2aef40b0-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.029205 4790 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e618a3e1-5205-4db2-b17d-be3a2aef40b0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.046223 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e618a3e1-5205-4db2-b17d-be3a2aef40b0" (UID: "e618a3e1-5205-4db2-b17d-be3a2aef40b0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.046295 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-scripts" (OuterVolumeSpecName: "scripts") pod "e618a3e1-5205-4db2-b17d-be3a2aef40b0" (UID: "e618a3e1-5205-4db2-b17d-be3a2aef40b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.063986 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e618a3e1-5205-4db2-b17d-be3a2aef40b0" (UID: "e618a3e1-5205-4db2-b17d-be3a2aef40b0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.096313 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-config-data" (OuterVolumeSpecName: "config-data") pod "e618a3e1-5205-4db2-b17d-be3a2aef40b0" (UID: "e618a3e1-5205-4db2-b17d-be3a2aef40b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.130537 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-config-data-custom\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.130565 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hxhm\" (UniqueName: \"kubernetes.io/projected/e618a3e1-5205-4db2-b17d-be3a2aef40b0-kube-api-access-5hxhm\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.130576 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.130585 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.130593 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e618a3e1-5205-4db2-b17d-be3a2aef40b0-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.263081 4790 generic.go:334] "Generic (PLEG): container finished" podID="e618a3e1-5205-4db2-b17d-be3a2aef40b0" 
containerID="f497981f5f78f4487377b1be69218abab9886d404beac3e406a239f2fbb71703" exitCode=137 Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.263139 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.263168 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e618a3e1-5205-4db2-b17d-be3a2aef40b0","Type":"ContainerDied","Data":"f497981f5f78f4487377b1be69218abab9886d404beac3e406a239f2fbb71703"} Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.263999 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e618a3e1-5205-4db2-b17d-be3a2aef40b0","Type":"ContainerDied","Data":"1b2285a3a80ee49fec32a8ad0133af357b09090b866746f21b3997cab05ef521"} Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.264026 4790 scope.go:117] "RemoveContainer" containerID="f497981f5f78f4487377b1be69218abab9886d404beac3e406a239f2fbb71703" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.302715 4790 scope.go:117] "RemoveContainer" containerID="9e89a517fad6a1ad63c7dbd097971d804ed449a2eca199b523901869b192e05a" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.324111 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.337711 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.361987 4790 scope.go:117] "RemoveContainer" containerID="f497981f5f78f4487377b1be69218abab9886d404beac3e406a239f2fbb71703" Apr 06 12:19:48 crc kubenswrapper[4790]: E0406 12:19:48.363408 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f497981f5f78f4487377b1be69218abab9886d404beac3e406a239f2fbb71703\": container with ID starting with 
f497981f5f78f4487377b1be69218abab9886d404beac3e406a239f2fbb71703 not found: ID does not exist" containerID="f497981f5f78f4487377b1be69218abab9886d404beac3e406a239f2fbb71703" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.363456 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f497981f5f78f4487377b1be69218abab9886d404beac3e406a239f2fbb71703"} err="failed to get container status \"f497981f5f78f4487377b1be69218abab9886d404beac3e406a239f2fbb71703\": rpc error: code = NotFound desc = could not find container \"f497981f5f78f4487377b1be69218abab9886d404beac3e406a239f2fbb71703\": container with ID starting with f497981f5f78f4487377b1be69218abab9886d404beac3e406a239f2fbb71703 not found: ID does not exist" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.363487 4790 scope.go:117] "RemoveContainer" containerID="9e89a517fad6a1ad63c7dbd097971d804ed449a2eca199b523901869b192e05a" Apr 06 12:19:48 crc kubenswrapper[4790]: E0406 12:19:48.364408 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e89a517fad6a1ad63c7dbd097971d804ed449a2eca199b523901869b192e05a\": container with ID starting with 9e89a517fad6a1ad63c7dbd097971d804ed449a2eca199b523901869b192e05a not found: ID does not exist" containerID="9e89a517fad6a1ad63c7dbd097971d804ed449a2eca199b523901869b192e05a" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.364443 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e89a517fad6a1ad63c7dbd097971d804ed449a2eca199b523901869b192e05a"} err="failed to get container status \"9e89a517fad6a1ad63c7dbd097971d804ed449a2eca199b523901869b192e05a\": rpc error: code = NotFound desc = could not find container \"9e89a517fad6a1ad63c7dbd097971d804ed449a2eca199b523901869b192e05a\": container with ID starting with 9e89a517fad6a1ad63c7dbd097971d804ed449a2eca199b523901869b192e05a not found: ID does not 
exist" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.365433 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Apr 06 12:19:48 crc kubenswrapper[4790]: E0406 12:19:48.365910 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24cee348-21ee-454e-942e-c689c059effa" containerName="mariadb-account-create-update" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.365928 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="24cee348-21ee-454e-942e-c689c059effa" containerName="mariadb-account-create-update" Apr 06 12:19:48 crc kubenswrapper[4790]: E0406 12:19:48.365947 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b5a9a1-f344-4b6c-9001-87271da23bdc" containerName="barbican-keystone-listener" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.365956 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b5a9a1-f344-4b6c-9001-87271da23bdc" containerName="barbican-keystone-listener" Apr 06 12:19:48 crc kubenswrapper[4790]: E0406 12:19:48.365978 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982d3ca0-0055-4a1a-85ae-533ca695f992" containerName="mariadb-database-create" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.365987 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="982d3ca0-0055-4a1a-85ae-533ca695f992" containerName="mariadb-database-create" Apr 06 12:19:48 crc kubenswrapper[4790]: E0406 12:19:48.366003 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf13bc8-7964-446c-8b42-f52e62da1ded" containerName="barbican-worker" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366011 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf13bc8-7964-446c-8b42-f52e62da1ded" containerName="barbican-worker" Apr 06 12:19:48 crc kubenswrapper[4790]: E0406 12:19:48.366019 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e618a3e1-5205-4db2-b17d-be3a2aef40b0" 
containerName="cinder-api" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366028 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e618a3e1-5205-4db2-b17d-be3a2aef40b0" containerName="cinder-api" Apr 06 12:19:48 crc kubenswrapper[4790]: E0406 12:19:48.366051 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a419b6-12dc-427a-9481-c48f7e602d54" containerName="mariadb-account-create-update" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366059 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a419b6-12dc-427a-9481-c48f7e602d54" containerName="mariadb-account-create-update" Apr 06 12:19:48 crc kubenswrapper[4790]: E0406 12:19:48.366071 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf13bc8-7964-446c-8b42-f52e62da1ded" containerName="barbican-worker-log" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366080 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf13bc8-7964-446c-8b42-f52e62da1ded" containerName="barbican-worker-log" Apr 06 12:19:48 crc kubenswrapper[4790]: E0406 12:19:48.366094 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7f84efa-2483-4a30-9297-f60a74e88c75" containerName="mariadb-account-create-update" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366102 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7f84efa-2483-4a30-9297-f60a74e88c75" containerName="mariadb-account-create-update" Apr 06 12:19:48 crc kubenswrapper[4790]: E0406 12:19:48.366115 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b5a9a1-f344-4b6c-9001-87271da23bdc" containerName="barbican-keystone-listener-log" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366122 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b5a9a1-f344-4b6c-9001-87271da23bdc" containerName="barbican-keystone-listener-log" Apr 06 12:19:48 crc kubenswrapper[4790]: E0406 12:19:48.366137 4790 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e618a3e1-5205-4db2-b17d-be3a2aef40b0" containerName="cinder-api-log" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366144 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e618a3e1-5205-4db2-b17d-be3a2aef40b0" containerName="cinder-api-log" Apr 06 12:19:48 crc kubenswrapper[4790]: E0406 12:19:48.366160 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45658e80-85a4-4557-bc5d-85d86bb92f7f" containerName="mariadb-database-create" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366168 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="45658e80-85a4-4557-bc5d-85d86bb92f7f" containerName="mariadb-database-create" Apr 06 12:19:48 crc kubenswrapper[4790]: E0406 12:19:48.366182 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8" containerName="mariadb-database-create" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366189 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8" containerName="mariadb-database-create" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366388 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b5a9a1-f344-4b6c-9001-87271da23bdc" containerName="barbican-keystone-listener" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366408 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7f84efa-2483-4a30-9297-f60a74e88c75" containerName="mariadb-account-create-update" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366424 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b5a9a1-f344-4b6c-9001-87271da23bdc" containerName="barbican-keystone-listener-log" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366437 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf13bc8-7964-446c-8b42-f52e62da1ded" containerName="barbican-worker" Apr 06 
12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366448 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8" containerName="mariadb-database-create" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366464 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="45658e80-85a4-4557-bc5d-85d86bb92f7f" containerName="mariadb-database-create" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366474 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a419b6-12dc-427a-9481-c48f7e602d54" containerName="mariadb-account-create-update" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366485 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="982d3ca0-0055-4a1a-85ae-533ca695f992" containerName="mariadb-database-create" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366501 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf13bc8-7964-446c-8b42-f52e62da1ded" containerName="barbican-worker-log" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366517 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e618a3e1-5205-4db2-b17d-be3a2aef40b0" containerName="cinder-api" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366527 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="24cee348-21ee-454e-942e-c689c059effa" containerName="mariadb-account-create-update" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.366545 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e618a3e1-5205-4db2-b17d-be3a2aef40b0" containerName="cinder-api-log" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.367822 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.369258 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.370041 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.370219 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.384648 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.439164 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-config-data-custom\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.439214 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.439242 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-scripts\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.439282 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-x9k4j\" (UniqueName: \"kubernetes.io/projected/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-kube-api-access-x9k4j\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.439364 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.439412 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-config-data\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.439453 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.439491 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-logs\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.439506 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-public-tls-certs\") pod 
\"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.541454 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-logs\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.541527 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.541574 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-config-data-custom\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.541600 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.541621 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-scripts\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.541652 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-x9k4j\" (UniqueName: \"kubernetes.io/projected/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-kube-api-access-x9k4j\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.541721 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.541758 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-config-data\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.541787 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.541916 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.541969 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-logs\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc 
kubenswrapper[4790]: I0406 12:19:48.546587 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-scripts\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.547244 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.548200 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.548581 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-config-data-custom\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.548888 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.550658 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-config-data\") pod \"cinder-api-0\" (UID: 
\"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.559946 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9k4j\" (UniqueName: \"kubernetes.io/projected/6acbb602-fcaa-448f-bc7a-49a2ac2bb979-kube-api-access-x9k4j\") pod \"cinder-api-0\" (UID: \"6acbb602-fcaa-448f-bc7a-49a2ac2bb979\") " pod="openstack/cinder-api-0" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.569180 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.569271 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7695db8cdc-vs5bx" Apr 06 12:19:48 crc kubenswrapper[4790]: I0406 12:19:48.697903 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.128737 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kbdtw"] Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.130393 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kbdtw" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.135442 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.135914 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jnks7" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.135960 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.143087 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kbdtw"] Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.182769 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.258003 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-config-data\") pod \"nova-cell0-conductor-db-sync-kbdtw\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") " pod="openstack/nova-cell0-conductor-db-sync-kbdtw" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.258066 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-scripts\") pod \"nova-cell0-conductor-db-sync-kbdtw\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") " pod="openstack/nova-cell0-conductor-db-sync-kbdtw" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.258090 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kbdtw\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") " pod="openstack/nova-cell0-conductor-db-sync-kbdtw" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.258154 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc4kq\" (UniqueName: \"kubernetes.io/projected/ca65c701-253a-49d5-8ca5-82533b5c0995-kube-api-access-qc4kq\") pod \"nova-cell0-conductor-db-sync-kbdtw\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") " pod="openstack/nova-cell0-conductor-db-sync-kbdtw" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.290698 4790 generic.go:334] "Generic (PLEG): container finished" podID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerID="c6f0a86d90b7a2f0c0b3e361d551993ba37e5efa12ce4db035f185d73e7fa791" exitCode=1 Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.290766 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"bbabffd5-cc02-4c40-adec-25c11548ed44","Type":"ContainerDied","Data":"c6f0a86d90b7a2f0c0b3e361d551993ba37e5efa12ce4db035f185d73e7fa791"} Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.290797 4790 scope.go:117] "RemoveContainer" containerID="625c69110057092c75a5b9f7a24594d2a63dfa0d5d8dd00096e932f7bcea9149" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.291511 4790 scope.go:117] "RemoveContainer" containerID="c6f0a86d90b7a2f0c0b3e361d551993ba37e5efa12ce4db035f185d73e7fa791" Apr 06 12:19:49 crc kubenswrapper[4790]: E0406 12:19:49.292605 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(bbabffd5-cc02-4c40-adec-25c11548ed44)\"" pod="openstack/watcher-decision-engine-0" 
podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.297687 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6acbb602-fcaa-448f-bc7a-49a2ac2bb979","Type":"ContainerStarted","Data":"8de617f488493346922aac3aff8b18ccab368d20c5fa3fb53dd117bf02b6aeac"} Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.361535 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-scripts\") pod \"nova-cell0-conductor-db-sync-kbdtw\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") " pod="openstack/nova-cell0-conductor-db-sync-kbdtw" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.362012 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kbdtw\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") " pod="openstack/nova-cell0-conductor-db-sync-kbdtw" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.362057 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc4kq\" (UniqueName: \"kubernetes.io/projected/ca65c701-253a-49d5-8ca5-82533b5c0995-kube-api-access-qc4kq\") pod \"nova-cell0-conductor-db-sync-kbdtw\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") " pod="openstack/nova-cell0-conductor-db-sync-kbdtw" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.362197 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-config-data\") pod \"nova-cell0-conductor-db-sync-kbdtw\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") " pod="openstack/nova-cell0-conductor-db-sync-kbdtw" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 
12:19:49.373602 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-scripts\") pod \"nova-cell0-conductor-db-sync-kbdtw\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") " pod="openstack/nova-cell0-conductor-db-sync-kbdtw" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.382342 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc4kq\" (UniqueName: \"kubernetes.io/projected/ca65c701-253a-49d5-8ca5-82533b5c0995-kube-api-access-qc4kq\") pod \"nova-cell0-conductor-db-sync-kbdtw\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") " pod="openstack/nova-cell0-conductor-db-sync-kbdtw" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.383164 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-config-data\") pod \"nova-cell0-conductor-db-sync-kbdtw\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") " pod="openstack/nova-cell0-conductor-db-sync-kbdtw" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.400519 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kbdtw\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") " pod="openstack/nova-cell0-conductor-db-sync-kbdtw" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.457549 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kbdtw" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.694107 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e618a3e1-5205-4db2-b17d-be3a2aef40b0" path="/var/lib/kubelet/pods/e618a3e1-5205-4db2-b17d-be3a2aef40b0/volumes" Apr 06 12:19:49 crc kubenswrapper[4790]: I0406 12:19:49.925901 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kbdtw"] Apr 06 12:19:50 crc kubenswrapper[4790]: I0406 12:19:50.312618 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kbdtw" event={"ID":"ca65c701-253a-49d5-8ca5-82533b5c0995","Type":"ContainerStarted","Data":"336a6875bc93481d9da16aab12ab5498b9b0bd248722f680c0a65dd748b38953"} Apr 06 12:19:50 crc kubenswrapper[4790]: I0406 12:19:50.342441 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6acbb602-fcaa-448f-bc7a-49a2ac2bb979","Type":"ContainerStarted","Data":"fc29d2f6b100a5d5c00e2866559df2f374244c31e530dce400d39956bb8b7874"} Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.372216 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6acbb602-fcaa-448f-bc7a-49a2ac2bb979","Type":"ContainerStarted","Data":"d22428eab465b304e59f9618fc80a617a6677c2c5eca1433231b614e177ae00c"} Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.373062 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.383528 4790 generic.go:334] "Generic (PLEG): container finished" podID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerID="743e3c525c572f27fdc61a2bf133e4de2c78b45f482cd503dd24775b85432d90" exitCode=0 Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.383573 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"547e5a08-1521-4550-a258-5c36b3d61fd6","Type":"ContainerDied","Data":"743e3c525c572f27fdc61a2bf133e4de2c78b45f482cd503dd24775b85432d90"} Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.397314 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.397296775 podStartE2EDuration="3.397296775s" podCreationTimestamp="2026-04-06 12:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:19:51.392160479 +0000 UTC m=+1370.379903345" watchObservedRunningTime="2026-04-06 12:19:51.397296775 +0000 UTC m=+1370.385039641" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.494305 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.612427 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/547e5a08-1521-4550-a258-5c36b3d61fd6-run-httpd\") pod \"547e5a08-1521-4550-a258-5c36b3d61fd6\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.612499 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/547e5a08-1521-4550-a258-5c36b3d61fd6-log-httpd\") pod \"547e5a08-1521-4550-a258-5c36b3d61fd6\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.612574 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-sg-core-conf-yaml\") pod \"547e5a08-1521-4550-a258-5c36b3d61fd6\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.612629 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w866p\" (UniqueName: \"kubernetes.io/projected/547e5a08-1521-4550-a258-5c36b3d61fd6-kube-api-access-w866p\") pod \"547e5a08-1521-4550-a258-5c36b3d61fd6\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.612667 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-combined-ca-bundle\") pod \"547e5a08-1521-4550-a258-5c36b3d61fd6\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.612726 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-scripts\") pod \"547e5a08-1521-4550-a258-5c36b3d61fd6\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.612797 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-config-data\") pod \"547e5a08-1521-4550-a258-5c36b3d61fd6\" (UID: \"547e5a08-1521-4550-a258-5c36b3d61fd6\") " Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.612853 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/547e5a08-1521-4550-a258-5c36b3d61fd6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "547e5a08-1521-4550-a258-5c36b3d61fd6" (UID: "547e5a08-1521-4550-a258-5c36b3d61fd6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.612949 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/547e5a08-1521-4550-a258-5c36b3d61fd6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "547e5a08-1521-4550-a258-5c36b3d61fd6" (UID: "547e5a08-1521-4550-a258-5c36b3d61fd6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.613891 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/547e5a08-1521-4550-a258-5c36b3d61fd6-run-httpd\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.613924 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/547e5a08-1521-4550-a258-5c36b3d61fd6-log-httpd\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.628265 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/547e5a08-1521-4550-a258-5c36b3d61fd6-kube-api-access-w866p" (OuterVolumeSpecName: "kube-api-access-w866p") pod "547e5a08-1521-4550-a258-5c36b3d61fd6" (UID: "547e5a08-1521-4550-a258-5c36b3d61fd6"). InnerVolumeSpecName "kube-api-access-w866p". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.630636 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-scripts" (OuterVolumeSpecName: "scripts") pod "547e5a08-1521-4550-a258-5c36b3d61fd6" (UID: "547e5a08-1521-4550-a258-5c36b3d61fd6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.661986 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "547e5a08-1521-4550-a258-5c36b3d61fd6" (UID: "547e5a08-1521-4550-a258-5c36b3d61fd6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.717008 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.717047 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w866p\" (UniqueName: \"kubernetes.io/projected/547e5a08-1521-4550-a258-5c36b3d61fd6-kube-api-access-w866p\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.717059 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.731360 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-config-data" (OuterVolumeSpecName: "config-data") pod "547e5a08-1521-4550-a258-5c36b3d61fd6" (UID: "547e5a08-1521-4550-a258-5c36b3d61fd6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.735267 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "547e5a08-1521-4550-a258-5c36b3d61fd6" (UID: "547e5a08-1521-4550-a258-5c36b3d61fd6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.818353 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:51 crc kubenswrapper[4790]: I0406 12:19:51.818388 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/547e5a08-1521-4550-a258-5c36b3d61fd6-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.397975 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.399030 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"547e5a08-1521-4550-a258-5c36b3d61fd6","Type":"ContainerDied","Data":"4b2d037d32f29f66cf3fa63746a9255f8d71a6d2d6a41416770039eadaf3e8e1"} Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.399088 4790 scope.go:117] "RemoveContainer" containerID="8375b7b663468c08c1cc2e457c6d471c495128f927f247e768d0a14d14bbd558" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.437950 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.446495 4790 scope.go:117] "RemoveContainer" containerID="a6323a547ffb77b8ebf65b4cf0d3d0f33c8de8f2b9cb25da5ea0b8e7df63daf9" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.456366 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.475667 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:52 crc kubenswrapper[4790]: E0406 12:19:52.477790 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="sg-core" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.477823 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="sg-core" Apr 06 12:19:52 crc kubenswrapper[4790]: E0406 12:19:52.477950 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="ceilometer-notification-agent" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.477963 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="ceilometer-notification-agent" Apr 06 12:19:52 crc kubenswrapper[4790]: 
E0406 12:19:52.477983 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="proxy-httpd" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.477991 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="proxy-httpd" Apr 06 12:19:52 crc kubenswrapper[4790]: E0406 12:19:52.478003 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="ceilometer-central-agent" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.478010 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="ceilometer-central-agent" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.478428 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="ceilometer-notification-agent" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.478660 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="ceilometer-central-agent" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.478700 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="sg-core" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.478712 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" containerName="proxy-httpd" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.479359 4790 scope.go:117] "RemoveContainer" containerID="fa3c873addaad0b7c61cbf1b2dd256f342c9214f2ef045919d1fe60da7e914db" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.481034 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.485927 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.486605 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.498595 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.518107 4790 scope.go:117] "RemoveContainer" containerID="743e3c525c572f27fdc61a2bf133e4de2c78b45f482cd503dd24775b85432d90" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.634595 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.634671 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7385730e-1494-4372-bb15-ab0cb3de13a6-run-httpd\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.634718 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8skqn\" (UniqueName: \"kubernetes.io/projected/7385730e-1494-4372-bb15-ab0cb3de13a6-kube-api-access-8skqn\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.634790 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-scripts\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.634814 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.634887 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7385730e-1494-4372-bb15-ab0cb3de13a6-log-httpd\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.634958 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-config-data\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.736472 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-scripts\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.736536 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " 
pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.737253 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7385730e-1494-4372-bb15-ab0cb3de13a6-log-httpd\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.737301 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-config-data\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.737377 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.737410 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7385730e-1494-4372-bb15-ab0cb3de13a6-run-httpd\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.737446 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8skqn\" (UniqueName: \"kubernetes.io/projected/7385730e-1494-4372-bb15-ab0cb3de13a6-kube-api-access-8skqn\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.741134 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7385730e-1494-4372-bb15-ab0cb3de13a6-log-httpd\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.741421 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7385730e-1494-4372-bb15-ab0cb3de13a6-run-httpd\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.741559 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-scripts\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.742137 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.742525 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-config-data\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.746480 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.759921 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8skqn\" (UniqueName: \"kubernetes.io/projected/7385730e-1494-4372-bb15-ab0cb3de13a6-kube-api-access-8skqn\") pod \"ceilometer-0\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") " pod="openstack/ceilometer-0" Apr 06 12:19:52 crc kubenswrapper[4790]: I0406 12:19:52.815190 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:19:53 crc kubenswrapper[4790]: W0406 12:19:53.339134 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7385730e_1494_4372_bb15_ab0cb3de13a6.slice/crio-b792bfe51b5547cf149faa2edbb7468a049c602a86761f628d3bb11366a49f15 WatchSource:0}: Error finding container b792bfe51b5547cf149faa2edbb7468a049c602a86761f628d3bb11366a49f15: Status 404 returned error can't find the container with id b792bfe51b5547cf149faa2edbb7468a049c602a86761f628d3bb11366a49f15 Apr 06 12:19:53 crc kubenswrapper[4790]: I0406 12:19:53.349454 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:19:53 crc kubenswrapper[4790]: I0406 12:19:53.412144 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7385730e-1494-4372-bb15-ab0cb3de13a6","Type":"ContainerStarted","Data":"b792bfe51b5547cf149faa2edbb7468a049c602a86761f628d3bb11366a49f15"} Apr 06 12:19:53 crc kubenswrapper[4790]: I0406 12:19:53.690578 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="547e5a08-1521-4550-a258-5c36b3d61fd6" path="/var/lib/kubelet/pods/547e5a08-1521-4550-a258-5c36b3d61fd6/volumes" Apr 06 12:19:54 crc kubenswrapper[4790]: I0406 12:19:54.317663 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Apr 06 12:19:54 crc kubenswrapper[4790]: I0406 12:19:54.318326 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/watcher-decision-engine-0"
Apr 06 12:19:54 crc kubenswrapper[4790]: I0406 12:19:54.319539 4790 scope.go:117] "RemoveContainer" containerID="c6f0a86d90b7a2f0c0b3e361d551993ba37e5efa12ce4db035f185d73e7fa791"
Apr 06 12:19:54 crc kubenswrapper[4790]: E0406 12:19:54.320017 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(bbabffd5-cc02-4c40-adec-25c11548ed44)\"" pod="openstack/watcher-decision-engine-0" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44"
Apr 06 12:19:54 crc kubenswrapper[4790]: I0406 12:19:54.435703 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7385730e-1494-4372-bb15-ab0cb3de13a6","Type":"ContainerStarted","Data":"c33123007803af9785014085486b050c6938d4cb14f7f3e48b2cbbee482bf980"}
Apr 06 12:19:54 crc kubenswrapper[4790]: I0406 12:19:54.436149 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7385730e-1494-4372-bb15-ab0cb3de13a6","Type":"ContainerStarted","Data":"0c1d1921a02cfda917d2ebcf1c37daad191edffc8e98fe97eae941e5275bc450"}
Apr 06 12:19:54 crc kubenswrapper[4790]: I0406 12:19:54.984624 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Apr 06 12:19:55 crc kubenswrapper[4790]: I0406 12:19:55.452307 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7385730e-1494-4372-bb15-ab0cb3de13a6","Type":"ContainerStarted","Data":"a2df1d9f6e637ee85ef4c1aca1d98b6533d74c7db6017194bb8d2aff5f696cd4"}
Apr 06 12:19:59 crc kubenswrapper[4790]: I0406 12:19:59.168823 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Apr 06 12:19:59 crc kubenswrapper[4790]: I0406 12:19:59.170061 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b7fdfb74-c76e-4351-bf3e-64e035e6853f" containerName="glance-httpd" containerID="cri-o://e8120589c7be71615c7d37a940f2c82811c6f3ddfa72a74d0e492cae992d19f7" gracePeriod=30
Apr 06 12:19:59 crc kubenswrapper[4790]: I0406 12:19:59.170347 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b7fdfb74-c76e-4351-bf3e-64e035e6853f" containerName="glance-log" containerID="cri-o://99c93e387d9924d9746b1f6d7dc72e0ec9e049b5f0e791e086154c30a3dc4aa2" gracePeriod=30
Apr 06 12:19:59 crc kubenswrapper[4790]: I0406 12:19:59.496080 4790 generic.go:334] "Generic (PLEG): container finished" podID="b7fdfb74-c76e-4351-bf3e-64e035e6853f" containerID="99c93e387d9924d9746b1f6d7dc72e0ec9e049b5f0e791e086154c30a3dc4aa2" exitCode=143
Apr 06 12:19:59 crc kubenswrapper[4790]: I0406 12:19:59.496167 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b7fdfb74-c76e-4351-bf3e-64e035e6853f","Type":"ContainerDied","Data":"99c93e387d9924d9746b1f6d7dc72e0ec9e049b5f0e791e086154c30a3dc4aa2"}
Apr 06 12:20:00 crc kubenswrapper[4790]: I0406 12:20:00.145390 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591300-28l55"]
Apr 06 12:20:00 crc kubenswrapper[4790]: I0406 12:20:00.146552 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591300-28l55"
Apr 06 12:20:00 crc kubenswrapper[4790]: I0406 12:20:00.149477 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6"
Apr 06 12:20:00 crc kubenswrapper[4790]: I0406 12:20:00.152761 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 06 12:20:00 crc kubenswrapper[4790]: I0406 12:20:00.155053 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 06 12:20:00 crc kubenswrapper[4790]: I0406 12:20:00.164334 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591300-28l55"]
Apr 06 12:20:00 crc kubenswrapper[4790]: I0406 12:20:00.289963 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64jk4\" (UniqueName: \"kubernetes.io/projected/bccb2d6f-cb57-40af-96b7-fd082306b586-kube-api-access-64jk4\") pod \"auto-csr-approver-29591300-28l55\" (UID: \"bccb2d6f-cb57-40af-96b7-fd082306b586\") " pod="openshift-infra/auto-csr-approver-29591300-28l55"
Apr 06 12:20:00 crc kubenswrapper[4790]: I0406 12:20:00.391386 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64jk4\" (UniqueName: \"kubernetes.io/projected/bccb2d6f-cb57-40af-96b7-fd082306b586-kube-api-access-64jk4\") pod \"auto-csr-approver-29591300-28l55\" (UID: \"bccb2d6f-cb57-40af-96b7-fd082306b586\") " pod="openshift-infra/auto-csr-approver-29591300-28l55"
Apr 06 12:20:00 crc kubenswrapper[4790]: I0406 12:20:00.428818 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64jk4\" (UniqueName: \"kubernetes.io/projected/bccb2d6f-cb57-40af-96b7-fd082306b586-kube-api-access-64jk4\") pod \"auto-csr-approver-29591300-28l55\" (UID: \"bccb2d6f-cb57-40af-96b7-fd082306b586\") " pod="openshift-infra/auto-csr-approver-29591300-28l55"
Apr 06 12:20:00 crc kubenswrapper[4790]: I0406 12:20:00.463348 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591300-28l55"
Apr 06 12:20:00 crc kubenswrapper[4790]: I0406 12:20:00.687779 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Apr 06 12:20:00 crc kubenswrapper[4790]: I0406 12:20:00.688014 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7b5858b3-3d3c-4289-bd5d-79c34c777cfd" containerName="glance-log" containerID="cri-o://bc308ee8c4bf787e7b9b55ae190a8a8be2d4ffcd98f5b0f51c3293eb263df632" gracePeriod=30
Apr 06 12:20:00 crc kubenswrapper[4790]: I0406 12:20:00.688126 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7b5858b3-3d3c-4289-bd5d-79c34c777cfd" containerName="glance-httpd" containerID="cri-o://f13b247bd3378f934a4183f4ff128203e79a5ee93c4a41b1b1c4cbc535784f07" gracePeriod=30
Apr 06 12:20:01 crc kubenswrapper[4790]: I0406 12:20:01.521216 4790 generic.go:334] "Generic (PLEG): container finished" podID="b7fdfb74-c76e-4351-bf3e-64e035e6853f" containerID="e8120589c7be71615c7d37a940f2c82811c6f3ddfa72a74d0e492cae992d19f7" exitCode=0
Apr 06 12:20:01 crc kubenswrapper[4790]: I0406 12:20:01.521485 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b7fdfb74-c76e-4351-bf3e-64e035e6853f","Type":"ContainerDied","Data":"e8120589c7be71615c7d37a940f2c82811c6f3ddfa72a74d0e492cae992d19f7"}
Apr 06 12:20:01 crc kubenswrapper[4790]: I0406 12:20:01.525939 4790 generic.go:334] "Generic (PLEG): container finished" podID="7b5858b3-3d3c-4289-bd5d-79c34c777cfd" containerID="bc308ee8c4bf787e7b9b55ae190a8a8be2d4ffcd98f5b0f51c3293eb263df632" exitCode=143
Apr 06 12:20:01 crc kubenswrapper[4790]: I0406 12:20:01.525986 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b5858b3-3d3c-4289-bd5d-79c34c777cfd","Type":"ContainerDied","Data":"bc308ee8c4bf787e7b9b55ae190a8a8be2d4ffcd98f5b0f51c3293eb263df632"}
Apr 06 12:20:01 crc kubenswrapper[4790]: I0406 12:20:01.738025 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.298440 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.455444 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-config-data\") pod \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.455707 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-scripts\") pod \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.455847 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-combined-ca-bundle\") pod \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.455876 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6gjs\" (UniqueName: \"kubernetes.io/projected/b7fdfb74-c76e-4351-bf3e-64e035e6853f-kube-api-access-z6gjs\") pod \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.455914 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7fdfb74-c76e-4351-bf3e-64e035e6853f-logs\") pod \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.455966 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-public-tls-certs\") pod \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.456004 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.456046 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7fdfb74-c76e-4351-bf3e-64e035e6853f-httpd-run\") pod \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\" (UID: \"b7fdfb74-c76e-4351-bf3e-64e035e6853f\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.456808 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7fdfb74-c76e-4351-bf3e-64e035e6853f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b7fdfb74-c76e-4351-bf3e-64e035e6853f" (UID: "b7fdfb74-c76e-4351-bf3e-64e035e6853f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.457473 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7fdfb74-c76e-4351-bf3e-64e035e6853f-logs" (OuterVolumeSpecName: "logs") pod "b7fdfb74-c76e-4351-bf3e-64e035e6853f" (UID: "b7fdfb74-c76e-4351-bf3e-64e035e6853f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.464517 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-scripts" (OuterVolumeSpecName: "scripts") pod "b7fdfb74-c76e-4351-bf3e-64e035e6853f" (UID: "b7fdfb74-c76e-4351-bf3e-64e035e6853f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.470059 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7fdfb74-c76e-4351-bf3e-64e035e6853f-kube-api-access-z6gjs" (OuterVolumeSpecName: "kube-api-access-z6gjs") pod "b7fdfb74-c76e-4351-bf3e-64e035e6853f" (UID: "b7fdfb74-c76e-4351-bf3e-64e035e6853f"). InnerVolumeSpecName "kube-api-access-z6gjs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.472879 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "b7fdfb74-c76e-4351-bf3e-64e035e6853f" (UID: "b7fdfb74-c76e-4351-bf3e-64e035e6853f"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.534226 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7fdfb74-c76e-4351-bf3e-64e035e6853f" (UID: "b7fdfb74-c76e-4351-bf3e-64e035e6853f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.558464 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.558503 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7fdfb74-c76e-4351-bf3e-64e035e6853f-httpd-run\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.558518 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-scripts\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.558530 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.558545 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6gjs\" (UniqueName: \"kubernetes.io/projected/b7fdfb74-c76e-4351-bf3e-64e035e6853f-kube-api-access-z6gjs\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.558558 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7fdfb74-c76e-4351-bf3e-64e035e6853f-logs\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.575345 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-config-data" (OuterVolumeSpecName: "config-data") pod "b7fdfb74-c76e-4351-bf3e-64e035e6853f" (UID: "b7fdfb74-c76e-4351-bf3e-64e035e6853f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.593273 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.628359 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b7fdfb74-c76e-4351-bf3e-64e035e6853f","Type":"ContainerDied","Data":"2fdfeeb179d32149d9c703778f94b6f3b341ac938354cfd03ba5bd84ed79fbb1"}
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.628422 4790 scope.go:117] "RemoveContainer" containerID="e8120589c7be71615c7d37a940f2c82811c6f3ddfa72a74d0e492cae992d19f7"
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.628575 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.647061 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b7fdfb74-c76e-4351-bf3e-64e035e6853f" (UID: "b7fdfb74-c76e-4351-bf3e-64e035e6853f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.660077 4790 generic.go:334] "Generic (PLEG): container finished" podID="7b5858b3-3d3c-4289-bd5d-79c34c777cfd" containerID="f13b247bd3378f934a4183f4ff128203e79a5ee93c4a41b1b1c4cbc535784f07" exitCode=0
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.660125 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b5858b3-3d3c-4289-bd5d-79c34c777cfd","Type":"ContainerDied","Data":"f13b247bd3378f934a4183f4ff128203e79a5ee93c4a41b1b1c4cbc535784f07"}
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.685383 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-public-tls-certs\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.685606 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.685616 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7fdfb74-c76e-4351-bf3e-64e035e6853f-config-data\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.747990 4790 scope.go:117] "RemoveContainer" containerID="99c93e387d9924d9746b1f6d7dc72e0ec9e049b5f0e791e086154c30a3dc4aa2"
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.748229 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.891645 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.891739 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-internal-tls-certs\") pod \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.891791 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-config-data\") pod \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.891848 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk9zp\" (UniqueName: \"kubernetes.io/projected/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-kube-api-access-rk9zp\") pod \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.892066 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-scripts\") pod \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.892108 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-combined-ca-bundle\") pod \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.892141 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-httpd-run\") pod \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.892178 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-logs\") pod \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\" (UID: \"7b5858b3-3d3c-4289-bd5d-79c34c777cfd\") "
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.897071 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-logs" (OuterVolumeSpecName: "logs") pod "7b5858b3-3d3c-4289-bd5d-79c34c777cfd" (UID: "7b5858b3-3d3c-4289-bd5d-79c34c777cfd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.897508 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7b5858b3-3d3c-4289-bd5d-79c34c777cfd" (UID: "7b5858b3-3d3c-4289-bd5d-79c34c777cfd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.913508 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591300-28l55"]
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.929962 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "7b5858b3-3d3c-4289-bd5d-79c34c777cfd" (UID: "7b5858b3-3d3c-4289-bd5d-79c34c777cfd"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.930053 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-scripts" (OuterVolumeSpecName: "scripts") pod "7b5858b3-3d3c-4289-bd5d-79c34c777cfd" (UID: "7b5858b3-3d3c-4289-bd5d-79c34c777cfd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.933433 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-kube-api-access-rk9zp" (OuterVolumeSpecName: "kube-api-access-rk9zp") pod "7b5858b3-3d3c-4289-bd5d-79c34c777cfd" (UID: "7b5858b3-3d3c-4289-bd5d-79c34c777cfd"). InnerVolumeSpecName "kube-api-access-rk9zp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.961949 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b5858b3-3d3c-4289-bd5d-79c34c777cfd" (UID: "7b5858b3-3d3c-4289-bd5d-79c34c777cfd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:03 crc kubenswrapper[4790]: I0406 12:20:03.987198 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.000046 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-scripts\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.000086 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.000099 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-httpd-run\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.000109 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-logs\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.000148 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.000159 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk9zp\" (UniqueName: \"kubernetes.io/projected/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-kube-api-access-rk9zp\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.006711 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-config-data" (OuterVolumeSpecName: "config-data") pod "7b5858b3-3d3c-4289-bd5d-79c34c777cfd" (UID: "7b5858b3-3d3c-4289-bd5d-79c34c777cfd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.023356 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7b5858b3-3d3c-4289-bd5d-79c34c777cfd" (UID: "7b5858b3-3d3c-4289-bd5d-79c34c777cfd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.023416 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.035033 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Apr 06 12:20:04 crc kubenswrapper[4790]: E0406 12:20:04.035426 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5858b3-3d3c-4289-bd5d-79c34c777cfd" containerName="glance-log"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.035445 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5858b3-3d3c-4289-bd5d-79c34c777cfd" containerName="glance-log"
Apr 06 12:20:04 crc kubenswrapper[4790]: E0406 12:20:04.035476 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7fdfb74-c76e-4351-bf3e-64e035e6853f" containerName="glance-log"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.035483 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7fdfb74-c76e-4351-bf3e-64e035e6853f" containerName="glance-log"
Apr 06 12:20:04 crc kubenswrapper[4790]: E0406 12:20:04.035490 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7fdfb74-c76e-4351-bf3e-64e035e6853f" containerName="glance-httpd"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.035497 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7fdfb74-c76e-4351-bf3e-64e035e6853f" containerName="glance-httpd"
Apr 06 12:20:04 crc kubenswrapper[4790]: E0406 12:20:04.035512 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5858b3-3d3c-4289-bd5d-79c34c777cfd" containerName="glance-httpd"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.035518 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5858b3-3d3c-4289-bd5d-79c34c777cfd" containerName="glance-httpd"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.035688 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5858b3-3d3c-4289-bd5d-79c34c777cfd" containerName="glance-httpd"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.035712 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7fdfb74-c76e-4351-bf3e-64e035e6853f" containerName="glance-httpd"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.035722 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7fdfb74-c76e-4351-bf3e-64e035e6853f" containerName="glance-log"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.035733 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5858b3-3d3c-4289-bd5d-79c34c777cfd" containerName="glance-log"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.036840 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.044533 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.045280 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.047479 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.054644 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.102105 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e3c46af-c7be-417d-9a92-454f74da7a82-scripts\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.102154 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e3c46af-c7be-417d-9a92-454f74da7a82-logs\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.102310 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e3c46af-c7be-417d-9a92-454f74da7a82-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.102427 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.102494 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3c46af-c7be-417d-9a92-454f74da7a82-config-data\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.102529 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhlwj\" (UniqueName: \"kubernetes.io/projected/3e3c46af-c7be-417d-9a92-454f74da7a82-kube-api-access-zhlwj\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.102584 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3c46af-c7be-417d-9a92-454f74da7a82-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.102693 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3c46af-c7be-417d-9a92-454f74da7a82-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.102880 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.102892 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.102902 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5858b3-3d3c-4289-bd5d-79c34c777cfd-config-data\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.205132 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3c46af-c7be-417d-9a92-454f74da7a82-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.205514 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3c46af-c7be-417d-9a92-454f74da7a82-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.205605 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e3c46af-c7be-417d-9a92-454f74da7a82-scripts\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.205631 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e3c46af-c7be-417d-9a92-454f74da7a82-logs\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.205694 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e3c46af-c7be-417d-9a92-454f74da7a82-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.205750 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.206013 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3c46af-c7be-417d-9a92-454f74da7a82-config-data\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.206047 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhlwj\" (UniqueName: \"kubernetes.io/projected/3e3c46af-c7be-417d-9a92-454f74da7a82-kube-api-access-zhlwj\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.206524 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.206736 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e3c46af-c7be-417d-9a92-454f74da7a82-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.207077 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e3c46af-c7be-417d-9a92-454f74da7a82-logs\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.211195 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e3c46af-c7be-417d-9a92-454f74da7a82-scripts\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.214126 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3c46af-c7be-417d-9a92-454f74da7a82-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.223418 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3c46af-c7be-417d-9a92-454f74da7a82-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.223986 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3c46af-c7be-417d-9a92-454f74da7a82-config-data\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.230997 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhlwj\" (UniqueName: \"kubernetes.io/projected/3e3c46af-c7be-417d-9a92-454f74da7a82-kube-api-access-zhlwj\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.292945 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"3e3c46af-c7be-417d-9a92-454f74da7a82\") " pod="openstack/glance-default-external-api-0"
Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.384788 4790 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-external-api-0" Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.676762 4790 scope.go:117] "RemoveContainer" containerID="c6f0a86d90b7a2f0c0b3e361d551993ba37e5efa12ce4db035f185d73e7fa791" Apr 06 12:20:04 crc kubenswrapper[4790]: E0406 12:20:04.678201 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(bbabffd5-cc02-4c40-adec-25c11548ed44)\"" pod="openstack/watcher-decision-engine-0" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.690857 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7385730e-1494-4372-bb15-ab0cb3de13a6","Type":"ContainerStarted","Data":"a4f850013477db35327f02957caab550b30a29b39cdbcffecb80b476af2c1540"} Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.691704 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.691192 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="proxy-httpd" containerID="cri-o://a4f850013477db35327f02957caab550b30a29b39cdbcffecb80b476af2c1540" gracePeriod=30 Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.691208 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="sg-core" containerID="cri-o://a2df1d9f6e637ee85ef4c1aca1d98b6533d74c7db6017194bb8d2aff5f696cd4" gracePeriod=30 Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.691218 4790 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="ceilometer-notification-agent" containerID="cri-o://c33123007803af9785014085486b050c6938d4cb14f7f3e48b2cbbee482bf980" gracePeriod=30 Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.690999 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="ceilometer-central-agent" containerID="cri-o://0c1d1921a02cfda917d2ebcf1c37daad191edffc8e98fe97eae941e5275bc450" gracePeriod=30 Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.705775 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kbdtw" event={"ID":"ca65c701-253a-49d5-8ca5-82533b5c0995","Type":"ContainerStarted","Data":"d21c59dc518f6b1f44551f6b64b48b7a7d8c6393ba5a8eff31278f1aa2cb411a"} Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.718718 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.770031607 podStartE2EDuration="12.7186985s" podCreationTimestamp="2026-04-06 12:19:52 +0000 UTC" firstStartedPulling="2026-04-06 12:19:53.355865348 +0000 UTC m=+1372.343608214" lastFinishedPulling="2026-04-06 12:20:04.304532241 +0000 UTC m=+1383.292275107" observedRunningTime="2026-04-06 12:20:04.718460603 +0000 UTC m=+1383.706203469" watchObservedRunningTime="2026-04-06 12:20:04.7186985 +0000 UTC m=+1383.706441366" Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.722387 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.722621 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b5858b3-3d3c-4289-bd5d-79c34c777cfd","Type":"ContainerDied","Data":"ecbc051ce53108c3ee1d425df8b86c8d52022f8a4effffac4c891c7169901c4c"} Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.722688 4790 scope.go:117] "RemoveContainer" containerID="f13b247bd3378f934a4183f4ff128203e79a5ee93c4a41b1b1c4cbc535784f07" Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.728238 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591300-28l55" event={"ID":"bccb2d6f-cb57-40af-96b7-fd082306b586","Type":"ContainerStarted","Data":"28384ecf405e7217fd5a68e07f9328cbdd26b2dd407a88da20f690adbaa6661a"} Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.743730 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-kbdtw" podStartSLOduration=2.365043635 podStartE2EDuration="15.743712173s" podCreationTimestamp="2026-04-06 12:19:49 +0000 UTC" firstStartedPulling="2026-04-06 12:19:49.933117909 +0000 UTC m=+1368.920860775" lastFinishedPulling="2026-04-06 12:20:03.311786447 +0000 UTC m=+1382.299529313" observedRunningTime="2026-04-06 12:20:04.732948756 +0000 UTC m=+1383.720691622" watchObservedRunningTime="2026-04-06 12:20:04.743712173 +0000 UTC m=+1383.731455039" Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.785959 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.810902 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.841442 4790 scope.go:117] "RemoveContainer" 
containerID="bc308ee8c4bf787e7b9b55ae190a8a8be2d4ffcd98f5b0f51c3293eb263df632" Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.855035 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.856650 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.858574 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.858806 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Apr 06 12:20:04 crc kubenswrapper[4790]: I0406 12:20:04.880724 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.029430 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f86cbe3-945a-4c2a-8986-aa0443e28b95-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.029766 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f86cbe3-945a-4c2a-8986-aa0443e28b95-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.029843 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7f86cbe3-945a-4c2a-8986-aa0443e28b95-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.029911 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.029934 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f86cbe3-945a-4c2a-8986-aa0443e28b95-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.029961 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f86cbe3-945a-4c2a-8986-aa0443e28b95-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.030034 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f86cbe3-945a-4c2a-8986-aa0443e28b95-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.030087 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd7mm\" (UniqueName: 
\"kubernetes.io/projected/7f86cbe3-945a-4c2a-8986-aa0443e28b95-kube-api-access-qd7mm\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.058677 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Apr 06 12:20:05 crc kubenswrapper[4790]: W0406 12:20:05.077186 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e3c46af_c7be_417d_9a92_454f74da7a82.slice/crio-e46c5c8753815aadc0063f0636da89c2cc4f8432e83a8314e8614df429708074 WatchSource:0}: Error finding container e46c5c8753815aadc0063f0636da89c2cc4f8432e83a8314e8614df429708074: Status 404 returned error can't find the container with id e46c5c8753815aadc0063f0636da89c2cc4f8432e83a8314e8614df429708074 Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.133094 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f86cbe3-945a-4c2a-8986-aa0443e28b95-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.133204 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f86cbe3-945a-4c2a-8986-aa0443e28b95-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.133253 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd7mm\" (UniqueName: \"kubernetes.io/projected/7f86cbe3-945a-4c2a-8986-aa0443e28b95-kube-api-access-qd7mm\") pod \"glance-default-internal-api-0\" (UID: 
\"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.133349 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f86cbe3-945a-4c2a-8986-aa0443e28b95-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.133382 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f86cbe3-945a-4c2a-8986-aa0443e28b95-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.133446 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f86cbe3-945a-4c2a-8986-aa0443e28b95-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.133517 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.133551 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f86cbe3-945a-4c2a-8986-aa0443e28b95-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " 
pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.134667 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f86cbe3-945a-4c2a-8986-aa0443e28b95-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.135457 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.139509 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f86cbe3-945a-4c2a-8986-aa0443e28b95-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.141773 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f86cbe3-945a-4c2a-8986-aa0443e28b95-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.151041 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f86cbe3-945a-4c2a-8986-aa0443e28b95-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 
12:20:05.151414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f86cbe3-945a-4c2a-8986-aa0443e28b95-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.155595 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd7mm\" (UniqueName: \"kubernetes.io/projected/7f86cbe3-945a-4c2a-8986-aa0443e28b95-kube-api-access-qd7mm\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.159147 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f86cbe3-945a-4c2a-8986-aa0443e28b95-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.183265 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f86cbe3-945a-4c2a-8986-aa0443e28b95\") " pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.194421 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.689218 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b5858b3-3d3c-4289-bd5d-79c34c777cfd" path="/var/lib/kubelet/pods/7b5858b3-3d3c-4289-bd5d-79c34c777cfd/volumes" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.690481 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7fdfb74-c76e-4351-bf3e-64e035e6853f" path="/var/lib/kubelet/pods/b7fdfb74-c76e-4351-bf3e-64e035e6853f/volumes" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.750715 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591300-28l55" event={"ID":"bccb2d6f-cb57-40af-96b7-fd082306b586","Type":"ContainerStarted","Data":"f135f668806711d8774c07c8dfa5aa505ac2a38daa00191c7e57cccdc4082867"} Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.782202 4790 generic.go:334] "Generic (PLEG): container finished" podID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerID="a2df1d9f6e637ee85ef4c1aca1d98b6533d74c7db6017194bb8d2aff5f696cd4" exitCode=2 Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.782235 4790 generic.go:334] "Generic (PLEG): container finished" podID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerID="0c1d1921a02cfda917d2ebcf1c37daad191edffc8e98fe97eae941e5275bc450" exitCode=0 Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.782313 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7385730e-1494-4372-bb15-ab0cb3de13a6","Type":"ContainerDied","Data":"a2df1d9f6e637ee85ef4c1aca1d98b6533d74c7db6017194bb8d2aff5f696cd4"} Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.782385 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7385730e-1494-4372-bb15-ab0cb3de13a6","Type":"ContainerDied","Data":"0c1d1921a02cfda917d2ebcf1c37daad191edffc8e98fe97eae941e5275bc450"} 
Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.783520 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591300-28l55" podStartSLOduration=4.740684287 podStartE2EDuration="5.783495789s" podCreationTimestamp="2026-04-06 12:20:00 +0000 UTC" firstStartedPulling="2026-04-06 12:20:03.889636832 +0000 UTC m=+1382.877379698" lastFinishedPulling="2026-04-06 12:20:04.932448334 +0000 UTC m=+1383.920191200" observedRunningTime="2026-04-06 12:20:05.771319292 +0000 UTC m=+1384.759062158" watchObservedRunningTime="2026-04-06 12:20:05.783495789 +0000 UTC m=+1384.771238655" Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.788421 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3e3c46af-c7be-417d-9a92-454f74da7a82","Type":"ContainerStarted","Data":"e46c5c8753815aadc0063f0636da89c2cc4f8432e83a8314e8614df429708074"} Apr 06 12:20:05 crc kubenswrapper[4790]: I0406 12:20:05.848364 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Apr 06 12:20:06 crc kubenswrapper[4790]: I0406 12:20:06.823445 4790 generic.go:334] "Generic (PLEG): container finished" podID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerID="c33123007803af9785014085486b050c6938d4cb14f7f3e48b2cbbee482bf980" exitCode=0 Apr 06 12:20:06 crc kubenswrapper[4790]: I0406 12:20:06.823548 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7385730e-1494-4372-bb15-ab0cb3de13a6","Type":"ContainerDied","Data":"c33123007803af9785014085486b050c6938d4cb14f7f3e48b2cbbee482bf980"} Apr 06 12:20:06 crc kubenswrapper[4790]: I0406 12:20:06.825951 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f86cbe3-945a-4c2a-8986-aa0443e28b95","Type":"ContainerStarted","Data":"143fcb7b72d64c57ff895f899b6fbb369e21252c3ed4394555d125307e64fe2e"} Apr 06 
12:20:06 crc kubenswrapper[4790]: I0406 12:20:06.826006 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f86cbe3-945a-4c2a-8986-aa0443e28b95","Type":"ContainerStarted","Data":"e227af6bc98bcc06f02de0f369e71165e3923b992f7761a45c605be49b6a6e2d"} Apr 06 12:20:06 crc kubenswrapper[4790]: I0406 12:20:06.828962 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3e3c46af-c7be-417d-9a92-454f74da7a82","Type":"ContainerStarted","Data":"0142754df2e8a8ac45fbdaa3cc60bfbef4643afffb0f8b3ba181a554fdb8d663"} Apr 06 12:20:06 crc kubenswrapper[4790]: I0406 12:20:06.830887 4790 generic.go:334] "Generic (PLEG): container finished" podID="bccb2d6f-cb57-40af-96b7-fd082306b586" containerID="f135f668806711d8774c07c8dfa5aa505ac2a38daa00191c7e57cccdc4082867" exitCode=0 Apr 06 12:20:06 crc kubenswrapper[4790]: I0406 12:20:06.830913 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591300-28l55" event={"ID":"bccb2d6f-cb57-40af-96b7-fd082306b586","Type":"ContainerDied","Data":"f135f668806711d8774c07c8dfa5aa505ac2a38daa00191c7e57cccdc4082867"} Apr 06 12:20:07 crc kubenswrapper[4790]: I0406 12:20:07.840362 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f86cbe3-945a-4c2a-8986-aa0443e28b95","Type":"ContainerStarted","Data":"35fe839f66761abf1c8357e5901d1719bace1eb44bccf8c01f2740dc646443c7"} Apr 06 12:20:07 crc kubenswrapper[4790]: I0406 12:20:07.843077 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3e3c46af-c7be-417d-9a92-454f74da7a82","Type":"ContainerStarted","Data":"4af1d1730e6801ccbee64a6d013ceef88e6e709f8f6b9e3683c0719eea43b450"} Apr 06 12:20:07 crc kubenswrapper[4790]: I0406 12:20:07.897367 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.897349538 podStartE2EDuration="3.897349538s" podCreationTimestamp="2026-04-06 12:20:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:20:07.879083107 +0000 UTC m=+1386.866825973" watchObservedRunningTime="2026-04-06 12:20:07.897349538 +0000 UTC m=+1386.885092404" Apr 06 12:20:07 crc kubenswrapper[4790]: I0406 12:20:07.904977 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.904956085 podStartE2EDuration="4.904956085s" podCreationTimestamp="2026-04-06 12:20:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:20:07.895081523 +0000 UTC m=+1386.882824389" watchObservedRunningTime="2026-04-06 12:20:07.904956085 +0000 UTC m=+1386.892698951" Apr 06 12:20:08 crc kubenswrapper[4790]: I0406 12:20:08.284044 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591300-28l55" Apr 06 12:20:08 crc kubenswrapper[4790]: I0406 12:20:08.405130 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64jk4\" (UniqueName: \"kubernetes.io/projected/bccb2d6f-cb57-40af-96b7-fd082306b586-kube-api-access-64jk4\") pod \"bccb2d6f-cb57-40af-96b7-fd082306b586\" (UID: \"bccb2d6f-cb57-40af-96b7-fd082306b586\") " Apr 06 12:20:08 crc kubenswrapper[4790]: I0406 12:20:08.412376 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bccb2d6f-cb57-40af-96b7-fd082306b586-kube-api-access-64jk4" (OuterVolumeSpecName: "kube-api-access-64jk4") pod "bccb2d6f-cb57-40af-96b7-fd082306b586" (UID: "bccb2d6f-cb57-40af-96b7-fd082306b586"). InnerVolumeSpecName "kube-api-access-64jk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:20:08 crc kubenswrapper[4790]: I0406 12:20:08.507479 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64jk4\" (UniqueName: \"kubernetes.io/projected/bccb2d6f-cb57-40af-96b7-fd082306b586-kube-api-access-64jk4\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:08 crc kubenswrapper[4790]: I0406 12:20:08.840100 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591294-25dml"] Apr 06 12:20:08 crc kubenswrapper[4790]: I0406 12:20:08.848492 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591294-25dml"] Apr 06 12:20:08 crc kubenswrapper[4790]: I0406 12:20:08.857468 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591300-28l55" event={"ID":"bccb2d6f-cb57-40af-96b7-fd082306b586","Type":"ContainerDied","Data":"28384ecf405e7217fd5a68e07f9328cbdd26b2dd407a88da20f690adbaa6661a"} Apr 06 12:20:08 crc kubenswrapper[4790]: I0406 12:20:08.857530 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28384ecf405e7217fd5a68e07f9328cbdd26b2dd407a88da20f690adbaa6661a" Apr 06 12:20:08 crc kubenswrapper[4790]: I0406 12:20:08.857527 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591300-28l55"
Apr 06 12:20:09 crc kubenswrapper[4790]: I0406 12:20:09.686566 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6d2231-f726-41d3-b2ff-1bae2b437000" path="/var/lib/kubelet/pods/bb6d2231-f726-41d3-b2ff-1bae2b437000/volumes"
Apr 06 12:20:10 crc kubenswrapper[4790]: I0406 12:20:10.474740 4790 scope.go:117] "RemoveContainer" containerID="8e977830766d57bcc1fe1f162f2b966a9e00e277c56ed8f6d3583bcc714a0077"
Apr 06 12:20:14 crc kubenswrapper[4790]: I0406 12:20:14.317580 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Apr 06 12:20:14 crc kubenswrapper[4790]: I0406 12:20:14.318762 4790 scope.go:117] "RemoveContainer" containerID="c6f0a86d90b7a2f0c0b3e361d551993ba37e5efa12ce4db035f185d73e7fa791"
Apr 06 12:20:14 crc kubenswrapper[4790]: E0406 12:20:14.319015 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(bbabffd5-cc02-4c40-adec-25c11548ed44)\"" pod="openstack/watcher-decision-engine-0" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44"
Apr 06 12:20:14 crc kubenswrapper[4790]: I0406 12:20:14.319505 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Apr 06 12:20:14 crc kubenswrapper[4790]: I0406 12:20:14.385508 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Apr 06 12:20:14 crc kubenswrapper[4790]: I0406 12:20:14.385567 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Apr 06 12:20:14 crc kubenswrapper[4790]: I0406 12:20:14.417698 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Apr 06 12:20:14 crc kubenswrapper[4790]: I0406 12:20:14.428593 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Apr 06 12:20:14 crc kubenswrapper[4790]: I0406 12:20:14.914100 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Apr 06 12:20:14 crc kubenswrapper[4790]: I0406 12:20:14.914143 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Apr 06 12:20:14 crc kubenswrapper[4790]: I0406 12:20:14.914770 4790 scope.go:117] "RemoveContainer" containerID="c6f0a86d90b7a2f0c0b3e361d551993ba37e5efa12ce4db035f185d73e7fa791"
Apr 06 12:20:14 crc kubenswrapper[4790]: E0406 12:20:14.915042 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(bbabffd5-cc02-4c40-adec-25c11548ed44)\"" pod="openstack/watcher-decision-engine-0" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44"
Apr 06 12:20:15 crc kubenswrapper[4790]: I0406 12:20:15.195399 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Apr 06 12:20:15 crc kubenswrapper[4790]: I0406 12:20:15.195918 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Apr 06 12:20:15 crc kubenswrapper[4790]: I0406 12:20:15.234695 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Apr 06 12:20:15 crc kubenswrapper[4790]: I0406 12:20:15.248059 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Apr 06 12:20:15 crc kubenswrapper[4790]: I0406 12:20:15.923263 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Apr 06 12:20:15 crc kubenswrapper[4790]: I0406 12:20:15.924065 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Apr 06 12:20:16 crc kubenswrapper[4790]: I0406 12:20:16.666687 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Apr 06 12:20:16 crc kubenswrapper[4790]: I0406 12:20:16.720378 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Apr 06 12:20:17 crc kubenswrapper[4790]: I0406 12:20:17.807202 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Apr 06 12:20:17 crc kubenswrapper[4790]: I0406 12:20:17.825776 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Apr 06 12:20:18 crc kubenswrapper[4790]: I0406 12:20:18.955091 4790 generic.go:334] "Generic (PLEG): container finished" podID="ca65c701-253a-49d5-8ca5-82533b5c0995" containerID="d21c59dc518f6b1f44551f6b64b48b7a7d8c6393ba5a8eff31278f1aa2cb411a" exitCode=0
Apr 06 12:20:18 crc kubenswrapper[4790]: I0406 12:20:18.955119 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kbdtw" event={"ID":"ca65c701-253a-49d5-8ca5-82533b5c0995","Type":"ContainerDied","Data":"d21c59dc518f6b1f44551f6b64b48b7a7d8c6393ba5a8eff31278f1aa2cb411a"}
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.347482 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kbdtw"
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.466517 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-combined-ca-bundle\") pod \"ca65c701-253a-49d5-8ca5-82533b5c0995\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") "
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.466574 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-config-data\") pod \"ca65c701-253a-49d5-8ca5-82533b5c0995\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") "
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.466717 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc4kq\" (UniqueName: \"kubernetes.io/projected/ca65c701-253a-49d5-8ca5-82533b5c0995-kube-api-access-qc4kq\") pod \"ca65c701-253a-49d5-8ca5-82533b5c0995\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") "
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.466914 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-scripts\") pod \"ca65c701-253a-49d5-8ca5-82533b5c0995\" (UID: \"ca65c701-253a-49d5-8ca5-82533b5c0995\") "
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.472631 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-scripts" (OuterVolumeSpecName: "scripts") pod "ca65c701-253a-49d5-8ca5-82533b5c0995" (UID: "ca65c701-253a-49d5-8ca5-82533b5c0995"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.472934 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca65c701-253a-49d5-8ca5-82533b5c0995-kube-api-access-qc4kq" (OuterVolumeSpecName: "kube-api-access-qc4kq") pod "ca65c701-253a-49d5-8ca5-82533b5c0995" (UID: "ca65c701-253a-49d5-8ca5-82533b5c0995"). InnerVolumeSpecName "kube-api-access-qc4kq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.500066 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-config-data" (OuterVolumeSpecName: "config-data") pod "ca65c701-253a-49d5-8ca5-82533b5c0995" (UID: "ca65c701-253a-49d5-8ca5-82533b5c0995"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.506788 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca65c701-253a-49d5-8ca5-82533b5c0995" (UID: "ca65c701-253a-49d5-8ca5-82533b5c0995"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.569454 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc4kq\" (UniqueName: \"kubernetes.io/projected/ca65c701-253a-49d5-8ca5-82533b5c0995-kube-api-access-qc4kq\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.569489 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-scripts\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.569502 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.569512 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca65c701-253a-49d5-8ca5-82533b5c0995-config-data\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.980763 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kbdtw" event={"ID":"ca65c701-253a-49d5-8ca5-82533b5c0995","Type":"ContainerDied","Data":"336a6875bc93481d9da16aab12ab5498b9b0bd248722f680c0a65dd748b38953"}
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.981117 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="336a6875bc93481d9da16aab12ab5498b9b0bd248722f680c0a65dd748b38953"
Apr 06 12:20:20 crc kubenswrapper[4790]: I0406 12:20:20.980848 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kbdtw"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.101345 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Apr 06 12:20:21 crc kubenswrapper[4790]: E0406 12:20:21.101913 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccb2d6f-cb57-40af-96b7-fd082306b586" containerName="oc"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.101940 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccb2d6f-cb57-40af-96b7-fd082306b586" containerName="oc"
Apr 06 12:20:21 crc kubenswrapper[4790]: E0406 12:20:21.101977 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca65c701-253a-49d5-8ca5-82533b5c0995" containerName="nova-cell0-conductor-db-sync"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.101997 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca65c701-253a-49d5-8ca5-82533b5c0995" containerName="nova-cell0-conductor-db-sync"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.102256 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca65c701-253a-49d5-8ca5-82533b5c0995" containerName="nova-cell0-conductor-db-sync"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.102287 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccb2d6f-cb57-40af-96b7-fd082306b586" containerName="oc"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.103088 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.105615 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.106025 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jnks7"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.115405 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.180548 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c00065f6-b85e-4c34-b245-b13a517522b5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c00065f6-b85e-4c34-b245-b13a517522b5\") " pod="openstack/nova-cell0-conductor-0"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.180653 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrz5r\" (UniqueName: \"kubernetes.io/projected/c00065f6-b85e-4c34-b245-b13a517522b5-kube-api-access-lrz5r\") pod \"nova-cell0-conductor-0\" (UID: \"c00065f6-b85e-4c34-b245-b13a517522b5\") " pod="openstack/nova-cell0-conductor-0"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.180674 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c00065f6-b85e-4c34-b245-b13a517522b5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c00065f6-b85e-4c34-b245-b13a517522b5\") " pod="openstack/nova-cell0-conductor-0"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.283540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c00065f6-b85e-4c34-b245-b13a517522b5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c00065f6-b85e-4c34-b245-b13a517522b5\") " pod="openstack/nova-cell0-conductor-0"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.283661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrz5r\" (UniqueName: \"kubernetes.io/projected/c00065f6-b85e-4c34-b245-b13a517522b5-kube-api-access-lrz5r\") pod \"nova-cell0-conductor-0\" (UID: \"c00065f6-b85e-4c34-b245-b13a517522b5\") " pod="openstack/nova-cell0-conductor-0"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.283699 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c00065f6-b85e-4c34-b245-b13a517522b5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c00065f6-b85e-4c34-b245-b13a517522b5\") " pod="openstack/nova-cell0-conductor-0"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.288644 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c00065f6-b85e-4c34-b245-b13a517522b5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c00065f6-b85e-4c34-b245-b13a517522b5\") " pod="openstack/nova-cell0-conductor-0"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.289959 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c00065f6-b85e-4c34-b245-b13a517522b5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c00065f6-b85e-4c34-b245-b13a517522b5\") " pod="openstack/nova-cell0-conductor-0"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.299645 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrz5r\" (UniqueName: \"kubernetes.io/projected/c00065f6-b85e-4c34-b245-b13a517522b5-kube-api-access-lrz5r\") pod \"nova-cell0-conductor-0\" (UID: \"c00065f6-b85e-4c34-b245-b13a517522b5\") " pod="openstack/nova-cell0-conductor-0"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.421988 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.862882 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Apr 06 12:20:21 crc kubenswrapper[4790]: W0406 12:20:21.864042 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc00065f6_b85e_4c34_b245_b13a517522b5.slice/crio-d431a2afb7f6e244170dda42cd93805d45e80bd38e2cd72994b35c799066f97a WatchSource:0}: Error finding container d431a2afb7f6e244170dda42cd93805d45e80bd38e2cd72994b35c799066f97a: Status 404 returned error can't find the container with id d431a2afb7f6e244170dda42cd93805d45e80bd38e2cd72994b35c799066f97a
Apr 06 12:20:21 crc kubenswrapper[4790]: I0406 12:20:21.991722 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c00065f6-b85e-4c34-b245-b13a517522b5","Type":"ContainerStarted","Data":"d431a2afb7f6e244170dda42cd93805d45e80bd38e2cd72994b35c799066f97a"}
Apr 06 12:20:22 crc kubenswrapper[4790]: I0406 12:20:22.820243 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 06 12:20:22 crc kubenswrapper[4790]: I0406 12:20:22.924131 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Apr 06 12:20:23 crc kubenswrapper[4790]: I0406 12:20:23.001307 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c00065f6-b85e-4c34-b245-b13a517522b5","Type":"ContainerStarted","Data":"6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a"}
Apr 06 12:20:23 crc kubenswrapper[4790]: I0406 12:20:23.001396 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Apr 06 12:20:23 crc kubenswrapper[4790]: I0406 12:20:23.024202 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.024185958 podStartE2EDuration="2.024185958s" podCreationTimestamp="2026-04-06 12:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:20:23.020630566 +0000 UTC m=+1402.008373442" watchObservedRunningTime="2026-04-06 12:20:23.024185958 +0000 UTC m=+1402.011928824"
Apr 06 12:20:24 crc kubenswrapper[4790]: I0406 12:20:24.011081 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="c00065f6-b85e-4c34-b245-b13a517522b5" containerName="nova-cell0-conductor-conductor" containerID="cri-o://6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" gracePeriod=30
Apr 06 12:20:25 crc kubenswrapper[4790]: I0406 12:20:25.675951 4790 scope.go:117] "RemoveContainer" containerID="c6f0a86d90b7a2f0c0b3e361d551993ba37e5efa12ce4db035f185d73e7fa791"
Apr 06 12:20:25 crc kubenswrapper[4790]: E0406 12:20:25.676546 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(bbabffd5-cc02-4c40-adec-25c11548ed44)\"" pod="openstack/watcher-decision-engine-0" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44"
Apr 06 12:20:31 crc kubenswrapper[4790]: E0406 12:20:31.424093 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 06 12:20:31 crc kubenswrapper[4790]: E0406 12:20:31.425846 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 06 12:20:31 crc kubenswrapper[4790]: E0406 12:20:31.427250 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Apr 06 12:20:31 crc kubenswrapper[4790]: E0406 12:20:31.427301 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c00065f6-b85e-4c34-b245-b13a517522b5" containerName="nova-cell0-conductor-conductor"
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.128516 4790 generic.go:334] "Generic (PLEG): container finished" podID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerID="a4f850013477db35327f02957caab550b30a29b39cdbcffecb80b476af2c1540" exitCode=137
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.128564 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7385730e-1494-4372-bb15-ab0cb3de13a6","Type":"ContainerDied","Data":"a4f850013477db35327f02957caab550b30a29b39cdbcffecb80b476af2c1540"}
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.129067 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7385730e-1494-4372-bb15-ab0cb3de13a6","Type":"ContainerDied","Data":"b792bfe51b5547cf149faa2edbb7468a049c602a86761f628d3bb11366a49f15"}
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.129082 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b792bfe51b5547cf149faa2edbb7468a049c602a86761f628d3bb11366a49f15"
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.162380 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.270322 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-config-data\") pod \"7385730e-1494-4372-bb15-ab0cb3de13a6\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") "
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.270429 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-combined-ca-bundle\") pod \"7385730e-1494-4372-bb15-ab0cb3de13a6\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") "
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.270463 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-scripts\") pod \"7385730e-1494-4372-bb15-ab0cb3de13a6\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") "
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.270497 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7385730e-1494-4372-bb15-ab0cb3de13a6-log-httpd\") pod \"7385730e-1494-4372-bb15-ab0cb3de13a6\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") "
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.270536 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8skqn\" (UniqueName: \"kubernetes.io/projected/7385730e-1494-4372-bb15-ab0cb3de13a6-kube-api-access-8skqn\") pod \"7385730e-1494-4372-bb15-ab0cb3de13a6\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") "
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.270602 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7385730e-1494-4372-bb15-ab0cb3de13a6-run-httpd\") pod \"7385730e-1494-4372-bb15-ab0cb3de13a6\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") "
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.270655 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-sg-core-conf-yaml\") pod \"7385730e-1494-4372-bb15-ab0cb3de13a6\" (UID: \"7385730e-1494-4372-bb15-ab0cb3de13a6\") "
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.271648 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7385730e-1494-4372-bb15-ab0cb3de13a6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7385730e-1494-4372-bb15-ab0cb3de13a6" (UID: "7385730e-1494-4372-bb15-ab0cb3de13a6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.272259 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7385730e-1494-4372-bb15-ab0cb3de13a6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7385730e-1494-4372-bb15-ab0cb3de13a6" (UID: "7385730e-1494-4372-bb15-ab0cb3de13a6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.276060 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7385730e-1494-4372-bb15-ab0cb3de13a6-kube-api-access-8skqn" (OuterVolumeSpecName: "kube-api-access-8skqn") pod "7385730e-1494-4372-bb15-ab0cb3de13a6" (UID: "7385730e-1494-4372-bb15-ab0cb3de13a6"). InnerVolumeSpecName "kube-api-access-8skqn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.277608 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-scripts" (OuterVolumeSpecName: "scripts") pod "7385730e-1494-4372-bb15-ab0cb3de13a6" (UID: "7385730e-1494-4372-bb15-ab0cb3de13a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.323534 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7385730e-1494-4372-bb15-ab0cb3de13a6" (UID: "7385730e-1494-4372-bb15-ab0cb3de13a6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.370009 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7385730e-1494-4372-bb15-ab0cb3de13a6" (UID: "7385730e-1494-4372-bb15-ab0cb3de13a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.372961 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.373103 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.375296 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-scripts\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.375423 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7385730e-1494-4372-bb15-ab0cb3de13a6-log-httpd\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.375506 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8skqn\" (UniqueName: \"kubernetes.io/projected/7385730e-1494-4372-bb15-ab0cb3de13a6-kube-api-access-8skqn\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.375586 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7385730e-1494-4372-bb15-ab0cb3de13a6-run-httpd\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.390190 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-config-data" (OuterVolumeSpecName: "config-data") pod "7385730e-1494-4372-bb15-ab0cb3de13a6" (UID: "7385730e-1494-4372-bb15-ab0cb3de13a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:20:35 crc kubenswrapper[4790]: I0406 12:20:35.477128 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7385730e-1494-4372-bb15-ab0cb3de13a6-config-data\") on node \"crc\" DevicePath \"\""
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.137434 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.166298 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.179765 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.189589 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Apr 06 12:20:36 crc kubenswrapper[4790]: E0406 12:20:36.190037 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="ceilometer-notification-agent"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.190061 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="ceilometer-notification-agent"
Apr 06 12:20:36 crc kubenswrapper[4790]: E0406 12:20:36.190076 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="sg-core"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.190083 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="sg-core"
Apr 06 12:20:36 crc kubenswrapper[4790]: E0406 12:20:36.190098 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="ceilometer-central-agent"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.190103 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="ceilometer-central-agent"
Apr 06 12:20:36 crc kubenswrapper[4790]: E0406 12:20:36.190123 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="proxy-httpd"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.190129 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="proxy-httpd"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.190300 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="sg-core"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.190319 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="proxy-httpd"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.190337 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="ceilometer-notification-agent"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.190348 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" containerName="ceilometer-central-agent"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.192072 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.194204 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.196893 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.213154 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.291245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-log-httpd\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.291430 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6skn6\" (UniqueName: \"kubernetes.io/projected/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-kube-api-access-6skn6\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.291525 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.291654 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-config-data\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.291779 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.291941 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-run-httpd\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.292091 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-scripts\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.393875 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.393947 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-run-httpd\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.393999 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-scripts\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.394054 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-log-httpd\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.394091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6skn6\" (UniqueName: \"kubernetes.io/projected/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-kube-api-access-6skn6\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.394116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.394142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-config-data\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.394361 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-run-httpd\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.394577 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-log-httpd\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.401026 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-config-data\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.401803 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.405473 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-scripts\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.408929 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0"
Apr 06 12:20:36 crc kubenswrapper[4790]: E0406 12:20:36.429339 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit
code -1" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Apr 06 12:20:36 crc kubenswrapper[4790]: E0406 12:20:36.430984 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Apr 06 12:20:36 crc kubenswrapper[4790]: E0406 12:20:36.432295 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Apr 06 12:20:36 crc kubenswrapper[4790]: E0406 12:20:36.432341 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c00065f6-b85e-4c34-b245-b13a517522b5" containerName="nova-cell0-conductor-conductor" Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.438912 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6skn6\" (UniqueName: \"kubernetes.io/projected/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-kube-api-access-6skn6\") pod \"ceilometer-0\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") " pod="openstack/ceilometer-0" Apr 06 12:20:36 crc kubenswrapper[4790]: I0406 12:20:36.512453 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:20:37 crc kubenswrapper[4790]: I0406 12:20:37.021494 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:20:37 crc kubenswrapper[4790]: I0406 12:20:37.147529 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41","Type":"ContainerStarted","Data":"dc9bc6f32547a260f6f8540bb338d37ea1a78ba805c4a692d53073d495df4042"} Apr 06 12:20:37 crc kubenswrapper[4790]: I0406 12:20:37.691750 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7385730e-1494-4372-bb15-ab0cb3de13a6" path="/var/lib/kubelet/pods/7385730e-1494-4372-bb15-ab0cb3de13a6/volumes" Apr 06 12:20:38 crc kubenswrapper[4790]: I0406 12:20:38.164805 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41","Type":"ContainerStarted","Data":"571317242b82d964f97bc0861956ca6e7b71a5f8a05f70dfdc5becb2f16cf83e"} Apr 06 12:20:38 crc kubenswrapper[4790]: I0406 12:20:38.164928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41","Type":"ContainerStarted","Data":"d76422ecb6805bd87d908e40e40cfe7b2744f8572a4d769b14ef99e2e09179c7"} Apr 06 12:20:38 crc kubenswrapper[4790]: I0406 12:20:38.674983 4790 scope.go:117] "RemoveContainer" containerID="c6f0a86d90b7a2f0c0b3e361d551993ba37e5efa12ce4db035f185d73e7fa791" Apr 06 12:20:39 crc kubenswrapper[4790]: I0406 12:20:39.180074 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41","Type":"ContainerStarted","Data":"d7ebac945324e1d3970bd05a7428bf42aa33864abe32dc70e1fd5a5d20fb496c"} Apr 06 12:20:39 crc kubenswrapper[4790]: I0406 12:20:39.182409 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"bbabffd5-cc02-4c40-adec-25c11548ed44","Type":"ContainerStarted","Data":"c858255e7277056de6c1035e9a9f27119140998e9f1caf9f8b871e347259a8f5"} Apr 06 12:20:41 crc kubenswrapper[4790]: I0406 12:20:41.202006 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41","Type":"ContainerStarted","Data":"af26e769ba3a86afe83a7de61d1ef7c66c0dd157c41639008c693bec62c0085b"} Apr 06 12:20:41 crc kubenswrapper[4790]: I0406 12:20:41.202536 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Apr 06 12:20:41 crc kubenswrapper[4790]: I0406 12:20:41.232732 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8573425879999999 podStartE2EDuration="5.232710814s" podCreationTimestamp="2026-04-06 12:20:36 +0000 UTC" firstStartedPulling="2026-04-06 12:20:37.018752678 +0000 UTC m=+1416.006495544" lastFinishedPulling="2026-04-06 12:20:40.394120864 +0000 UTC m=+1419.381863770" observedRunningTime="2026-04-06 12:20:41.221570707 +0000 UTC m=+1420.209313573" watchObservedRunningTime="2026-04-06 12:20:41.232710814 +0000 UTC m=+1420.220453680" Apr 06 12:20:41 crc kubenswrapper[4790]: E0406 12:20:41.425181 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Apr 06 12:20:41 crc kubenswrapper[4790]: E0406 12:20:41.427162 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Apr 06 12:20:41 crc kubenswrapper[4790]: E0406 12:20:41.428381 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Apr 06 12:20:41 crc kubenswrapper[4790]: E0406 12:20:41.428409 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c00065f6-b85e-4c34-b245-b13a517522b5" containerName="nova-cell0-conductor-conductor" Apr 06 12:20:44 crc kubenswrapper[4790]: I0406 12:20:44.317944 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Apr 06 12:20:44 crc kubenswrapper[4790]: I0406 12:20:44.318309 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Apr 06 12:20:44 crc kubenswrapper[4790]: I0406 12:20:44.348662 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Apr 06 12:20:45 crc kubenswrapper[4790]: I0406 12:20:45.268441 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Apr 06 12:20:45 crc kubenswrapper[4790]: I0406 12:20:45.312689 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Apr 06 12:20:46 crc kubenswrapper[4790]: E0406 12:20:46.425036 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Apr 06 12:20:46 crc kubenswrapper[4790]: E0406 12:20:46.426218 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Apr 06 12:20:46 crc kubenswrapper[4790]: E0406 12:20:46.427738 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Apr 06 12:20:46 crc kubenswrapper[4790]: E0406 12:20:46.427782 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c00065f6-b85e-4c34-b245-b13a517522b5" containerName="nova-cell0-conductor-conductor" Apr 06 12:20:47 crc kubenswrapper[4790]: I0406 12:20:47.258483 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" containerID="cri-o://c858255e7277056de6c1035e9a9f27119140998e9f1caf9f8b871e347259a8f5" gracePeriod=30 Apr 06 12:20:47 crc kubenswrapper[4790]: I0406 12:20:47.465524 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Apr 06 12:20:47 crc kubenswrapper[4790]: I0406 12:20:47.465742 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" 
podUID="b60a0169-fd9c-4677-92e1-a09453bea104" containerName="watcher-applier" containerID="cri-o://4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0" gracePeriod=30 Apr 06 12:20:47 crc kubenswrapper[4790]: I0406 12:20:47.492132 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Apr 06 12:20:47 crc kubenswrapper[4790]: I0406 12:20:47.492369 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="e95ab128-e1ba-4110-8a41-cf5975e3655d" containerName="watcher-api-log" containerID="cri-o://a4e7b78989ccf895b8d0eb92bad17ac9d8dfaa46bb07c48104540a7db8c71e8f" gracePeriod=30 Apr 06 12:20:47 crc kubenswrapper[4790]: I0406 12:20:47.492443 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="e95ab128-e1ba-4110-8a41-cf5975e3655d" containerName="watcher-api" containerID="cri-o://d645708a60606639082464824ac049745ed311edacd34096b43dbb1aeacada33" gracePeriod=30 Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.270046 4790 generic.go:334] "Generic (PLEG): container finished" podID="e95ab128-e1ba-4110-8a41-cf5975e3655d" containerID="d645708a60606639082464824ac049745ed311edacd34096b43dbb1aeacada33" exitCode=0 Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.270330 4790 generic.go:334] "Generic (PLEG): container finished" podID="e95ab128-e1ba-4110-8a41-cf5975e3655d" containerID="a4e7b78989ccf895b8d0eb92bad17ac9d8dfaa46bb07c48104540a7db8c71e8f" exitCode=143 Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.270352 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"e95ab128-e1ba-4110-8a41-cf5975e3655d","Type":"ContainerDied","Data":"d645708a60606639082464824ac049745ed311edacd34096b43dbb1aeacada33"} Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.270379 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"e95ab128-e1ba-4110-8a41-cf5975e3655d","Type":"ContainerDied","Data":"a4e7b78989ccf895b8d0eb92bad17ac9d8dfaa46bb07c48104540a7db8c71e8f"} Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.406510 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.526405 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-config-data\") pod \"e95ab128-e1ba-4110-8a41-cf5975e3655d\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.526481 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-custom-prometheus-ca\") pod \"e95ab128-e1ba-4110-8a41-cf5975e3655d\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.526504 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-combined-ca-bundle\") pod \"e95ab128-e1ba-4110-8a41-cf5975e3655d\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.526549 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-public-tls-certs\") pod \"e95ab128-e1ba-4110-8a41-cf5975e3655d\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.526617 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ab128-e1ba-4110-8a41-cf5975e3655d-logs\") pod 
\"e95ab128-e1ba-4110-8a41-cf5975e3655d\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.526683 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t56hp\" (UniqueName: \"kubernetes.io/projected/e95ab128-e1ba-4110-8a41-cf5975e3655d-kube-api-access-t56hp\") pod \"e95ab128-e1ba-4110-8a41-cf5975e3655d\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.526722 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-internal-tls-certs\") pod \"e95ab128-e1ba-4110-8a41-cf5975e3655d\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.528217 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e95ab128-e1ba-4110-8a41-cf5975e3655d-logs" (OuterVolumeSpecName: "logs") pod "e95ab128-e1ba-4110-8a41-cf5975e3655d" (UID: "e95ab128-e1ba-4110-8a41-cf5975e3655d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.550230 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e95ab128-e1ba-4110-8a41-cf5975e3655d-kube-api-access-t56hp" (OuterVolumeSpecName: "kube-api-access-t56hp") pod "e95ab128-e1ba-4110-8a41-cf5975e3655d" (UID: "e95ab128-e1ba-4110-8a41-cf5975e3655d"). InnerVolumeSpecName "kube-api-access-t56hp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.566964 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "e95ab128-e1ba-4110-8a41-cf5975e3655d" (UID: "e95ab128-e1ba-4110-8a41-cf5975e3655d"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.569504 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e95ab128-e1ba-4110-8a41-cf5975e3655d" (UID: "e95ab128-e1ba-4110-8a41-cf5975e3655d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.585769 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e95ab128-e1ba-4110-8a41-cf5975e3655d" (UID: "e95ab128-e1ba-4110-8a41-cf5975e3655d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:20:48 crc kubenswrapper[4790]: E0406 12:20:48.592203 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-internal-tls-certs podName:e95ab128-e1ba-4110-8a41-cf5975e3655d nodeName:}" failed. No retries permitted until 2026-04-06 12:20:49.092177151 +0000 UTC m=+1428.079920017 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-internal-tls-certs") pod "e95ab128-e1ba-4110-8a41-cf5975e3655d" (UID: "e95ab128-e1ba-4110-8a41-cf5975e3655d") : error deleting /var/lib/kubelet/pods/e95ab128-e1ba-4110-8a41-cf5975e3655d/volume-subpaths: remove /var/lib/kubelet/pods/e95ab128-e1ba-4110-8a41-cf5975e3655d/volume-subpaths: no such file or directory Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.594917 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-config-data" (OuterVolumeSpecName: "config-data") pod "e95ab128-e1ba-4110-8a41-cf5975e3655d" (UID: "e95ab128-e1ba-4110-8a41-cf5975e3655d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.629194 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.629229 4790 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.629239 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.629247 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:48 crc kubenswrapper[4790]: 
I0406 12:20:48.629255 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e95ab128-e1ba-4110-8a41-cf5975e3655d-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:48 crc kubenswrapper[4790]: I0406 12:20:48.629263 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t56hp\" (UniqueName: \"kubernetes.io/projected/e95ab128-e1ba-4110-8a41-cf5975e3655d-kube-api-access-t56hp\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.138750 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-internal-tls-certs\") pod \"e95ab128-e1ba-4110-8a41-cf5975e3655d\" (UID: \"e95ab128-e1ba-4110-8a41-cf5975e3655d\") " Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.145627 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e95ab128-e1ba-4110-8a41-cf5975e3655d" (UID: "e95ab128-e1ba-4110-8a41-cf5975e3655d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:20:49 crc kubenswrapper[4790]: E0406 12:20:49.222573 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Apr 06 12:20:49 crc kubenswrapper[4790]: E0406 12:20:49.224013 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Apr 06 12:20:49 crc kubenswrapper[4790]: E0406 12:20:49.225542 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Apr 06 12:20:49 crc kubenswrapper[4790]: E0406 12:20:49.225661 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="b60a0169-fd9c-4677-92e1-a09453bea104" containerName="watcher-applier" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.241843 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e95ab128-e1ba-4110-8a41-cf5975e3655d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.282482 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/watcher-api-0" event={"ID":"e95ab128-e1ba-4110-8a41-cf5975e3655d","Type":"ContainerDied","Data":"55931a5f66d163a71d214d04d4a1a799488e45d8e8f8ce56a01f943850e38dd4"} Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.282758 4790 scope.go:117] "RemoveContainer" containerID="d645708a60606639082464824ac049745ed311edacd34096b43dbb1aeacada33" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.282533 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.306646 4790 scope.go:117] "RemoveContainer" containerID="a4e7b78989ccf895b8d0eb92bad17ac9d8dfaa46bb07c48104540a7db8c71e8f" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.323475 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.335498 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.344947 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Apr 06 12:20:49 crc kubenswrapper[4790]: E0406 12:20:49.345346 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e95ab128-e1ba-4110-8a41-cf5975e3655d" containerName="watcher-api-log" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.345363 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95ab128-e1ba-4110-8a41-cf5975e3655d" containerName="watcher-api-log" Apr 06 12:20:49 crc kubenswrapper[4790]: E0406 12:20:49.345377 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e95ab128-e1ba-4110-8a41-cf5975e3655d" containerName="watcher-api" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.345384 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95ab128-e1ba-4110-8a41-cf5975e3655d" containerName="watcher-api" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 
12:20:49.345567 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e95ab128-e1ba-4110-8a41-cf5975e3655d" containerName="watcher-api" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.345596 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e95ab128-e1ba-4110-8a41-cf5975e3655d" containerName="watcher-api-log" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.346679 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.349262 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.349608 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.349805 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.361331 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.446500 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b614b0dd-0285-4907-8e74-051e3ef0b3a1-logs\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.446583 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-public-tls-certs\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.446839 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.447038 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.447066 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.447089 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-config-data\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.447253 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjg7d\" (UniqueName: \"kubernetes.io/projected/b614b0dd-0285-4907-8e74-051e3ef0b3a1-kube-api-access-fjg7d\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.549182 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.549224 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.549244 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-config-data\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.549287 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjg7d\" (UniqueName: \"kubernetes.io/projected/b614b0dd-0285-4907-8e74-051e3ef0b3a1-kube-api-access-fjg7d\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.549341 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b614b0dd-0285-4907-8e74-051e3ef0b3a1-logs\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.549386 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-public-tls-certs\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " 
pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.549415 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.549897 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b614b0dd-0285-4907-8e74-051e3ef0b3a1-logs\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.553917 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-public-tls-certs\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.553934 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.555748 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.556092 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.560076 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b614b0dd-0285-4907-8e74-051e3ef0b3a1-config-data\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.571623 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjg7d\" (UniqueName: \"kubernetes.io/projected/b614b0dd-0285-4907-8e74-051e3ef0b3a1-kube-api-access-fjg7d\") pod \"watcher-api-0\" (UID: \"b614b0dd-0285-4907-8e74-051e3ef0b3a1\") " pod="openstack/watcher-api-0" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.684999 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e95ab128-e1ba-4110-8a41-cf5975e3655d" path="/var/lib/kubelet/pods/e95ab128-e1ba-4110-8a41-cf5975e3655d/volumes" Apr 06 12:20:49 crc kubenswrapper[4790]: I0406 12:20:49.703360 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Apr 06 12:20:50 crc kubenswrapper[4790]: I0406 12:20:50.151553 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Apr 06 12:20:50 crc kubenswrapper[4790]: I0406 12:20:50.293209 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b614b0dd-0285-4907-8e74-051e3ef0b3a1","Type":"ContainerStarted","Data":"9dadedabc197e1fc0660c6d02cb0651f8e8e16c4f515ea9883647988527af5fe"} Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.304135 4790 generic.go:334] "Generic (PLEG): container finished" podID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerID="c858255e7277056de6c1035e9a9f27119140998e9f1caf9f8b871e347259a8f5" exitCode=0 Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.304232 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"bbabffd5-cc02-4c40-adec-25c11548ed44","Type":"ContainerDied","Data":"c858255e7277056de6c1035e9a9f27119140998e9f1caf9f8b871e347259a8f5"} Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.305440 4790 scope.go:117] "RemoveContainer" containerID="c6f0a86d90b7a2f0c0b3e361d551993ba37e5efa12ce4db035f185d73e7fa791" Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.307773 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b614b0dd-0285-4907-8e74-051e3ef0b3a1","Type":"ContainerStarted","Data":"8a1724945008090257f1fab27170aea0e60322044332cc64ee7dd9a502d8e029"} Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.307876 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b614b0dd-0285-4907-8e74-051e3ef0b3a1","Type":"ContainerStarted","Data":"6f83c3e40d82a612712a9ed7c63799b0704af8bfd8c5746a15a49e915fe3c82c"} Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.308196 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/watcher-api-0" Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.333028 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.333011375 podStartE2EDuration="2.333011375s" podCreationTimestamp="2026-04-06 12:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:20:51.331296725 +0000 UTC m=+1430.319039601" watchObservedRunningTime="2026-04-06 12:20:51.333011375 +0000 UTC m=+1430.320754241" Apr 06 12:20:51 crc kubenswrapper[4790]: E0406 12:20:51.424439 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Apr 06 12:20:51 crc kubenswrapper[4790]: E0406 12:20:51.425864 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Apr 06 12:20:51 crc kubenswrapper[4790]: E0406 12:20:51.427140 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Apr 06 12:20:51 crc kubenswrapper[4790]: E0406 12:20:51.427167 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c00065f6-b85e-4c34-b245-b13a517522b5" containerName="nova-cell0-conductor-conductor" Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.686740 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.790996 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbabffd5-cc02-4c40-adec-25c11548ed44-logs\") pod \"bbabffd5-cc02-4c40-adec-25c11548ed44\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.791041 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-custom-prometheus-ca\") pod \"bbabffd5-cc02-4c40-adec-25c11548ed44\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.791061 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-config-data\") pod \"bbabffd5-cc02-4c40-adec-25c11548ed44\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.791128 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-combined-ca-bundle\") pod \"bbabffd5-cc02-4c40-adec-25c11548ed44\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.791170 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hsp4\" (UniqueName: 
\"kubernetes.io/projected/bbabffd5-cc02-4c40-adec-25c11548ed44-kube-api-access-4hsp4\") pod \"bbabffd5-cc02-4c40-adec-25c11548ed44\" (UID: \"bbabffd5-cc02-4c40-adec-25c11548ed44\") " Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.791397 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbabffd5-cc02-4c40-adec-25c11548ed44-logs" (OuterVolumeSpecName: "logs") pod "bbabffd5-cc02-4c40-adec-25c11548ed44" (UID: "bbabffd5-cc02-4c40-adec-25c11548ed44"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.791761 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbabffd5-cc02-4c40-adec-25c11548ed44-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.795665 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbabffd5-cc02-4c40-adec-25c11548ed44-kube-api-access-4hsp4" (OuterVolumeSpecName: "kube-api-access-4hsp4") pod "bbabffd5-cc02-4c40-adec-25c11548ed44" (UID: "bbabffd5-cc02-4c40-adec-25c11548ed44"). InnerVolumeSpecName "kube-api-access-4hsp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.819683 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "bbabffd5-cc02-4c40-adec-25c11548ed44" (UID: "bbabffd5-cc02-4c40-adec-25c11548ed44"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.821569 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbabffd5-cc02-4c40-adec-25c11548ed44" (UID: "bbabffd5-cc02-4c40-adec-25c11548ed44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.859204 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-config-data" (OuterVolumeSpecName: "config-data") pod "bbabffd5-cc02-4c40-adec-25c11548ed44" (UID: "bbabffd5-cc02-4c40-adec-25c11548ed44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.893593 4790 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.893625 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.893634 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbabffd5-cc02-4c40-adec-25c11548ed44-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:51 crc kubenswrapper[4790]: I0406 12:20:51.893644 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hsp4\" (UniqueName: \"kubernetes.io/projected/bbabffd5-cc02-4c40-adec-25c11548ed44-kube-api-access-4hsp4\") on node 
\"crc\" DevicePath \"\"" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.318042 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"bbabffd5-cc02-4c40-adec-25c11548ed44","Type":"ContainerDied","Data":"9e2e46c72f21b2c5702c75e5df201bc0002b8c687efef600f4cdddb10bddda76"} Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.318102 4790 scope.go:117] "RemoveContainer" containerID="c858255e7277056de6c1035e9a9f27119140998e9f1caf9f8b871e347259a8f5" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.318101 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.361131 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.373469 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.385276 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Apr 06 12:20:52 crc kubenswrapper[4790]: E0406 12:20:52.385707 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.385726 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" Apr 06 12:20:52 crc kubenswrapper[4790]: E0406 12:20:52.385739 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.385745 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" 
containerName="watcher-decision-engine" Apr 06 12:20:52 crc kubenswrapper[4790]: E0406 12:20:52.385753 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.385758 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" Apr 06 12:20:52 crc kubenswrapper[4790]: E0406 12:20:52.385777 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.385783 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.386000 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.386019 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.386031 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.386048 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.386642 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.388775 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.406532 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.503276 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-config-data\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.503338 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l2c5\" (UniqueName: \"kubernetes.io/projected/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-kube-api-access-4l2c5\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.503373 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.503430 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " 
pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.503509 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-logs\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.605413 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-config-data\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.605494 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l2c5\" (UniqueName: \"kubernetes.io/projected/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-kube-api-access-4l2c5\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.605534 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.605580 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc 
kubenswrapper[4790]: I0406 12:20:52.605693 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-logs\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.606764 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-logs\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.612068 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.617225 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-config-data\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.625258 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.633308 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l2c5\" 
(UniqueName: \"kubernetes.io/projected/6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4-kube-api-access-4l2c5\") pod \"watcher-decision-engine-0\" (UID: \"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4\") " pod="openstack/watcher-decision-engine-0" Apr 06 12:20:52 crc kubenswrapper[4790]: I0406 12:20:52.712140 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Apr 06 12:20:53 crc kubenswrapper[4790]: I0406 12:20:53.227426 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Apr 06 12:20:53 crc kubenswrapper[4790]: W0406 12:20:53.230998 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f7f1a8b_0f45_447c_8cef_e0701c1ce1e4.slice/crio-7a410bf44943a249afdaa96d041c0d053ecce4c58c685ca31e7cdc1e622a9b7a WatchSource:0}: Error finding container 7a410bf44943a249afdaa96d041c0d053ecce4c58c685ca31e7cdc1e622a9b7a: Status 404 returned error can't find the container with id 7a410bf44943a249afdaa96d041c0d053ecce4c58c685ca31e7cdc1e622a9b7a Apr 06 12:20:53 crc kubenswrapper[4790]: I0406 12:20:53.329704 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4","Type":"ContainerStarted","Data":"7a410bf44943a249afdaa96d041c0d053ecce4c58c685ca31e7cdc1e622a9b7a"} Apr 06 12:20:53 crc kubenswrapper[4790]: I0406 12:20:53.586004 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Apr 06 12:20:53 crc kubenswrapper[4790]: I0406 12:20:53.687913 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" path="/var/lib/kubelet/pods/bbabffd5-cc02-4c40-adec-25c11548ed44/volumes" Apr 06 12:20:54 crc kubenswrapper[4790]: E0406 12:20:54.223900 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Apr 06 12:20:54 crc kubenswrapper[4790]: E0406 12:20:54.225551 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Apr 06 12:20:54 crc kubenswrapper[4790]: E0406 12:20:54.227397 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Apr 06 12:20:54 crc kubenswrapper[4790]: E0406 12:20:54.227428 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="b60a0169-fd9c-4677-92e1-a09453bea104" containerName="watcher-applier" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.354502 4790 generic.go:334] "Generic (PLEG): container finished" podID="c00065f6-b85e-4c34-b245-b13a517522b5" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" exitCode=137 Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.354569 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c00065f6-b85e-4c34-b245-b13a517522b5","Type":"ContainerDied","Data":"6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a"} Apr 06 12:20:54 crc kubenswrapper[4790]: 
I0406 12:20:54.354931 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c00065f6-b85e-4c34-b245-b13a517522b5","Type":"ContainerDied","Data":"d431a2afb7f6e244170dda42cd93805d45e80bd38e2cd72994b35c799066f97a"} Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.354948 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d431a2afb7f6e244170dda42cd93805d45e80bd38e2cd72994b35c799066f97a" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.356601 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4","Type":"ContainerStarted","Data":"5becf3efaf2136c17587642641bd5b54e0a3093950a8010e8d2163c7f0422314"} Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.383551 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.3835262520000002 podStartE2EDuration="2.383526252s" podCreationTimestamp="2026-04-06 12:20:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:20:54.372663576 +0000 UTC m=+1433.360406442" watchObservedRunningTime="2026-04-06 12:20:54.383526252 +0000 UTC m=+1433.371269118" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.439775 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.575142 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c00065f6-b85e-4c34-b245-b13a517522b5-config-data\") pod \"c00065f6-b85e-4c34-b245-b13a517522b5\" (UID: \"c00065f6-b85e-4c34-b245-b13a517522b5\") " Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.575230 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrz5r\" (UniqueName: \"kubernetes.io/projected/c00065f6-b85e-4c34-b245-b13a517522b5-kube-api-access-lrz5r\") pod \"c00065f6-b85e-4c34-b245-b13a517522b5\" (UID: \"c00065f6-b85e-4c34-b245-b13a517522b5\") " Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.575291 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c00065f6-b85e-4c34-b245-b13a517522b5-combined-ca-bundle\") pod \"c00065f6-b85e-4c34-b245-b13a517522b5\" (UID: \"c00065f6-b85e-4c34-b245-b13a517522b5\") " Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.580616 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c00065f6-b85e-4c34-b245-b13a517522b5-kube-api-access-lrz5r" (OuterVolumeSpecName: "kube-api-access-lrz5r") pod "c00065f6-b85e-4c34-b245-b13a517522b5" (UID: "c00065f6-b85e-4c34-b245-b13a517522b5"). InnerVolumeSpecName "kube-api-access-lrz5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.613640 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c00065f6-b85e-4c34-b245-b13a517522b5-config-data" (OuterVolumeSpecName: "config-data") pod "c00065f6-b85e-4c34-b245-b13a517522b5" (UID: "c00065f6-b85e-4c34-b245-b13a517522b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.615201 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c00065f6-b85e-4c34-b245-b13a517522b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c00065f6-b85e-4c34-b245-b13a517522b5" (UID: "c00065f6-b85e-4c34-b245-b13a517522b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.665644 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5jd2n"] Apr 06 12:20:54 crc kubenswrapper[4790]: E0406 12:20:54.666223 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c00065f6-b85e-4c34-b245-b13a517522b5" containerName="nova-cell0-conductor-conductor" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.666260 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c00065f6-b85e-4c34-b245-b13a517522b5" containerName="nova-cell0-conductor-conductor" Apr 06 12:20:54 crc kubenswrapper[4790]: E0406 12:20:54.666297 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.666305 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.666609 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c00065f6-b85e-4c34-b245-b13a517522b5" containerName="nova-cell0-conductor-conductor" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.666630 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbabffd5-cc02-4c40-adec-25c11548ed44" containerName="watcher-decision-engine" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 
12:20:54.668110 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.677272 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c00065f6-b85e-4c34-b245-b13a517522b5-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.677301 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrz5r\" (UniqueName: \"kubernetes.io/projected/c00065f6-b85e-4c34-b245-b13a517522b5-kube-api-access-lrz5r\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.677314 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c00065f6-b85e-4c34-b245-b13a517522b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.680241 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jd2n"] Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.703636 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.778835 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1faae3f1-4b30-4273-baa5-8b2790ddff2a-catalog-content\") pod \"redhat-marketplace-5jd2n\" (UID: \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\") " pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.778967 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtdfk\" (UniqueName: \"kubernetes.io/projected/1faae3f1-4b30-4273-baa5-8b2790ddff2a-kube-api-access-mtdfk\") 
pod \"redhat-marketplace-5jd2n\" (UID: \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\") " pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.779192 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1faae3f1-4b30-4273-baa5-8b2790ddff2a-utilities\") pod \"redhat-marketplace-5jd2n\" (UID: \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\") " pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.880486 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1faae3f1-4b30-4273-baa5-8b2790ddff2a-utilities\") pod \"redhat-marketplace-5jd2n\" (UID: \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\") " pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.880560 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1faae3f1-4b30-4273-baa5-8b2790ddff2a-catalog-content\") pod \"redhat-marketplace-5jd2n\" (UID: \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\") " pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.880581 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtdfk\" (UniqueName: \"kubernetes.io/projected/1faae3f1-4b30-4273-baa5-8b2790ddff2a-kube-api-access-mtdfk\") pod \"redhat-marketplace-5jd2n\" (UID: \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\") " pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.881566 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1faae3f1-4b30-4273-baa5-8b2790ddff2a-utilities\") pod \"redhat-marketplace-5jd2n\" 
(UID: \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\") " pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.881616 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1faae3f1-4b30-4273-baa5-8b2790ddff2a-catalog-content\") pod \"redhat-marketplace-5jd2n\" (UID: \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\") " pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:20:54 crc kubenswrapper[4790]: I0406 12:20:54.900869 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtdfk\" (UniqueName: \"kubernetes.io/projected/1faae3f1-4b30-4273-baa5-8b2790ddff2a-kube-api-access-mtdfk\") pod \"redhat-marketplace-5jd2n\" (UID: \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\") " pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.016300 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.365640 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.402782 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.428637 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.440675 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.441967 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.444476 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.444720 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jnks7" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.459675 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.522447 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jd2n"] Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.602426 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001ce2db-6829-4dcc-bf3a-b19134cd3484-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"001ce2db-6829-4dcc-bf3a-b19134cd3484\") " pod="openstack/nova-cell0-conductor-0" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.602579 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001ce2db-6829-4dcc-bf3a-b19134cd3484-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"001ce2db-6829-4dcc-bf3a-b19134cd3484\") " pod="openstack/nova-cell0-conductor-0" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.602672 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtg52\" (UniqueName: \"kubernetes.io/projected/001ce2db-6829-4dcc-bf3a-b19134cd3484-kube-api-access-xtg52\") pod \"nova-cell0-conductor-0\" (UID: \"001ce2db-6829-4dcc-bf3a-b19134cd3484\") " pod="openstack/nova-cell0-conductor-0" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 
12:20:55.690071 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c00065f6-b85e-4c34-b245-b13a517522b5" path="/var/lib/kubelet/pods/c00065f6-b85e-4c34-b245-b13a517522b5/volumes" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.705031 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001ce2db-6829-4dcc-bf3a-b19134cd3484-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"001ce2db-6829-4dcc-bf3a-b19134cd3484\") " pod="openstack/nova-cell0-conductor-0" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.705147 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtg52\" (UniqueName: \"kubernetes.io/projected/001ce2db-6829-4dcc-bf3a-b19134cd3484-kube-api-access-xtg52\") pod \"nova-cell0-conductor-0\" (UID: \"001ce2db-6829-4dcc-bf3a-b19134cd3484\") " pod="openstack/nova-cell0-conductor-0" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.705208 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001ce2db-6829-4dcc-bf3a-b19134cd3484-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"001ce2db-6829-4dcc-bf3a-b19134cd3484\") " pod="openstack/nova-cell0-conductor-0" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.712080 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001ce2db-6829-4dcc-bf3a-b19134cd3484-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"001ce2db-6829-4dcc-bf3a-b19134cd3484\") " pod="openstack/nova-cell0-conductor-0" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.712660 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001ce2db-6829-4dcc-bf3a-b19134cd3484-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: 
\"001ce2db-6829-4dcc-bf3a-b19134cd3484\") " pod="openstack/nova-cell0-conductor-0" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.727606 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtg52\" (UniqueName: \"kubernetes.io/projected/001ce2db-6829-4dcc-bf3a-b19134cd3484-kube-api-access-xtg52\") pod \"nova-cell0-conductor-0\" (UID: \"001ce2db-6829-4dcc-bf3a-b19134cd3484\") " pod="openstack/nova-cell0-conductor-0" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.766781 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Apr 06 12:20:55 crc kubenswrapper[4790]: I0406 12:20:55.902167 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Apr 06 12:20:55 crc kubenswrapper[4790]: E0406 12:20:55.974922 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1faae3f1_4b30_4273_baa5_8b2790ddff2a.slice/crio-4b2cc559c3da1317e4962888d6caf217b01c0904290090bf913c32caf168224f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1faae3f1_4b30_4273_baa5_8b2790ddff2a.slice/crio-conmon-4b2cc559c3da1317e4962888d6caf217b01c0904290090bf913c32caf168224f.scope\": RecentStats: unable to find data in memory cache]" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.009359 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b60a0169-fd9c-4677-92e1-a09453bea104-logs\") pod \"b60a0169-fd9c-4677-92e1-a09453bea104\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.009849 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b60a0169-fd9c-4677-92e1-a09453bea104-logs" (OuterVolumeSpecName: "logs") pod "b60a0169-fd9c-4677-92e1-a09453bea104" (UID: "b60a0169-fd9c-4677-92e1-a09453bea104"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.009995 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60a0169-fd9c-4677-92e1-a09453bea104-config-data\") pod \"b60a0169-fd9c-4677-92e1-a09453bea104\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.010044 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlcxr\" (UniqueName: \"kubernetes.io/projected/b60a0169-fd9c-4677-92e1-a09453bea104-kube-api-access-jlcxr\") pod \"b60a0169-fd9c-4677-92e1-a09453bea104\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.010096 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60a0169-fd9c-4677-92e1-a09453bea104-combined-ca-bundle\") pod \"b60a0169-fd9c-4677-92e1-a09453bea104\" (UID: \"b60a0169-fd9c-4677-92e1-a09453bea104\") " Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.010522 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b60a0169-fd9c-4677-92e1-a09453bea104-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.014563 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b60a0169-fd9c-4677-92e1-a09453bea104-kube-api-access-jlcxr" (OuterVolumeSpecName: "kube-api-access-jlcxr") pod "b60a0169-fd9c-4677-92e1-a09453bea104" (UID: "b60a0169-fd9c-4677-92e1-a09453bea104"). InnerVolumeSpecName "kube-api-access-jlcxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.059126 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b60a0169-fd9c-4677-92e1-a09453bea104-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b60a0169-fd9c-4677-92e1-a09453bea104" (UID: "b60a0169-fd9c-4677-92e1-a09453bea104"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.094880 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b60a0169-fd9c-4677-92e1-a09453bea104-config-data" (OuterVolumeSpecName: "config-data") pod "b60a0169-fd9c-4677-92e1-a09453bea104" (UID: "b60a0169-fd9c-4677-92e1-a09453bea104"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.113481 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b60a0169-fd9c-4677-92e1-a09453bea104-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.113515 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlcxr\" (UniqueName: \"kubernetes.io/projected/b60a0169-fd9c-4677-92e1-a09453bea104-kube-api-access-jlcxr\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.113527 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b60a0169-fd9c-4677-92e1-a09453bea104-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.265573 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.378355 4790 generic.go:334] "Generic 
(PLEG): container finished" podID="b60a0169-fd9c-4677-92e1-a09453bea104" containerID="4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0" exitCode=0 Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.378409 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"b60a0169-fd9c-4677-92e1-a09453bea104","Type":"ContainerDied","Data":"4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0"} Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.378435 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.378467 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"b60a0169-fd9c-4677-92e1-a09453bea104","Type":"ContainerDied","Data":"2ff8ddf6ef5d52fcb2fc99c1215c9e47621bfd4aad0de8da6953b85be94ee520"} Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.378489 4790 scope.go:117] "RemoveContainer" containerID="4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.380342 4790 generic.go:334] "Generic (PLEG): container finished" podID="1faae3f1-4b30-4273-baa5-8b2790ddff2a" containerID="4b2cc559c3da1317e4962888d6caf217b01c0904290090bf913c32caf168224f" exitCode=0 Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.380380 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jd2n" event={"ID":"1faae3f1-4b30-4273-baa5-8b2790ddff2a","Type":"ContainerDied","Data":"4b2cc559c3da1317e4962888d6caf217b01c0904290090bf913c32caf168224f"} Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.380404 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jd2n" 
event={"ID":"1faae3f1-4b30-4273-baa5-8b2790ddff2a","Type":"ContainerStarted","Data":"23d738b30744556c57fad05a95e4aee4bf1f2f17bb2dcded425c5d9217b901a6"} Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.384327 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"001ce2db-6829-4dcc-bf3a-b19134cd3484","Type":"ContainerStarted","Data":"bae140a57fb8af7a5b1fc485755080deb869efcab97445f1ff117c4727f97f7b"} Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.414359 4790 scope.go:117] "RemoveContainer" containerID="4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0" Apr 06 12:20:56 crc kubenswrapper[4790]: E0406 12:20:56.415711 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0\": container with ID starting with 4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0 not found: ID does not exist" containerID="4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.415766 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0"} err="failed to get container status \"4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0\": rpc error: code = NotFound desc = could not find container \"4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0\": container with ID starting with 4c835ea311bfaf2c6361390cfb9878065b110ed7e38ae493596cd7cfc84462b0 not found: ID does not exist" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.443772 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.459088 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/watcher-applier-0"] Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.466736 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Apr 06 12:20:56 crc kubenswrapper[4790]: E0406 12:20:56.467344 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60a0169-fd9c-4677-92e1-a09453bea104" containerName="watcher-applier" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.467432 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60a0169-fd9c-4677-92e1-a09453bea104" containerName="watcher-applier" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.467793 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b60a0169-fd9c-4677-92e1-a09453bea104" containerName="watcher-applier" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.468661 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.472207 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.494608 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.623365 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bed4cbed-be09-43cd-938a-e4a1fe5fe399-logs\") pod \"watcher-applier-0\" (UID: \"bed4cbed-be09-43cd-938a-e4a1fe5fe399\") " pod="openstack/watcher-applier-0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.623468 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bed4cbed-be09-43cd-938a-e4a1fe5fe399-config-data\") pod \"watcher-applier-0\" (UID: \"bed4cbed-be09-43cd-938a-e4a1fe5fe399\") " 
pod="openstack/watcher-applier-0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.623553 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mjhc\" (UniqueName: \"kubernetes.io/projected/bed4cbed-be09-43cd-938a-e4a1fe5fe399-kube-api-access-7mjhc\") pod \"watcher-applier-0\" (UID: \"bed4cbed-be09-43cd-938a-e4a1fe5fe399\") " pod="openstack/watcher-applier-0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.623604 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed4cbed-be09-43cd-938a-e4a1fe5fe399-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"bed4cbed-be09-43cd-938a-e4a1fe5fe399\") " pod="openstack/watcher-applier-0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.724905 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed4cbed-be09-43cd-938a-e4a1fe5fe399-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"bed4cbed-be09-43cd-938a-e4a1fe5fe399\") " pod="openstack/watcher-applier-0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.724977 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bed4cbed-be09-43cd-938a-e4a1fe5fe399-logs\") pod \"watcher-applier-0\" (UID: \"bed4cbed-be09-43cd-938a-e4a1fe5fe399\") " pod="openstack/watcher-applier-0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.725031 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bed4cbed-be09-43cd-938a-e4a1fe5fe399-config-data\") pod \"watcher-applier-0\" (UID: \"bed4cbed-be09-43cd-938a-e4a1fe5fe399\") " pod="openstack/watcher-applier-0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.725099 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7mjhc\" (UniqueName: \"kubernetes.io/projected/bed4cbed-be09-43cd-938a-e4a1fe5fe399-kube-api-access-7mjhc\") pod \"watcher-applier-0\" (UID: \"bed4cbed-be09-43cd-938a-e4a1fe5fe399\") " pod="openstack/watcher-applier-0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.725889 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bed4cbed-be09-43cd-938a-e4a1fe5fe399-logs\") pod \"watcher-applier-0\" (UID: \"bed4cbed-be09-43cd-938a-e4a1fe5fe399\") " pod="openstack/watcher-applier-0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.730726 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bed4cbed-be09-43cd-938a-e4a1fe5fe399-config-data\") pod \"watcher-applier-0\" (UID: \"bed4cbed-be09-43cd-938a-e4a1fe5fe399\") " pod="openstack/watcher-applier-0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.731764 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed4cbed-be09-43cd-938a-e4a1fe5fe399-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"bed4cbed-be09-43cd-938a-e4a1fe5fe399\") " pod="openstack/watcher-applier-0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.763006 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mjhc\" (UniqueName: \"kubernetes.io/projected/bed4cbed-be09-43cd-938a-e4a1fe5fe399-kube-api-access-7mjhc\") pod \"watcher-applier-0\" (UID: \"bed4cbed-be09-43cd-938a-e4a1fe5fe399\") " pod="openstack/watcher-applier-0" Apr 06 12:20:56 crc kubenswrapper[4790]: I0406 12:20:56.802976 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Apr 06 12:20:57 crc kubenswrapper[4790]: I0406 12:20:57.287057 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Apr 06 12:20:57 crc kubenswrapper[4790]: W0406 12:20:57.297038 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbed4cbed_be09_43cd_938a_e4a1fe5fe399.slice/crio-089859aaa864c4ee93de0fd3a233a13a2f426475a6f018421ff13f3906200910 WatchSource:0}: Error finding container 089859aaa864c4ee93de0fd3a233a13a2f426475a6f018421ff13f3906200910: Status 404 returned error can't find the container with id 089859aaa864c4ee93de0fd3a233a13a2f426475a6f018421ff13f3906200910 Apr 06 12:20:57 crc kubenswrapper[4790]: I0406 12:20:57.394688 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"001ce2db-6829-4dcc-bf3a-b19134cd3484","Type":"ContainerStarted","Data":"b1e0cffd4dc6dcb287fe11bf9054ef3794d35d171db1e88dbfd669ecce933c7b"} Apr 06 12:20:57 crc kubenswrapper[4790]: I0406 12:20:57.395873 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Apr 06 12:20:57 crc kubenswrapper[4790]: I0406 12:20:57.398308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"bed4cbed-be09-43cd-938a-e4a1fe5fe399","Type":"ContainerStarted","Data":"089859aaa864c4ee93de0fd3a233a13a2f426475a6f018421ff13f3906200910"} Apr 06 12:20:57 crc kubenswrapper[4790]: I0406 12:20:57.422807 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.4227883119999998 podStartE2EDuration="2.422788312s" podCreationTimestamp="2026-04-06 12:20:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:20:57.411128123 +0000 
UTC m=+1436.398871009" watchObservedRunningTime="2026-04-06 12:20:57.422788312 +0000 UTC m=+1436.410531178" Apr 06 12:20:57 crc kubenswrapper[4790]: I0406 12:20:57.713421 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b60a0169-fd9c-4677-92e1-a09453bea104" path="/var/lib/kubelet/pods/b60a0169-fd9c-4677-92e1-a09453bea104/volumes" Apr 06 12:20:58 crc kubenswrapper[4790]: I0406 12:20:58.410887 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"bed4cbed-be09-43cd-938a-e4a1fe5fe399","Type":"ContainerStarted","Data":"7be6b6609ed2359d7583252794cc05a2f85444ba566ef54b4c9a78f6c7b59ebb"} Apr 06 12:20:58 crc kubenswrapper[4790]: I0406 12:20:58.413106 4790 generic.go:334] "Generic (PLEG): container finished" podID="1faae3f1-4b30-4273-baa5-8b2790ddff2a" containerID="f1f5dd544ab2ec68c9bf33a1bd71ecbda6332d3cea929d29c31fdabea20bd234" exitCode=0 Apr 06 12:20:58 crc kubenswrapper[4790]: I0406 12:20:58.414327 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jd2n" event={"ID":"1faae3f1-4b30-4273-baa5-8b2790ddff2a","Type":"ContainerDied","Data":"f1f5dd544ab2ec68c9bf33a1bd71ecbda6332d3cea929d29c31fdabea20bd234"} Apr 06 12:20:58 crc kubenswrapper[4790]: I0406 12:20:58.435387 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.435368761 podStartE2EDuration="2.435368761s" podCreationTimestamp="2026-04-06 12:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:20:58.429439338 +0000 UTC m=+1437.417182214" watchObservedRunningTime="2026-04-06 12:20:58.435368761 +0000 UTC m=+1437.423111627" Apr 06 12:20:59 crc kubenswrapper[4790]: I0406 12:20:59.423500 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jd2n" 
event={"ID":"1faae3f1-4b30-4273-baa5-8b2790ddff2a","Type":"ContainerStarted","Data":"d8303ac77b4a87b696e26089f2350f760a0b2ecc16944952d90b05048be4a611"} Apr 06 12:20:59 crc kubenswrapper[4790]: I0406 12:20:59.464296 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5jd2n" podStartSLOduration=2.991435781 podStartE2EDuration="5.464275924s" podCreationTimestamp="2026-04-06 12:20:54 +0000 UTC" firstStartedPulling="2026-04-06 12:20:56.381972292 +0000 UTC m=+1435.369715158" lastFinishedPulling="2026-04-06 12:20:58.854812435 +0000 UTC m=+1437.842555301" observedRunningTime="2026-04-06 12:20:59.450385951 +0000 UTC m=+1438.438128857" watchObservedRunningTime="2026-04-06 12:20:59.464275924 +0000 UTC m=+1438.452018790" Apr 06 12:20:59 crc kubenswrapper[4790]: I0406 12:20:59.704760 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Apr 06 12:20:59 crc kubenswrapper[4790]: I0406 12:20:59.712369 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Apr 06 12:21:00 crc kubenswrapper[4790]: I0406 12:21:00.450348 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Apr 06 12:21:01 crc kubenswrapper[4790]: I0406 12:21:01.803988 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Apr 06 12:21:02 crc kubenswrapper[4790]: I0406 12:21:02.712612 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Apr 06 12:21:02 crc kubenswrapper[4790]: I0406 12:21:02.738296 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Apr 06 12:21:03 crc kubenswrapper[4790]: I0406 12:21:03.470316 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Apr 06 
12:21:03 crc kubenswrapper[4790]: I0406 12:21:03.503040 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Apr 06 12:21:05 crc kubenswrapper[4790]: I0406 12:21:05.016400 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:21:05 crc kubenswrapper[4790]: I0406 12:21:05.016771 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:21:05 crc kubenswrapper[4790]: I0406 12:21:05.062865 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:21:05 crc kubenswrapper[4790]: I0406 12:21:05.578172 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:21:05 crc kubenswrapper[4790]: I0406 12:21:05.643881 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jd2n"] Apr 06 12:21:05 crc kubenswrapper[4790]: I0406 12:21:05.816778 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.435690 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-s89q4"] Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.437490 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-s89q4" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.440228 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.440643 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.446232 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-s89q4"] Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.543982 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-config-data\") pod \"nova-cell0-cell-mapping-s89q4\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") " pod="openstack/nova-cell0-cell-mapping-s89q4" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.544045 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-scripts\") pod \"nova-cell0-cell-mapping-s89q4\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") " pod="openstack/nova-cell0-cell-mapping-s89q4" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.544107 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cj9n\" (UniqueName: \"kubernetes.io/projected/6ff20ebb-f795-4d0c-8836-ede867897d49-kube-api-access-8cj9n\") pod \"nova-cell0-cell-mapping-s89q4\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") " pod="openstack/nova-cell0-cell-mapping-s89q4" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.544160 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-s89q4\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") " pod="openstack/nova-cell0-cell-mapping-s89q4" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.555721 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.557261 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.561355 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.585433 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.633799 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.635270 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.642276 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.642692 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.645615 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.645720 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.645764 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cj9n\" (UniqueName: \"kubernetes.io/projected/6ff20ebb-f795-4d0c-8836-ede867897d49-kube-api-access-8cj9n\") pod \"nova-cell0-cell-mapping-s89q4\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") " pod="openstack/nova-cell0-cell-mapping-s89q4" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.645865 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xksj\" (UniqueName: \"kubernetes.io/projected/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-kube-api-access-4xksj\") pod \"nova-cell1-novncproxy-0\" (UID: \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\") " 
pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.645898 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-s89q4\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") " pod="openstack/nova-cell0-cell-mapping-s89q4" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.646044 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-config-data\") pod \"nova-cell0-cell-mapping-s89q4\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") " pod="openstack/nova-cell0-cell-mapping-s89q4" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.646106 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-scripts\") pod \"nova-cell0-cell-mapping-s89q4\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") " pod="openstack/nova-cell0-cell-mapping-s89q4" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.654660 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-scripts\") pod \"nova-cell0-cell-mapping-s89q4\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") " pod="openstack/nova-cell0-cell-mapping-s89q4" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.657603 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-s89q4\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") " pod="openstack/nova-cell0-cell-mapping-s89q4" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 
12:21:06.658559 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-config-data\") pod \"nova-cell0-cell-mapping-s89q4\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") " pod="openstack/nova-cell0-cell-mapping-s89q4" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.681334 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cj9n\" (UniqueName: \"kubernetes.io/projected/6ff20ebb-f795-4d0c-8836-ede867897d49-kube-api-access-8cj9n\") pod \"nova-cell0-cell-mapping-s89q4\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") " pod="openstack/nova-cell0-cell-mapping-s89q4" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.686061 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.747750 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcbv6\" (UniqueName: \"kubernetes.io/projected/76bc1936-4c86-461a-a992-12927ef262e4-kube-api-access-bcbv6\") pod \"nova-scheduler-0\" (UID: \"76bc1936-4c86-461a-a992-12927ef262e4\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.747819 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bc1936-4c86-461a-a992-12927ef262e4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"76bc1936-4c86-461a-a992-12927ef262e4\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.747889 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\") " 
pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.747926 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bc1936-4c86-461a-a992-12927ef262e4-config-data\") pod \"nova-scheduler-0\" (UID: \"76bc1936-4c86-461a-a992-12927ef262e4\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.747953 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.748021 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xksj\" (UniqueName: \"kubernetes.io/projected/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-kube-api-access-4xksj\") pod \"nova-cell1-novncproxy-0\" (UID: \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.761218 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.762813 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.763887 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.765751 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.783932 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-s89q4" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.796920 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.803354 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.809418 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.848336 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xksj\" (UniqueName: \"kubernetes.io/projected/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-kube-api-access-4xksj\") pod \"nova-cell1-novncproxy-0\" (UID: \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.849377 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bcbv6\" (UniqueName: \"kubernetes.io/projected/76bc1936-4c86-461a-a992-12927ef262e4-kube-api-access-bcbv6\") pod \"nova-scheduler-0\" (UID: \"76bc1936-4c86-461a-a992-12927ef262e4\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.849427 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bc1936-4c86-461a-a992-12927ef262e4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"76bc1936-4c86-461a-a992-12927ef262e4\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.849483 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bc1936-4c86-461a-a992-12927ef262e4-config-data\") pod \"nova-scheduler-0\" (UID: \"76bc1936-4c86-461a-a992-12927ef262e4\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.849536 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5737e9b4-7722-4a48-8f87-552403a32ebc-config-data\") pod \"nova-api-0\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " pod="openstack/nova-api-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.849564 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5737e9b4-7722-4a48-8f87-552403a32ebc-logs\") pod \"nova-api-0\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " pod="openstack/nova-api-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.849588 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5737e9b4-7722-4a48-8f87-552403a32ebc-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"5737e9b4-7722-4a48-8f87-552403a32ebc\") " pod="openstack/nova-api-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.849613 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnm85\" (UniqueName: \"kubernetes.io/projected/5737e9b4-7722-4a48-8f87-552403a32ebc-kube-api-access-cnm85\") pod \"nova-api-0\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " pod="openstack/nova-api-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.873585 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bc1936-4c86-461a-a992-12927ef262e4-config-data\") pod \"nova-scheduler-0\" (UID: \"76bc1936-4c86-461a-a992-12927ef262e4\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.879086 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bc1936-4c86-461a-a992-12927ef262e4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"76bc1936-4c86-461a-a992-12927ef262e4\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.892713 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.894344 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.899462 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.900510 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcbv6\" (UniqueName: \"kubernetes.io/projected/76bc1936-4c86-461a-a992-12927ef262e4-kube-api-access-bcbv6\") pod \"nova-scheduler-0\" (UID: \"76bc1936-4c86-461a-a992-12927ef262e4\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.906453 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.939143 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.945151 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.950956 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5737e9b4-7722-4a48-8f87-552403a32ebc-config-data\") pod \"nova-api-0\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " pod="openstack/nova-api-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.951001 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5737e9b4-7722-4a48-8f87-552403a32ebc-logs\") pod \"nova-api-0\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " pod="openstack/nova-api-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.951024 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5737e9b4-7722-4a48-8f87-552403a32ebc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " pod="openstack/nova-api-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.951057 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnm85\" (UniqueName: \"kubernetes.io/projected/5737e9b4-7722-4a48-8f87-552403a32ebc-kube-api-access-cnm85\") pod \"nova-api-0\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " pod="openstack/nova-api-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.953185 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5737e9b4-7722-4a48-8f87-552403a32ebc-logs\") pod \"nova-api-0\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " pod="openstack/nova-api-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.957415 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5737e9b4-7722-4a48-8f87-552403a32ebc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " pod="openstack/nova-api-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.963680 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5737e9b4-7722-4a48-8f87-552403a32ebc-config-data\") pod \"nova-api-0\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " pod="openstack/nova-api-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.969487 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.970612 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b5ff8cc-fpp68"] Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.976919 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.977530 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnm85\" (UniqueName: \"kubernetes.io/projected/5737e9b4-7722-4a48-8f87-552403a32ebc-kube-api-access-cnm85\") pod \"nova-api-0\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " pod="openstack/nova-api-0" Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.987606 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b5ff8cc-fpp68"] Apr 06 12:21:06 crc kubenswrapper[4790]: I0406 12:21:06.989784 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.053568 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-config\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.053992 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a560347d-227b-46b7-a352-8c10573fb4fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " pod="openstack/nova-metadata-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.054109 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.054198 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfr2s\" (UniqueName: \"kubernetes.io/projected/5ec42c47-2162-48fc-abfc-03537b0f2e1a-kube-api-access-vfr2s\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.054230 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: 
\"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.054262 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a560347d-227b-46b7-a352-8c10573fb4fc-logs\") pod \"nova-metadata-0\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " pod="openstack/nova-metadata-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.054313 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.054376 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-dns-svc\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.054418 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a560347d-227b-46b7-a352-8c10573fb4fc-config-data\") pod \"nova-metadata-0\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " pod="openstack/nova-metadata-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.054443 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q8h5\" (UniqueName: \"kubernetes.io/projected/a560347d-227b-46b7-a352-8c10573fb4fc-kube-api-access-6q8h5\") pod \"nova-metadata-0\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " 
pod="openstack/nova-metadata-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.155628 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfr2s\" (UniqueName: \"kubernetes.io/projected/5ec42c47-2162-48fc-abfc-03537b0f2e1a-kube-api-access-vfr2s\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.155938 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.155961 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a560347d-227b-46b7-a352-8c10573fb4fc-logs\") pod \"nova-metadata-0\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " pod="openstack/nova-metadata-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.155997 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.156023 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-dns-svc\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 
12:21:07.156050 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a560347d-227b-46b7-a352-8c10573fb4fc-config-data\") pod \"nova-metadata-0\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " pod="openstack/nova-metadata-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.156068 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q8h5\" (UniqueName: \"kubernetes.io/projected/a560347d-227b-46b7-a352-8c10573fb4fc-kube-api-access-6q8h5\") pod \"nova-metadata-0\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " pod="openstack/nova-metadata-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.156094 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-config\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.156120 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a560347d-227b-46b7-a352-8c10573fb4fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " pod="openstack/nova-metadata-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.156188 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.157981 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-config\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.158286 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.158639 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.159328 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a560347d-227b-46b7-a352-8c10573fb4fc-logs\") pod \"nova-metadata-0\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " pod="openstack/nova-metadata-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.159534 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.160156 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-dns-svc\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: 
\"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.162626 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a560347d-227b-46b7-a352-8c10573fb4fc-config-data\") pod \"nova-metadata-0\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " pod="openstack/nova-metadata-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.165537 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a560347d-227b-46b7-a352-8c10573fb4fc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " pod="openstack/nova-metadata-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.181376 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q8h5\" (UniqueName: \"kubernetes.io/projected/a560347d-227b-46b7-a352-8c10573fb4fc-kube-api-access-6q8h5\") pod \"nova-metadata-0\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " pod="openstack/nova-metadata-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.183597 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfr2s\" (UniqueName: \"kubernetes.io/projected/5ec42c47-2162-48fc-abfc-03537b0f2e1a-kube-api-access-vfr2s\") pod \"dnsmasq-dns-6b7b5ff8cc-fpp68\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.303379 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.328559 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.494814 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-s89q4"] Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.558868 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-s89q4" event={"ID":"6ff20ebb-f795-4d0c-8836-ede867897d49","Type":"ContainerStarted","Data":"8133276b0e5e826fb08cf8352390c205394048a35c2ebb5f7f7d4bc4c667f50b"} Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.559013 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5jd2n" podUID="1faae3f1-4b30-4273-baa5-8b2790ddff2a" containerName="registry-server" containerID="cri-o://d8303ac77b4a87b696e26089f2350f760a0b2ecc16944952d90b05048be4a611" gracePeriod=2 Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.581655 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.601275 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.626717 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.755016 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dtb78"] Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.756150 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.761746 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dtb78"] Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.764022 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.764056 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.818507 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Apr 06 12:21:07 crc kubenswrapper[4790]: W0406 12:21:07.828440 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76bc1936_4c86_461a_a992_12927ef262e4.slice/crio-a966fe9544322bc7ab47a481beacfbc9a5b46978187914b9e5498b0b88cde670 WatchSource:0}: Error finding container a966fe9544322bc7ab47a481beacfbc9a5b46978187914b9e5498b0b88cde670: Status 404 returned error can't find the container with id a966fe9544322bc7ab47a481beacfbc9a5b46978187914b9e5498b0b88cde670 Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.874803 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-config-data\") pod \"nova-cell1-conductor-db-sync-dtb78\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") " pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.874869 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-dtb78\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") " pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.874982 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-scripts\") pod \"nova-cell1-conductor-db-sync-dtb78\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") " pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.875004 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwzqz\" (UniqueName: \"kubernetes.io/projected/b1f21193-0d83-4045-9359-ec1228ed6d34-kube-api-access-lwzqz\") pod \"nova-cell1-conductor-db-sync-dtb78\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") " pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.976669 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-scripts\") pod \"nova-cell1-conductor-db-sync-dtb78\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") " pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.976721 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwzqz\" (UniqueName: \"kubernetes.io/projected/b1f21193-0d83-4045-9359-ec1228ed6d34-kube-api-access-lwzqz\") pod \"nova-cell1-conductor-db-sync-dtb78\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") " pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.976785 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-config-data\") pod \"nova-cell1-conductor-db-sync-dtb78\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") " pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.976840 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dtb78\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") " pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.985646 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dtb78\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") " pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.986075 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-config-data\") pod \"nova-cell1-conductor-db-sync-dtb78\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") " pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:07 crc kubenswrapper[4790]: I0406 12:21:07.993390 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-scripts\") pod \"nova-cell1-conductor-db-sync-dtb78\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") " pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.003884 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwzqz\" (UniqueName: 
\"kubernetes.io/projected/b1f21193-0d83-4045-9359-ec1228ed6d34-kube-api-access-lwzqz\") pod \"nova-cell1-conductor-db-sync-dtb78\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") " pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.113914 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.178981 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.252250 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b5ff8cc-fpp68"] Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.298368 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:21:08 crc kubenswrapper[4790]: W0406 12:21:08.305800 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ec42c47_2162_48fc_abfc_03537b0f2e1a.slice/crio-e81eddd4ec8db71f3e580d87cade25660e08eec7ef6796d245d5fde2f942cb51 WatchSource:0}: Error finding container e81eddd4ec8db71f3e580d87cade25660e08eec7ef6796d245d5fde2f942cb51: Status 404 returned error can't find the container with id e81eddd4ec8db71f3e580d87cade25660e08eec7ef6796d245d5fde2f942cb51 Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.399539 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1faae3f1-4b30-4273-baa5-8b2790ddff2a-catalog-content\") pod \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\" (UID: \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\") " Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.399601 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/1faae3f1-4b30-4273-baa5-8b2790ddff2a-utilities\") pod \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\" (UID: \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\") " Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.400623 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1faae3f1-4b30-4273-baa5-8b2790ddff2a-utilities" (OuterVolumeSpecName: "utilities") pod "1faae3f1-4b30-4273-baa5-8b2790ddff2a" (UID: "1faae3f1-4b30-4273-baa5-8b2790ddff2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.400705 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtdfk\" (UniqueName: \"kubernetes.io/projected/1faae3f1-4b30-4273-baa5-8b2790ddff2a-kube-api-access-mtdfk\") pod \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\" (UID: \"1faae3f1-4b30-4273-baa5-8b2790ddff2a\") " Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.401333 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1faae3f1-4b30-4273-baa5-8b2790ddff2a-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.407682 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1faae3f1-4b30-4273-baa5-8b2790ddff2a-kube-api-access-mtdfk" (OuterVolumeSpecName: "kube-api-access-mtdfk") pod "1faae3f1-4b30-4273-baa5-8b2790ddff2a" (UID: "1faae3f1-4b30-4273-baa5-8b2790ddff2a"). InnerVolumeSpecName "kube-api-access-mtdfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.429457 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1faae3f1-4b30-4273-baa5-8b2790ddff2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1faae3f1-4b30-4273-baa5-8b2790ddff2a" (UID: "1faae3f1-4b30-4273-baa5-8b2790ddff2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.503775 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1faae3f1-4b30-4273-baa5-8b2790ddff2a-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.503810 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtdfk\" (UniqueName: \"kubernetes.io/projected/1faae3f1-4b30-4273-baa5-8b2790ddff2a-kube-api-access-mtdfk\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.578572 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a560347d-227b-46b7-a352-8c10573fb4fc","Type":"ContainerStarted","Data":"ffad5549e66dcb7a6954a8be0288a899a7ba99f5007608b259bf2463ab986e48"} Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.606330 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" event={"ID":"5ec42c47-2162-48fc-abfc-03537b0f2e1a","Type":"ContainerStarted","Data":"e81eddd4ec8db71f3e580d87cade25660e08eec7ef6796d245d5fde2f942cb51"} Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.616137 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5737e9b4-7722-4a48-8f87-552403a32ebc","Type":"ContainerStarted","Data":"4cc83bcfec1df8c3759b580d54d984031bce15646c0901a0b1193d3b3f3698aa"} Apr 06 12:21:08 crc 
kubenswrapper[4790]: I0406 12:21:08.619053 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"76bc1936-4c86-461a-a992-12927ef262e4","Type":"ContainerStarted","Data":"a966fe9544322bc7ab47a481beacfbc9a5b46978187914b9e5498b0b88cde670"} Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.621700 4790 generic.go:334] "Generic (PLEG): container finished" podID="1faae3f1-4b30-4273-baa5-8b2790ddff2a" containerID="d8303ac77b4a87b696e26089f2350f760a0b2ecc16944952d90b05048be4a611" exitCode=0 Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.621780 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jd2n" event={"ID":"1faae3f1-4b30-4273-baa5-8b2790ddff2a","Type":"ContainerDied","Data":"d8303ac77b4a87b696e26089f2350f760a0b2ecc16944952d90b05048be4a611"} Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.621800 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5jd2n" event={"ID":"1faae3f1-4b30-4273-baa5-8b2790ddff2a","Type":"ContainerDied","Data":"23d738b30744556c57fad05a95e4aee4bf1f2f17bb2dcded425c5d9217b901a6"} Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.621817 4790 scope.go:117] "RemoveContainer" containerID="d8303ac77b4a87b696e26089f2350f760a0b2ecc16944952d90b05048be4a611" Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.621968 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5jd2n" Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.631874 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4","Type":"ContainerStarted","Data":"3376922f262a60e682e630f456a1c9d9d2aa31270ca979c2430f91eca683d9c3"} Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.634916 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-s89q4" event={"ID":"6ff20ebb-f795-4d0c-8836-ede867897d49","Type":"ContainerStarted","Data":"37d97592f7406c9d4fc98603e8b17536cd78e6d94bea0809d43be773c95b9706"} Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.674681 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-s89q4" podStartSLOduration=2.674658346 podStartE2EDuration="2.674658346s" podCreationTimestamp="2026-04-06 12:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:21:08.663324496 +0000 UTC m=+1447.651067362" watchObservedRunningTime="2026-04-06 12:21:08.674658346 +0000 UTC m=+1447.662401212" Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.680087 4790 scope.go:117] "RemoveContainer" containerID="f1f5dd544ab2ec68c9bf33a1bd71ecbda6332d3cea929d29c31fdabea20bd234" Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.809382 4790 scope.go:117] "RemoveContainer" containerID="4b2cc559c3da1317e4962888d6caf217b01c0904290090bf913c32caf168224f" Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.816973 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jd2n"] Apr 06 12:21:08 crc kubenswrapper[4790]: I0406 12:21:08.827281 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5jd2n"] Apr 06 12:21:08 crc 
kubenswrapper[4790]: I0406 12:21:08.836184 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dtb78"] Apr 06 12:21:08 crc kubenswrapper[4790]: W0406 12:21:08.853414 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1f21193_0d83_4045_9359_ec1228ed6d34.slice/crio-ab38eb8bc83574831dbf7e870fb70b7f8d15b68d28c30ab226367a11492121c4 WatchSource:0}: Error finding container ab38eb8bc83574831dbf7e870fb70b7f8d15b68d28c30ab226367a11492121c4: Status 404 returned error can't find the container with id ab38eb8bc83574831dbf7e870fb70b7f8d15b68d28c30ab226367a11492121c4 Apr 06 12:21:09 crc kubenswrapper[4790]: I0406 12:21:09.031007 4790 scope.go:117] "RemoveContainer" containerID="d8303ac77b4a87b696e26089f2350f760a0b2ecc16944952d90b05048be4a611" Apr 06 12:21:09 crc kubenswrapper[4790]: E0406 12:21:09.031898 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8303ac77b4a87b696e26089f2350f760a0b2ecc16944952d90b05048be4a611\": container with ID starting with d8303ac77b4a87b696e26089f2350f760a0b2ecc16944952d90b05048be4a611 not found: ID does not exist" containerID="d8303ac77b4a87b696e26089f2350f760a0b2ecc16944952d90b05048be4a611" Apr 06 12:21:09 crc kubenswrapper[4790]: I0406 12:21:09.031937 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8303ac77b4a87b696e26089f2350f760a0b2ecc16944952d90b05048be4a611"} err="failed to get container status \"d8303ac77b4a87b696e26089f2350f760a0b2ecc16944952d90b05048be4a611\": rpc error: code = NotFound desc = could not find container \"d8303ac77b4a87b696e26089f2350f760a0b2ecc16944952d90b05048be4a611\": container with ID starting with d8303ac77b4a87b696e26089f2350f760a0b2ecc16944952d90b05048be4a611 not found: ID does not exist" Apr 06 12:21:09 crc kubenswrapper[4790]: I0406 12:21:09.031979 4790 
scope.go:117] "RemoveContainer" containerID="f1f5dd544ab2ec68c9bf33a1bd71ecbda6332d3cea929d29c31fdabea20bd234" Apr 06 12:21:09 crc kubenswrapper[4790]: E0406 12:21:09.034037 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f5dd544ab2ec68c9bf33a1bd71ecbda6332d3cea929d29c31fdabea20bd234\": container with ID starting with f1f5dd544ab2ec68c9bf33a1bd71ecbda6332d3cea929d29c31fdabea20bd234 not found: ID does not exist" containerID="f1f5dd544ab2ec68c9bf33a1bd71ecbda6332d3cea929d29c31fdabea20bd234" Apr 06 12:21:09 crc kubenswrapper[4790]: I0406 12:21:09.034096 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f5dd544ab2ec68c9bf33a1bd71ecbda6332d3cea929d29c31fdabea20bd234"} err="failed to get container status \"f1f5dd544ab2ec68c9bf33a1bd71ecbda6332d3cea929d29c31fdabea20bd234\": rpc error: code = NotFound desc = could not find container \"f1f5dd544ab2ec68c9bf33a1bd71ecbda6332d3cea929d29c31fdabea20bd234\": container with ID starting with f1f5dd544ab2ec68c9bf33a1bd71ecbda6332d3cea929d29c31fdabea20bd234 not found: ID does not exist" Apr 06 12:21:09 crc kubenswrapper[4790]: I0406 12:21:09.034125 4790 scope.go:117] "RemoveContainer" containerID="4b2cc559c3da1317e4962888d6caf217b01c0904290090bf913c32caf168224f" Apr 06 12:21:09 crc kubenswrapper[4790]: E0406 12:21:09.034584 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b2cc559c3da1317e4962888d6caf217b01c0904290090bf913c32caf168224f\": container with ID starting with 4b2cc559c3da1317e4962888d6caf217b01c0904290090bf913c32caf168224f not found: ID does not exist" containerID="4b2cc559c3da1317e4962888d6caf217b01c0904290090bf913c32caf168224f" Apr 06 12:21:09 crc kubenswrapper[4790]: I0406 12:21:09.034628 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4b2cc559c3da1317e4962888d6caf217b01c0904290090bf913c32caf168224f"} err="failed to get container status \"4b2cc559c3da1317e4962888d6caf217b01c0904290090bf913c32caf168224f\": rpc error: code = NotFound desc = could not find container \"4b2cc559c3da1317e4962888d6caf217b01c0904290090bf913c32caf168224f\": container with ID starting with 4b2cc559c3da1317e4962888d6caf217b01c0904290090bf913c32caf168224f not found: ID does not exist" Apr 06 12:21:09 crc kubenswrapper[4790]: I0406 12:21:09.655243 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dtb78" event={"ID":"b1f21193-0d83-4045-9359-ec1228ed6d34","Type":"ContainerStarted","Data":"f73fd47ecb6bb249b4c9063e0e5e2661ae112feb32e11555a04df2d7062365a4"} Apr 06 12:21:09 crc kubenswrapper[4790]: I0406 12:21:09.655579 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dtb78" event={"ID":"b1f21193-0d83-4045-9359-ec1228ed6d34","Type":"ContainerStarted","Data":"ab38eb8bc83574831dbf7e870fb70b7f8d15b68d28c30ab226367a11492121c4"} Apr 06 12:21:09 crc kubenswrapper[4790]: I0406 12:21:09.657061 4790 generic.go:334] "Generic (PLEG): container finished" podID="5ec42c47-2162-48fc-abfc-03537b0f2e1a" containerID="5205ca765bf34eb54da7c711160ed115c542923289970ad2157eb7217a693170" exitCode=0 Apr 06 12:21:09 crc kubenswrapper[4790]: I0406 12:21:09.657113 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" event={"ID":"5ec42c47-2162-48fc-abfc-03537b0f2e1a","Type":"ContainerDied","Data":"5205ca765bf34eb54da7c711160ed115c542923289970ad2157eb7217a693170"} Apr 06 12:21:09 crc kubenswrapper[4790]: I0406 12:21:09.687569 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-dtb78" podStartSLOduration=2.687553074 podStartE2EDuration="2.687553074s" podCreationTimestamp="2026-04-06 12:21:07 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:21:09.678167301 +0000 UTC m=+1448.665910177" watchObservedRunningTime="2026-04-06 12:21:09.687553074 +0000 UTC m=+1448.675295940" Apr 06 12:21:09 crc kubenswrapper[4790]: I0406 12:21:09.690168 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1faae3f1-4b30-4273-baa5-8b2790ddff2a" path="/var/lib/kubelet/pods/1faae3f1-4b30-4273-baa5-8b2790ddff2a/volumes" Apr 06 12:21:09 crc kubenswrapper[4790]: I0406 12:21:09.753664 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:21:09 crc kubenswrapper[4790]: I0406 12:21:09.753977 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:21:10 crc kubenswrapper[4790]: I0406 12:21:10.655125 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 06 12:21:10 crc kubenswrapper[4790]: I0406 12:21:10.701747 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:21:11 crc kubenswrapper[4790]: I0406 12:21:11.748054 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" event={"ID":"5ec42c47-2162-48fc-abfc-03537b0f2e1a","Type":"ContainerStarted","Data":"147f43dc636c6b37d86785519d3fcf698b9667dc90f96241a2733289c2e4b9c2"} Apr 06 12:21:11 crc kubenswrapper[4790]: I0406 12:21:11.748654 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:11 crc kubenswrapper[4790]: I0406 12:21:11.813028 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" podStartSLOduration=5.8130108069999995 podStartE2EDuration="5.813010807s" podCreationTimestamp="2026-04-06 12:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:21:11.80692013 +0000 UTC m=+1450.794662996" watchObservedRunningTime="2026-04-06 12:21:11.813010807 +0000 UTC m=+1450.800753673" Apr 06 12:21:12 crc kubenswrapper[4790]: I0406 12:21:12.761938 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5737e9b4-7722-4a48-8f87-552403a32ebc","Type":"ContainerStarted","Data":"a17406adafe6b2ab98bbab9447ec1ed846598ee468f9fd99c25cc6fd404e0b0d"} Apr 06 12:21:12 crc kubenswrapper[4790]: I0406 12:21:12.762281 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5737e9b4-7722-4a48-8f87-552403a32ebc","Type":"ContainerStarted","Data":"9bdecdeedd5f07db70950bfcd9598d89e4ca1ef4a1bcb7b60cf485f45954db7b"} Apr 06 12:21:12 crc kubenswrapper[4790]: I0406 12:21:12.764378 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"76bc1936-4c86-461a-a992-12927ef262e4","Type":"ContainerStarted","Data":"4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f"} Apr 06 12:21:12 crc kubenswrapper[4790]: I0406 12:21:12.766251 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4","Type":"ContainerStarted","Data":"abfa804a0c5a95b4acf8fcd06d27adcf3710a0b36b603413d2f45fe3d09dea54"} Apr 06 12:21:12 crc kubenswrapper[4790]: I0406 12:21:12.766376 4790 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-cell1-novncproxy-0" podUID="fad48575-e1aa-45f2-b8ba-ed7303d1d9e4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://abfa804a0c5a95b4acf8fcd06d27adcf3710a0b36b603413d2f45fe3d09dea54" gracePeriod=30 Apr 06 12:21:12 crc kubenswrapper[4790]: I0406 12:21:12.769170 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a560347d-227b-46b7-a352-8c10573fb4fc","Type":"ContainerStarted","Data":"3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d"} Apr 06 12:21:12 crc kubenswrapper[4790]: I0406 12:21:12.769237 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a560347d-227b-46b7-a352-8c10573fb4fc","Type":"ContainerStarted","Data":"035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57"} Apr 06 12:21:12 crc kubenswrapper[4790]: I0406 12:21:12.769194 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a560347d-227b-46b7-a352-8c10573fb4fc" containerName="nova-metadata-log" containerID="cri-o://035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57" gracePeriod=30 Apr 06 12:21:12 crc kubenswrapper[4790]: I0406 12:21:12.769235 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a560347d-227b-46b7-a352-8c10573fb4fc" containerName="nova-metadata-metadata" containerID="cri-o://3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d" gracePeriod=30 Apr 06 12:21:12 crc kubenswrapper[4790]: I0406 12:21:12.801559 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.160754658 podStartE2EDuration="6.801539486s" podCreationTimestamp="2026-04-06 12:21:06 +0000 UTC" firstStartedPulling="2026-04-06 12:21:07.654336362 +0000 UTC m=+1446.642079228" lastFinishedPulling="2026-04-06 12:21:11.29512119 +0000 UTC m=+1450.282864056" 
observedRunningTime="2026-04-06 12:21:12.780750172 +0000 UTC m=+1451.768493048" watchObservedRunningTime="2026-04-06 12:21:12.801539486 +0000 UTC m=+1451.789282352" Apr 06 12:21:12 crc kubenswrapper[4790]: I0406 12:21:12.815755 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.760972508 podStartE2EDuration="6.815737979s" podCreationTimestamp="2026-04-06 12:21:06 +0000 UTC" firstStartedPulling="2026-04-06 12:21:08.241560284 +0000 UTC m=+1447.229303150" lastFinishedPulling="2026-04-06 12:21:11.296325755 +0000 UTC m=+1450.284068621" observedRunningTime="2026-04-06 12:21:12.798058975 +0000 UTC m=+1451.785801851" watchObservedRunningTime="2026-04-06 12:21:12.815737979 +0000 UTC m=+1451.803480845" Apr 06 12:21:12 crc kubenswrapper[4790]: I0406 12:21:12.833024 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.368956911 podStartE2EDuration="6.833002251s" podCreationTimestamp="2026-04-06 12:21:06 +0000 UTC" firstStartedPulling="2026-04-06 12:21:07.831183353 +0000 UTC m=+1446.818926219" lastFinishedPulling="2026-04-06 12:21:11.295228693 +0000 UTC m=+1450.282971559" observedRunningTime="2026-04-06 12:21:12.819317453 +0000 UTC m=+1451.807060329" watchObservedRunningTime="2026-04-06 12:21:12.833002251 +0000 UTC m=+1451.820745127" Apr 06 12:21:12 crc kubenswrapper[4790]: I0406 12:21:12.849533 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.244964266 podStartE2EDuration="6.849513041s" podCreationTimestamp="2026-04-06 12:21:06 +0000 UTC" firstStartedPulling="2026-04-06 12:21:07.692144851 +0000 UTC m=+1446.679887717" lastFinishedPulling="2026-04-06 12:21:11.296693626 +0000 UTC m=+1450.284436492" observedRunningTime="2026-04-06 12:21:12.838702277 +0000 UTC m=+1451.826445143" watchObservedRunningTime="2026-04-06 12:21:12.849513041 +0000 UTC 
m=+1451.837255907" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.398630 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.542596 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a560347d-227b-46b7-a352-8c10573fb4fc-config-data\") pod \"a560347d-227b-46b7-a352-8c10573fb4fc\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.542813 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a560347d-227b-46b7-a352-8c10573fb4fc-combined-ca-bundle\") pod \"a560347d-227b-46b7-a352-8c10573fb4fc\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.542901 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q8h5\" (UniqueName: \"kubernetes.io/projected/a560347d-227b-46b7-a352-8c10573fb4fc-kube-api-access-6q8h5\") pod \"a560347d-227b-46b7-a352-8c10573fb4fc\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.542991 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a560347d-227b-46b7-a352-8c10573fb4fc-logs\") pod \"a560347d-227b-46b7-a352-8c10573fb4fc\" (UID: \"a560347d-227b-46b7-a352-8c10573fb4fc\") " Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.544587 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a560347d-227b-46b7-a352-8c10573fb4fc-logs" (OuterVolumeSpecName: "logs") pod "a560347d-227b-46b7-a352-8c10573fb4fc" (UID: "a560347d-227b-46b7-a352-8c10573fb4fc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.568325 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a560347d-227b-46b7-a352-8c10573fb4fc-kube-api-access-6q8h5" (OuterVolumeSpecName: "kube-api-access-6q8h5") pod "a560347d-227b-46b7-a352-8c10573fb4fc" (UID: "a560347d-227b-46b7-a352-8c10573fb4fc"). InnerVolumeSpecName "kube-api-access-6q8h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.578105 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a560347d-227b-46b7-a352-8c10573fb4fc-config-data" (OuterVolumeSpecName: "config-data") pod "a560347d-227b-46b7-a352-8c10573fb4fc" (UID: "a560347d-227b-46b7-a352-8c10573fb4fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.601760 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a560347d-227b-46b7-a352-8c10573fb4fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a560347d-227b-46b7-a352-8c10573fb4fc" (UID: "a560347d-227b-46b7-a352-8c10573fb4fc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.636291 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.636534 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e2ad9a9e-f3a7-4749-9838-be88d29a7494" containerName="kube-state-metrics" containerID="cri-o://3d9de21481224150864c525cf0720a94ee01cfcf245f69c9e55e4f8ccc97e649" gracePeriod=30 Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.645479 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a560347d-227b-46b7-a352-8c10573fb4fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.645518 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q8h5\" (UniqueName: \"kubernetes.io/projected/a560347d-227b-46b7-a352-8c10573fb4fc-kube-api-access-6q8h5\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.645532 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a560347d-227b-46b7-a352-8c10573fb4fc-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.645543 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a560347d-227b-46b7-a352-8c10573fb4fc-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.789594 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2ad9a9e-f3a7-4749-9838-be88d29a7494" containerID="3d9de21481224150864c525cf0720a94ee01cfcf245f69c9e55e4f8ccc97e649" exitCode=2 Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.789654 4790 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/kube-state-metrics-0" event={"ID":"e2ad9a9e-f3a7-4749-9838-be88d29a7494","Type":"ContainerDied","Data":"3d9de21481224150864c525cf0720a94ee01cfcf245f69c9e55e4f8ccc97e649"} Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.797014 4790 generic.go:334] "Generic (PLEG): container finished" podID="a560347d-227b-46b7-a352-8c10573fb4fc" containerID="3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d" exitCode=0 Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.797041 4790 generic.go:334] "Generic (PLEG): container finished" podID="a560347d-227b-46b7-a352-8c10573fb4fc" containerID="035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57" exitCode=143 Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.797949 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.806251 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a560347d-227b-46b7-a352-8c10573fb4fc","Type":"ContainerDied","Data":"3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d"} Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.806298 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a560347d-227b-46b7-a352-8c10573fb4fc","Type":"ContainerDied","Data":"035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57"} Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.806309 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a560347d-227b-46b7-a352-8c10573fb4fc","Type":"ContainerDied","Data":"ffad5549e66dcb7a6954a8be0288a899a7ba99f5007608b259bf2463ab986e48"} Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.806325 4790 scope.go:117] "RemoveContainer" containerID="3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d" Apr 06 12:21:13 crc kubenswrapper[4790]: 
I0406 12:21:13.852145 4790 scope.go:117] "RemoveContainer" containerID="035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.856326 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.867375 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.879390 4790 scope.go:117] "RemoveContainer" containerID="3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d" Apr 06 12:21:13 crc kubenswrapper[4790]: E0406 12:21:13.881789 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d\": container with ID starting with 3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d not found: ID does not exist" containerID="3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.881838 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d"} err="failed to get container status \"3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d\": rpc error: code = NotFound desc = could not find container \"3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d\": container with ID starting with 3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d not found: ID does not exist" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.881866 4790 scope.go:117] "RemoveContainer" containerID="035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57" Apr 06 12:21:13 crc kubenswrapper[4790]: E0406 12:21:13.882262 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57\": container with ID starting with 035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57 not found: ID does not exist" containerID="035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.882299 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57"} err="failed to get container status \"035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57\": rpc error: code = NotFound desc = could not find container \"035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57\": container with ID starting with 035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57 not found: ID does not exist" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.882323 4790 scope.go:117] "RemoveContainer" containerID="3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.882616 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d"} err="failed to get container status \"3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d\": rpc error: code = NotFound desc = could not find container \"3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d\": container with ID starting with 3b5895484766ef043bc874582b0a890ac98b6fd223636dfcce2ef340da6c6d5d not found: ID does not exist" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.882643 4790 scope.go:117] "RemoveContainer" containerID="035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.882856 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57"} err="failed to get container status \"035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57\": rpc error: code = NotFound desc = could not find container \"035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57\": container with ID starting with 035ae4811e54fe9c103611a15bbb252490cca4ec7b44bca52fd0f9199c563c57 not found: ID does not exist" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.891166 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:21:13 crc kubenswrapper[4790]: E0406 12:21:13.891584 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1faae3f1-4b30-4273-baa5-8b2790ddff2a" containerName="registry-server" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.891600 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1faae3f1-4b30-4273-baa5-8b2790ddff2a" containerName="registry-server" Apr 06 12:21:13 crc kubenswrapper[4790]: E0406 12:21:13.891612 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1faae3f1-4b30-4273-baa5-8b2790ddff2a" containerName="extract-content" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.891618 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1faae3f1-4b30-4273-baa5-8b2790ddff2a" containerName="extract-content" Apr 06 12:21:13 crc kubenswrapper[4790]: E0406 12:21:13.891631 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a560347d-227b-46b7-a352-8c10573fb4fc" containerName="nova-metadata-metadata" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.891637 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a560347d-227b-46b7-a352-8c10573fb4fc" containerName="nova-metadata-metadata" Apr 06 12:21:13 crc kubenswrapper[4790]: E0406 12:21:13.891655 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a560347d-227b-46b7-a352-8c10573fb4fc" 
containerName="nova-metadata-log" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.891661 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a560347d-227b-46b7-a352-8c10573fb4fc" containerName="nova-metadata-log" Apr 06 12:21:13 crc kubenswrapper[4790]: E0406 12:21:13.891672 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1faae3f1-4b30-4273-baa5-8b2790ddff2a" containerName="extract-utilities" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.891678 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1faae3f1-4b30-4273-baa5-8b2790ddff2a" containerName="extract-utilities" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.891906 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a560347d-227b-46b7-a352-8c10573fb4fc" containerName="nova-metadata-metadata" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.891932 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1faae3f1-4b30-4273-baa5-8b2790ddff2a" containerName="registry-server" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.891941 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a560347d-227b-46b7-a352-8c10573fb4fc" containerName="nova-metadata-log" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.892923 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.899530 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.899906 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Apr 06 12:21:13 crc kubenswrapper[4790]: I0406 12:21:13.916040 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.057810 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.057870 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64346632-5650-4f85-aaf8-8253c488d521-logs\") pod \"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.057898 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hntzz\" (UniqueName: \"kubernetes.io/projected/64346632-5650-4f85-aaf8-8253c488d521-kube-api-access-hntzz\") pod \"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.057972 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.058041 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-config-data\") pod \"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.084622 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.159672 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.159978 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64346632-5650-4f85-aaf8-8253c488d521-logs\") pod \"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.160005 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hntzz\" (UniqueName: \"kubernetes.io/projected/64346632-5650-4f85-aaf8-8253c488d521-kube-api-access-hntzz\") pod \"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.160061 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.160107 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-config-data\") pod \"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.160629 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64346632-5650-4f85-aaf8-8253c488d521-logs\") pod \"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.166977 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.167574 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.169285 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-config-data\") pod \"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 
12:21:14.208395 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hntzz\" (UniqueName: \"kubernetes.io/projected/64346632-5650-4f85-aaf8-8253c488d521-kube-api-access-hntzz\") pod \"nova-metadata-0\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.237667 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.264710 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bqt5\" (UniqueName: \"kubernetes.io/projected/e2ad9a9e-f3a7-4749-9838-be88d29a7494-kube-api-access-5bqt5\") pod \"e2ad9a9e-f3a7-4749-9838-be88d29a7494\" (UID: \"e2ad9a9e-f3a7-4749-9838-be88d29a7494\") " Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.278051 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ad9a9e-f3a7-4749-9838-be88d29a7494-kube-api-access-5bqt5" (OuterVolumeSpecName: "kube-api-access-5bqt5") pod "e2ad9a9e-f3a7-4749-9838-be88d29a7494" (UID: "e2ad9a9e-f3a7-4749-9838-be88d29a7494"). InnerVolumeSpecName "kube-api-access-5bqt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.381191 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bqt5\" (UniqueName: \"kubernetes.io/projected/e2ad9a9e-f3a7-4749-9838-be88d29a7494-kube-api-access-5bqt5\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.778668 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.809134 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e2ad9a9e-f3a7-4749-9838-be88d29a7494","Type":"ContainerDied","Data":"809658d936309e1e08e768c1d749d29951bdece531257ecf895d70d745a4c0ef"} Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.809165 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.809182 4790 scope.go:117] "RemoveContainer" containerID="3d9de21481224150864c525cf0720a94ee01cfcf245f69c9e55e4f8ccc97e649" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.810925 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"64346632-5650-4f85-aaf8-8253c488d521","Type":"ContainerStarted","Data":"440012d2df792859716e192f9a0f2c7fc8d89170391cfb261e88de8896a259fb"} Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.859578 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.878616 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.896017 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Apr 06 12:21:14 crc kubenswrapper[4790]: E0406 12:21:14.896796 4790 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ad9a9e-f3a7-4749-9838-be88d29a7494" containerName="kube-state-metrics" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.896823 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ad9a9e-f3a7-4749-9838-be88d29a7494" containerName="kube-state-metrics" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.897173 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ad9a9e-f3a7-4749-9838-be88d29a7494" containerName="kube-state-metrics" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.898035 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.900944 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.903806 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.908698 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.997699 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d656612b-aaad-4d40-bc05-3aae06b509f3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d656612b-aaad-4d40-bc05-3aae06b509f3\") " pod="openstack/kube-state-metrics-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.997767 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dpxg\" (UniqueName: \"kubernetes.io/projected/d656612b-aaad-4d40-bc05-3aae06b509f3-kube-api-access-2dpxg\") pod \"kube-state-metrics-0\" (UID: \"d656612b-aaad-4d40-bc05-3aae06b509f3\") " 
pod="openstack/kube-state-metrics-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.997822 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d656612b-aaad-4d40-bc05-3aae06b509f3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d656612b-aaad-4d40-bc05-3aae06b509f3\") " pod="openstack/kube-state-metrics-0" Apr 06 12:21:14 crc kubenswrapper[4790]: I0406 12:21:14.997887 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d656612b-aaad-4d40-bc05-3aae06b509f3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d656612b-aaad-4d40-bc05-3aae06b509f3\") " pod="openstack/kube-state-metrics-0" Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.099506 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d656612b-aaad-4d40-bc05-3aae06b509f3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d656612b-aaad-4d40-bc05-3aae06b509f3\") " pod="openstack/kube-state-metrics-0" Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.099664 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d656612b-aaad-4d40-bc05-3aae06b509f3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d656612b-aaad-4d40-bc05-3aae06b509f3\") " pod="openstack/kube-state-metrics-0" Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.099730 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dpxg\" (UniqueName: \"kubernetes.io/projected/d656612b-aaad-4d40-bc05-3aae06b509f3-kube-api-access-2dpxg\") pod \"kube-state-metrics-0\" (UID: \"d656612b-aaad-4d40-bc05-3aae06b509f3\") " 
pod="openstack/kube-state-metrics-0" Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.099771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d656612b-aaad-4d40-bc05-3aae06b509f3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d656612b-aaad-4d40-bc05-3aae06b509f3\") " pod="openstack/kube-state-metrics-0" Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.106271 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d656612b-aaad-4d40-bc05-3aae06b509f3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d656612b-aaad-4d40-bc05-3aae06b509f3\") " pod="openstack/kube-state-metrics-0" Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.106323 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d656612b-aaad-4d40-bc05-3aae06b509f3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d656612b-aaad-4d40-bc05-3aae06b509f3\") " pod="openstack/kube-state-metrics-0" Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.115391 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d656612b-aaad-4d40-bc05-3aae06b509f3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d656612b-aaad-4d40-bc05-3aae06b509f3\") " pod="openstack/kube-state-metrics-0" Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.119604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dpxg\" (UniqueName: \"kubernetes.io/projected/d656612b-aaad-4d40-bc05-3aae06b509f3-kube-api-access-2dpxg\") pod \"kube-state-metrics-0\" (UID: \"d656612b-aaad-4d40-bc05-3aae06b509f3\") " pod="openstack/kube-state-metrics-0" Apr 06 12:21:15 crc 
kubenswrapper[4790]: I0406 12:21:15.312231 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.688295 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a560347d-227b-46b7-a352-8c10573fb4fc" path="/var/lib/kubelet/pods/a560347d-227b-46b7-a352-8c10573fb4fc/volumes"
Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.689176 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2ad9a9e-f3a7-4749-9838-be88d29a7494" path="/var/lib/kubelet/pods/e2ad9a9e-f3a7-4749-9838-be88d29a7494/volumes"
Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.830965 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"64346632-5650-4f85-aaf8-8253c488d521","Type":"ContainerStarted","Data":"9efbae1034966c4a857f5f4f8ed6c4e69e76fd91badd0dedaa8e60d4f1915b5e"}
Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.831020 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"64346632-5650-4f85-aaf8-8253c488d521","Type":"ContainerStarted","Data":"fe0b0ca31b874d5a16ba65269452bc82afb541e36cfc3f9f898d1a8ce6161413"}
Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.859528 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.872878 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Apr 06 12:21:15 crc kubenswrapper[4790]: I0406 12:21:15.877463 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.877441782 podStartE2EDuration="2.877441782s" podCreationTimestamp="2026-04-06 12:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:21:15.850180869 +0000 UTC m=+1454.837923735" watchObservedRunningTime="2026-04-06 12:21:15.877441782 +0000 UTC m=+1454.865184648"
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.098233 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.098880 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="ceilometer-central-agent" containerID="cri-o://d76422ecb6805bd87d908e40e40cfe7b2744f8572a4d769b14ef99e2e09179c7" gracePeriod=30
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.099122 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="proxy-httpd" containerID="cri-o://af26e769ba3a86afe83a7de61d1ef7c66c0dd157c41639008c693bec62c0085b" gracePeriod=30
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.099166 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="ceilometer-notification-agent" containerID="cri-o://571317242b82d964f97bc0861956ca6e7b71a5f8a05f70dfdc5becb2f16cf83e" gracePeriod=30
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.099261 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="sg-core" containerID="cri-o://d7ebac945324e1d3970bd05a7428bf42aa33864abe32dc70e1fd5a5d20fb496c" gracePeriod=30
Apr 06 12:21:16 crc kubenswrapper[4790]: E0406 12:21:16.540800 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaba66e9_85bd_4d49_bd1c_0facd7ae5d41.slice/crio-d76422ecb6805bd87d908e40e40cfe7b2744f8572a4d769b14ef99e2e09179c7.scope\": RecentStats: unable to find data in memory cache]"
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.848651 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d656612b-aaad-4d40-bc05-3aae06b509f3","Type":"ContainerStarted","Data":"9f5bdede911bc91b587dc1aae4deeec0057cecb0f90ea7d35b5ae5d6007b1f83"}
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.848821 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d656612b-aaad-4d40-bc05-3aae06b509f3","Type":"ContainerStarted","Data":"680db8de4ddd932dafe336ff17429a5ec5c0fe29cda2f132cee69df779f565e3"}
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.848859 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.852746 4790 generic.go:334] "Generic (PLEG): container finished" podID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerID="af26e769ba3a86afe83a7de61d1ef7c66c0dd157c41639008c693bec62c0085b" exitCode=0
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.852781 4790 generic.go:334] "Generic (PLEG): container finished" podID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerID="d7ebac945324e1d3970bd05a7428bf42aa33864abe32dc70e1fd5a5d20fb496c" exitCode=2
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.852791 4790 generic.go:334] "Generic (PLEG): container finished" podID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerID="d76422ecb6805bd87d908e40e40cfe7b2744f8572a4d769b14ef99e2e09179c7" exitCode=0
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.852793 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41","Type":"ContainerDied","Data":"af26e769ba3a86afe83a7de61d1ef7c66c0dd157c41639008c693bec62c0085b"}
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.852876 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41","Type":"ContainerDied","Data":"d7ebac945324e1d3970bd05a7428bf42aa33864abe32dc70e1fd5a5d20fb496c"}
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.852891 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41","Type":"ContainerDied","Data":"d76422ecb6805bd87d908e40e40cfe7b2744f8572a4d769b14ef99e2e09179c7"}
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.906897 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.971161 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.971458 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.992490 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Apr 06 12:21:16 crc kubenswrapper[4790]: I0406 12:21:16.992527 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Apr 06 12:21:17 crc kubenswrapper[4790]: I0406 12:21:17.008577 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Apr 06 12:21:17 crc kubenswrapper[4790]: I0406 12:21:17.026400 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.657445109 podStartE2EDuration="3.026380855s" podCreationTimestamp="2026-04-06 12:21:14 +0000 UTC" firstStartedPulling="2026-04-06 12:21:15.859353636 +0000 UTC m=+1454.847096502" lastFinishedPulling="2026-04-06 12:21:16.228289382 +0000 UTC m=+1455.216032248" observedRunningTime="2026-04-06 12:21:16.865465077 +0000 UTC m=+1455.853207943" watchObservedRunningTime="2026-04-06 12:21:17.026380855 +0000 UTC m=+1456.014123731"
Apr 06 12:21:17 crc kubenswrapper[4790]: I0406 12:21:17.331033 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68"
Apr 06 12:21:17 crc kubenswrapper[4790]: I0406 12:21:17.394371 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8f5f7879-l4wlk"]
Apr 06 12:21:17 crc kubenswrapper[4790]: I0406 12:21:17.394641 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" podUID="26a2b37d-3798-4197-b112-822d6b7e3f5b" containerName="dnsmasq-dns" containerID="cri-o://b14a9d2e3ec91ebd1630ec0c769361b2e4018a864b3249c674142b8785d39eeb" gracePeriod=10
Apr 06 12:21:17 crc kubenswrapper[4790]: I0406 12:21:17.543868 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" podUID="26a2b37d-3798-4197-b112-822d6b7e3f5b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.200:5353: connect: connection refused"
Apr 06 12:21:17 crc kubenswrapper[4790]: I0406 12:21:17.872958 4790 generic.go:334] "Generic (PLEG): container finished" podID="26a2b37d-3798-4197-b112-822d6b7e3f5b" containerID="b14a9d2e3ec91ebd1630ec0c769361b2e4018a864b3249c674142b8785d39eeb" exitCode=0
Apr 06 12:21:17 crc kubenswrapper[4790]: I0406 12:21:17.873167 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" event={"ID":"26a2b37d-3798-4197-b112-822d6b7e3f5b","Type":"ContainerDied","Data":"b14a9d2e3ec91ebd1630ec0c769361b2e4018a864b3249c674142b8785d39eeb"}
Apr 06 12:21:17 crc kubenswrapper[4790]: I0406 12:21:17.873196 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk" event={"ID":"26a2b37d-3798-4197-b112-822d6b7e3f5b","Type":"ContainerDied","Data":"ececf77662fc6c033168d6bbc78e65b17f662b84c780e12d10ede5b325dcbc61"}
Apr 06 12:21:17 crc kubenswrapper[4790]: I0406 12:21:17.873207 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ececf77662fc6c033168d6bbc78e65b17f662b84c780e12d10ede5b325dcbc61"
Apr 06 12:21:17 crc kubenswrapper[4790]: I0406 12:21:17.874919 4790 generic.go:334] "Generic (PLEG): container finished" podID="6ff20ebb-f795-4d0c-8836-ede867897d49" containerID="37d97592f7406c9d4fc98603e8b17536cd78e6d94bea0809d43be773c95b9706" exitCode=0
Apr 06 12:21:17 crc kubenswrapper[4790]: I0406 12:21:17.878053 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-s89q4" event={"ID":"6ff20ebb-f795-4d0c-8836-ede867897d49","Type":"ContainerDied","Data":"37d97592f7406c9d4fc98603e8b17536cd78e6d94bea0809d43be773c95b9706"}
Apr 06 12:21:17 crc kubenswrapper[4790]: I0406 12:21:17.922617 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Apr 06 12:21:17 crc kubenswrapper[4790]: I0406 12:21:17.936953 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk"
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.059654 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-dns-svc\") pod \"26a2b37d-3798-4197-b112-822d6b7e3f5b\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") "
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.059698 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-ovsdbserver-nb\") pod \"26a2b37d-3798-4197-b112-822d6b7e3f5b\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") "
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.059721 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-dns-swift-storage-0\") pod \"26a2b37d-3798-4197-b112-822d6b7e3f5b\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") "
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.059739 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-ovsdbserver-sb\") pod \"26a2b37d-3798-4197-b112-822d6b7e3f5b\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") "
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.059812 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsmg4\" (UniqueName: \"kubernetes.io/projected/26a2b37d-3798-4197-b112-822d6b7e3f5b-kube-api-access-bsmg4\") pod \"26a2b37d-3798-4197-b112-822d6b7e3f5b\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") "
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.059991 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-config\") pod \"26a2b37d-3798-4197-b112-822d6b7e3f5b\" (UID: \"26a2b37d-3798-4197-b112-822d6b7e3f5b\") "
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.076056 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5737e9b4-7722-4a48-8f87-552403a32ebc" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.229:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.076028 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5737e9b4-7722-4a48-8f87-552403a32ebc" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.229:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.084933 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26a2b37d-3798-4197-b112-822d6b7e3f5b-kube-api-access-bsmg4" (OuterVolumeSpecName: "kube-api-access-bsmg4") pod "26a2b37d-3798-4197-b112-822d6b7e3f5b" (UID: "26a2b37d-3798-4197-b112-822d6b7e3f5b"). InnerVolumeSpecName "kube-api-access-bsmg4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.112307 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "26a2b37d-3798-4197-b112-822d6b7e3f5b" (UID: "26a2b37d-3798-4197-b112-822d6b7e3f5b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.135736 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "26a2b37d-3798-4197-b112-822d6b7e3f5b" (UID: "26a2b37d-3798-4197-b112-822d6b7e3f5b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.137552 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "26a2b37d-3798-4197-b112-822d6b7e3f5b" (UID: "26a2b37d-3798-4197-b112-822d6b7e3f5b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.147320 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "26a2b37d-3798-4197-b112-822d6b7e3f5b" (UID: "26a2b37d-3798-4197-b112-822d6b7e3f5b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.162382 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsmg4\" (UniqueName: \"kubernetes.io/projected/26a2b37d-3798-4197-b112-822d6b7e3f5b-kube-api-access-bsmg4\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.162411 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-dns-svc\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.162420 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.162429 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.162437 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.165714 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-config" (OuterVolumeSpecName: "config") pod "26a2b37d-3798-4197-b112-822d6b7e3f5b" (UID: "26a2b37d-3798-4197-b112-822d6b7e3f5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.264159 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26a2b37d-3798-4197-b112-822d6b7e3f5b-config\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.886381 4790 generic.go:334] "Generic (PLEG): container finished" podID="b1f21193-0d83-4045-9359-ec1228ed6d34" containerID="f73fd47ecb6bb249b4c9063e0e5e2661ae112feb32e11555a04df2d7062365a4" exitCode=0
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.886498 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8f5f7879-l4wlk"
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.893039 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dtb78" event={"ID":"b1f21193-0d83-4045-9359-ec1228ed6d34","Type":"ContainerDied","Data":"f73fd47ecb6bb249b4c9063e0e5e2661ae112feb32e11555a04df2d7062365a4"}
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.944914 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8f5f7879-l4wlk"]
Apr 06 12:21:18 crc kubenswrapper[4790]: I0406 12:21:18.953711 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8f5f7879-l4wlk"]
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.332979 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-s89q4"
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.485156 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-combined-ca-bundle\") pod \"6ff20ebb-f795-4d0c-8836-ede867897d49\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") "
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.485201 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-scripts\") pod \"6ff20ebb-f795-4d0c-8836-ede867897d49\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") "
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.485305 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-config-data\") pod \"6ff20ebb-f795-4d0c-8836-ede867897d49\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") "
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.485366 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cj9n\" (UniqueName: \"kubernetes.io/projected/6ff20ebb-f795-4d0c-8836-ede867897d49-kube-api-access-8cj9n\") pod \"6ff20ebb-f795-4d0c-8836-ede867897d49\" (UID: \"6ff20ebb-f795-4d0c-8836-ede867897d49\") "
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.490735 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-scripts" (OuterVolumeSpecName: "scripts") pod "6ff20ebb-f795-4d0c-8836-ede867897d49" (UID: "6ff20ebb-f795-4d0c-8836-ede867897d49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.492820 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ff20ebb-f795-4d0c-8836-ede867897d49-kube-api-access-8cj9n" (OuterVolumeSpecName: "kube-api-access-8cj9n") pod "6ff20ebb-f795-4d0c-8836-ede867897d49" (UID: "6ff20ebb-f795-4d0c-8836-ede867897d49"). InnerVolumeSpecName "kube-api-access-8cj9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.515681 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ff20ebb-f795-4d0c-8836-ede867897d49" (UID: "6ff20ebb-f795-4d0c-8836-ede867897d49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.527515 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-config-data" (OuterVolumeSpecName: "config-data") pod "6ff20ebb-f795-4d0c-8836-ede867897d49" (UID: "6ff20ebb-f795-4d0c-8836-ede867897d49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.588055 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-config-data\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.588096 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cj9n\" (UniqueName: \"kubernetes.io/projected/6ff20ebb-f795-4d0c-8836-ede867897d49-kube-api-access-8cj9n\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.588112 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.588124 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ff20ebb-f795-4d0c-8836-ede867897d49-scripts\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.688963 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26a2b37d-3798-4197-b112-822d6b7e3f5b" path="/var/lib/kubelet/pods/26a2b37d-3798-4197-b112-822d6b7e3f5b/volumes"
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.897594 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-s89q4" event={"ID":"6ff20ebb-f795-4d0c-8836-ede867897d49","Type":"ContainerDied","Data":"8133276b0e5e826fb08cf8352390c205394048a35c2ebb5f7f7d4bc4c667f50b"}
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.897634 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8133276b0e5e826fb08cf8352390c205394048a35c2ebb5f7f7d4bc4c667f50b"
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.897641 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-s89q4"
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.900735 4790 generic.go:334] "Generic (PLEG): container finished" podID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerID="571317242b82d964f97bc0861956ca6e7b71a5f8a05f70dfdc5becb2f16cf83e" exitCode=0
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.900816 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41","Type":"ContainerDied","Data":"571317242b82d964f97bc0861956ca6e7b71a5f8a05f70dfdc5becb2f16cf83e"}
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.900875 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41","Type":"ContainerDied","Data":"dc9bc6f32547a260f6f8540bb338d37ea1a78ba805c4a692d53073d495df4042"}
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.900890 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc9bc6f32547a260f6f8540bb338d37ea1a78ba805c4a692d53073d495df4042"
Apr 06 12:21:19 crc kubenswrapper[4790]: I0406 12:21:19.947872 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.084398 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.084691 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5737e9b4-7722-4a48-8f87-552403a32ebc" containerName="nova-api-log" containerID="cri-o://9bdecdeedd5f07db70950bfcd9598d89e4ca1ef4a1bcb7b60cf485f45954db7b" gracePeriod=30
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.084948 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5737e9b4-7722-4a48-8f87-552403a32ebc" containerName="nova-api-api" containerID="cri-o://a17406adafe6b2ab98bbab9447ec1ed846598ee468f9fd99c25cc6fd404e0b0d" gracePeriod=30
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.101625 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.102716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-sg-core-conf-yaml\") pod \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") "
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.102803 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-log-httpd\") pod \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") "
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.102866 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-run-httpd\") pod \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") "
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.102929 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6skn6\" (UniqueName: \"kubernetes.io/projected/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-kube-api-access-6skn6\") pod \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") "
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.102998 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-scripts\") pod \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") "
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.103048 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-combined-ca-bundle\") pod \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") "
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.103085 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-config-data\") pod \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\" (UID: \"eaba66e9-85bd-4d49-bd1c-0facd7ae5d41\") "
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.105054 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" (UID: "eaba66e9-85bd-4d49-bd1c-0facd7ae5d41"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.105271 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" (UID: "eaba66e9-85bd-4d49-bd1c-0facd7ae5d41"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.121224 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-kube-api-access-6skn6" (OuterVolumeSpecName: "kube-api-access-6skn6") pod "eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" (UID: "eaba66e9-85bd-4d49-bd1c-0facd7ae5d41"). InnerVolumeSpecName "kube-api-access-6skn6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.136513 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-scripts" (OuterVolumeSpecName: "scripts") pod "eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" (UID: "eaba66e9-85bd-4d49-bd1c-0facd7ae5d41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.178318 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" (UID: "eaba66e9-85bd-4d49-bd1c-0facd7ae5d41"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.186415 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.186667 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="64346632-5650-4f85-aaf8-8253c488d521" containerName="nova-metadata-log" containerID="cri-o://fe0b0ca31b874d5a16ba65269452bc82afb541e36cfc3f9f898d1a8ce6161413" gracePeriod=30
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.186946 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="64346632-5650-4f85-aaf8-8253c488d521" containerName="nova-metadata-metadata" containerID="cri-o://9efbae1034966c4a857f5f4f8ed6c4e69e76fd91badd0dedaa8e60d4f1915b5e" gracePeriod=30
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.206805 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.206840 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-log-httpd\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.206848 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-run-httpd\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.206858 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6skn6\" (UniqueName: \"kubernetes.io/projected/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-kube-api-access-6skn6\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.206868 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-scripts\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.239572 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" (UID: "eaba66e9-85bd-4d49-bd1c-0facd7ae5d41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.263242 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-config-data" (OuterVolumeSpecName: "config-data") pod "eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" (UID: "eaba66e9-85bd-4d49-bd1c-0facd7ae5d41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.309506 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.309537 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41-config-data\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.314484 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dtb78"
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.410658 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-combined-ca-bundle\") pod \"b1f21193-0d83-4045-9359-ec1228ed6d34\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") "
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.410718 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-scripts\") pod \"b1f21193-0d83-4045-9359-ec1228ed6d34\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") "
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.410973 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwzqz\" (UniqueName: \"kubernetes.io/projected/b1f21193-0d83-4045-9359-ec1228ed6d34-kube-api-access-lwzqz\") pod \"b1f21193-0d83-4045-9359-ec1228ed6d34\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") "
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.411019 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-config-data\") pod \"b1f21193-0d83-4045-9359-ec1228ed6d34\" (UID: \"b1f21193-0d83-4045-9359-ec1228ed6d34\") "
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.416169 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-scripts" (OuterVolumeSpecName: "scripts") pod "b1f21193-0d83-4045-9359-ec1228ed6d34" (UID: "b1f21193-0d83-4045-9359-ec1228ed6d34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.421513 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f21193-0d83-4045-9359-ec1228ed6d34-kube-api-access-lwzqz" (OuterVolumeSpecName: "kube-api-access-lwzqz") pod "b1f21193-0d83-4045-9359-ec1228ed6d34" (UID: "b1f21193-0d83-4045-9359-ec1228ed6d34"). InnerVolumeSpecName "kube-api-access-lwzqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.455933 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1f21193-0d83-4045-9359-ec1228ed6d34" (UID: "b1f21193-0d83-4045-9359-ec1228ed6d34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.455982 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-config-data" (OuterVolumeSpecName: "config-data") pod "b1f21193-0d83-4045-9359-ec1228ed6d34" (UID: "b1f21193-0d83-4045-9359-ec1228ed6d34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.513441 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwzqz\" (UniqueName: \"kubernetes.io/projected/b1f21193-0d83-4045-9359-ec1228ed6d34-kube-api-access-lwzqz\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.513739 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-config-data\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.513750 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.513758 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f21193-0d83-4045-9359-ec1228ed6d34-scripts\") on node \"crc\" DevicePath \"\""
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.919780 4790 generic.go:334] "Generic (PLEG): container finished" podID="64346632-5650-4f85-aaf8-8253c488d521" containerID="9efbae1034966c4a857f5f4f8ed6c4e69e76fd91badd0dedaa8e60d4f1915b5e" exitCode=0
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.919810 4790 generic.go:334] "Generic (PLEG): container finished" podID="64346632-5650-4f85-aaf8-8253c488d521" containerID="fe0b0ca31b874d5a16ba65269452bc82afb541e36cfc3f9f898d1a8ce6161413" exitCode=143
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.919867 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"64346632-5650-4f85-aaf8-8253c488d521","Type":"ContainerDied","Data":"9efbae1034966c4a857f5f4f8ed6c4e69e76fd91badd0dedaa8e60d4f1915b5e"}
Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.919941 4790
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"64346632-5650-4f85-aaf8-8253c488d521","Type":"ContainerDied","Data":"fe0b0ca31b874d5a16ba65269452bc82afb541e36cfc3f9f898d1a8ce6161413"} Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.919965 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"64346632-5650-4f85-aaf8-8253c488d521","Type":"ContainerDied","Data":"440012d2df792859716e192f9a0f2c7fc8d89170391cfb261e88de8896a259fb"} Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.919983 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="440012d2df792859716e192f9a0f2c7fc8d89170391cfb261e88de8896a259fb" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.921675 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dtb78" event={"ID":"b1f21193-0d83-4045-9359-ec1228ed6d34","Type":"ContainerDied","Data":"ab38eb8bc83574831dbf7e870fb70b7f8d15b68d28c30ab226367a11492121c4"} Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.921691 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dtb78" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.921707 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab38eb8bc83574831dbf7e870fb70b7f8d15b68d28c30ab226367a11492121c4" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.923206 4790 generic.go:334] "Generic (PLEG): container finished" podID="5737e9b4-7722-4a48-8f87-552403a32ebc" containerID="9bdecdeedd5f07db70950bfcd9598d89e4ca1ef4a1bcb7b60cf485f45954db7b" exitCode=143 Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.923277 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5737e9b4-7722-4a48-8f87-552403a32ebc","Type":"ContainerDied","Data":"9bdecdeedd5f07db70950bfcd9598d89e4ca1ef4a1bcb7b60cf485f45954db7b"} Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.923359 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="76bc1936-4c86-461a-a992-12927ef262e4" containerName="nova-scheduler-scheduler" containerID="cri-o://4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f" gracePeriod=30 Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.923423 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.998188 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Apr 06 12:21:20 crc kubenswrapper[4790]: E0406 12:21:20.998672 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a2b37d-3798-4197-b112-822d6b7e3f5b" containerName="init" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.998691 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a2b37d-3798-4197-b112-822d6b7e3f5b" containerName="init" Apr 06 12:21:20 crc kubenswrapper[4790]: E0406 12:21:20.998706 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="ceilometer-central-agent" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.998713 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="ceilometer-central-agent" Apr 06 12:21:20 crc kubenswrapper[4790]: E0406 12:21:20.998723 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="proxy-httpd" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.998729 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="proxy-httpd" Apr 06 12:21:20 crc kubenswrapper[4790]: E0406 12:21:20.998747 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="ceilometer-notification-agent" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.998753 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="ceilometer-notification-agent" Apr 06 12:21:20 crc kubenswrapper[4790]: E0406 12:21:20.998773 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff20ebb-f795-4d0c-8836-ede867897d49" 
containerName="nova-manage" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.998778 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff20ebb-f795-4d0c-8836-ede867897d49" containerName="nova-manage" Apr 06 12:21:20 crc kubenswrapper[4790]: E0406 12:21:20.998789 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="sg-core" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.998795 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="sg-core" Apr 06 12:21:20 crc kubenswrapper[4790]: E0406 12:21:20.998811 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a2b37d-3798-4197-b112-822d6b7e3f5b" containerName="dnsmasq-dns" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.998817 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a2b37d-3798-4197-b112-822d6b7e3f5b" containerName="dnsmasq-dns" Apr 06 12:21:20 crc kubenswrapper[4790]: E0406 12:21:20.998847 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f21193-0d83-4045-9359-ec1228ed6d34" containerName="nova-cell1-conductor-db-sync" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.998856 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f21193-0d83-4045-9359-ec1228ed6d34" containerName="nova-cell1-conductor-db-sync" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.999039 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff20ebb-f795-4d0c-8836-ede867897d49" containerName="nova-manage" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.999059 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="proxy-httpd" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.999072 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f21193-0d83-4045-9359-ec1228ed6d34" 
containerName="nova-cell1-conductor-db-sync" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.999088 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="ceilometer-central-agent" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.999096 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="ceilometer-notification-agent" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.999106 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="26a2b37d-3798-4197-b112-822d6b7e3f5b" containerName="dnsmasq-dns" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.999123 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" containerName="sg-core" Apr 06 12:21:20 crc kubenswrapper[4790]: I0406 12:21:20.999769 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.002246 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.025907 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.041423 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.118960 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.130605 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.141513 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:21:21 crc kubenswrapper[4790]: E0406 12:21:21.141990 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64346632-5650-4f85-aaf8-8253c488d521" containerName="nova-metadata-metadata" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.142008 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="64346632-5650-4f85-aaf8-8253c488d521" containerName="nova-metadata-metadata" Apr 06 12:21:21 crc kubenswrapper[4790]: E0406 12:21:21.142021 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64346632-5650-4f85-aaf8-8253c488d521" containerName="nova-metadata-log" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.142028 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="64346632-5650-4f85-aaf8-8253c488d521" containerName="nova-metadata-log" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.142218 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="64346632-5650-4f85-aaf8-8253c488d521" containerName="nova-metadata-metadata" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.142245 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="64346632-5650-4f85-aaf8-8253c488d521" containerName="nova-metadata-log" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.143985 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64346632-5650-4f85-aaf8-8253c488d521-logs\") pod \"64346632-5650-4f85-aaf8-8253c488d521\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.144075 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-combined-ca-bundle\") pod \"64346632-5650-4f85-aaf8-8253c488d521\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.144198 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-config-data\") pod \"64346632-5650-4f85-aaf8-8253c488d521\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.144261 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-nova-metadata-tls-certs\") pod \"64346632-5650-4f85-aaf8-8253c488d521\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.144284 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hntzz\" (UniqueName: \"kubernetes.io/projected/64346632-5650-4f85-aaf8-8253c488d521-kube-api-access-hntzz\") pod \"64346632-5650-4f85-aaf8-8253c488d521\" (UID: \"64346632-5650-4f85-aaf8-8253c488d521\") " Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.144628 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7h84\" (UniqueName: \"kubernetes.io/projected/96ad1416-d1af-42b2-8fae-68574044a5e6-kube-api-access-m7h84\") pod 
\"nova-cell1-conductor-0\" (UID: \"96ad1416-d1af-42b2-8fae-68574044a5e6\") " pod="openstack/nova-cell1-conductor-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.144657 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ad1416-d1af-42b2-8fae-68574044a5e6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"96ad1416-d1af-42b2-8fae-68574044a5e6\") " pod="openstack/nova-cell1-conductor-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.144682 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ad1416-d1af-42b2-8fae-68574044a5e6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"96ad1416-d1af-42b2-8fae-68574044a5e6\") " pod="openstack/nova-cell1-conductor-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.144879 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.144912 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64346632-5650-4f85-aaf8-8253c488d521-logs" (OuterVolumeSpecName: "logs") pod "64346632-5650-4f85-aaf8-8253c488d521" (UID: "64346632-5650-4f85-aaf8-8253c488d521"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.151713 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.151732 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.151713 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.154270 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.155122 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64346632-5650-4f85-aaf8-8253c488d521-kube-api-access-hntzz" (OuterVolumeSpecName: "kube-api-access-hntzz") pod "64346632-5650-4f85-aaf8-8253c488d521" (UID: "64346632-5650-4f85-aaf8-8253c488d521"). InnerVolumeSpecName "kube-api-access-hntzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.193196 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64346632-5650-4f85-aaf8-8253c488d521" (UID: "64346632-5650-4f85-aaf8-8253c488d521"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.195650 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-config-data" (OuterVolumeSpecName: "config-data") pod "64346632-5650-4f85-aaf8-8253c488d521" (UID: "64346632-5650-4f85-aaf8-8253c488d521"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.231230 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "64346632-5650-4f85-aaf8-8253c488d521" (UID: "64346632-5650-4f85-aaf8-8253c488d521"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246321 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50dc8a50-27b4-4160-af1a-f4b6507a0061-log-httpd\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246419 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-scripts\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246463 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246502 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7h84\" (UniqueName: \"kubernetes.io/projected/96ad1416-d1af-42b2-8fae-68574044a5e6-kube-api-access-m7h84\") pod \"nova-cell1-conductor-0\" (UID: \"96ad1416-d1af-42b2-8fae-68574044a5e6\") " 
pod="openstack/nova-cell1-conductor-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246554 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ad1416-d1af-42b2-8fae-68574044a5e6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"96ad1416-d1af-42b2-8fae-68574044a5e6\") " pod="openstack/nova-cell1-conductor-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246592 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ad1416-d1af-42b2-8fae-68574044a5e6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"96ad1416-d1af-42b2-8fae-68574044a5e6\") " pod="openstack/nova-cell1-conductor-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246631 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-config-data\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246660 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246679 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm449\" (UniqueName: \"kubernetes.io/projected/50dc8a50-27b4-4160-af1a-f4b6507a0061-kube-api-access-pm449\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246711 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246748 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50dc8a50-27b4-4160-af1a-f4b6507a0061-run-httpd\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246802 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246816 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246843 4790 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/64346632-5650-4f85-aaf8-8253c488d521-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.246853 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hntzz\" (UniqueName: \"kubernetes.io/projected/64346632-5650-4f85-aaf8-8253c488d521-kube-api-access-hntzz\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.247962 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64346632-5650-4f85-aaf8-8253c488d521-logs\") on node 
\"crc\" DevicePath \"\"" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.250629 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ad1416-d1af-42b2-8fae-68574044a5e6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"96ad1416-d1af-42b2-8fae-68574044a5e6\") " pod="openstack/nova-cell1-conductor-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.253814 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ad1416-d1af-42b2-8fae-68574044a5e6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"96ad1416-d1af-42b2-8fae-68574044a5e6\") " pod="openstack/nova-cell1-conductor-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.261352 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7h84\" (UniqueName: \"kubernetes.io/projected/96ad1416-d1af-42b2-8fae-68574044a5e6-kube-api-access-m7h84\") pod \"nova-cell1-conductor-0\" (UID: \"96ad1416-d1af-42b2-8fae-68574044a5e6\") " pod="openstack/nova-cell1-conductor-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.349685 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50dc8a50-27b4-4160-af1a-f4b6507a0061-log-httpd\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.350146 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-scripts\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.350218 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.350304 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-config-data\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.350372 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.350422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm449\" (UniqueName: \"kubernetes.io/projected/50dc8a50-27b4-4160-af1a-f4b6507a0061-kube-api-access-pm449\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.350500 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.350573 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50dc8a50-27b4-4160-af1a-f4b6507a0061-run-httpd\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 
12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.351390 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50dc8a50-27b4-4160-af1a-f4b6507a0061-run-httpd\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.356204 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50dc8a50-27b4-4160-af1a-f4b6507a0061-log-httpd\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.356474 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-config-data\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.356980 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.357130 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-scripts\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.360515 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.361206 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.379893 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.391333 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm449\" (UniqueName: \"kubernetes.io/projected/50dc8a50-27b4-4160-af1a-f4b6507a0061-kube-api-access-pm449\") pod \"ceilometer-0\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.512315 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.693731 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaba66e9-85bd-4d49-bd1c-0facd7ae5d41" path="/var/lib/kubelet/pods/eaba66e9-85bd-4d49-bd1c-0facd7ae5d41/volumes" Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.930472 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Apr 06 12:21:21 crc kubenswrapper[4790]: I0406 12:21:21.951270 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Apr 06 12:21:21 crc kubenswrapper[4790]: E0406 12:21:21.974002 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Apr 06 12:21:21 crc kubenswrapper[4790]: E0406 12:21:21.977167 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Apr 06 12:21:21 crc kubenswrapper[4790]: E0406 12:21:21.979281 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Apr 06 12:21:21 crc kubenswrapper[4790]: E0406 12:21:21.979323 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="76bc1936-4c86-461a-a992-12927ef262e4" containerName="nova-scheduler-scheduler" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.098667 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.134729 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.155419 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.170357 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.172670 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.175156 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.175327 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.196088 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.269757 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp865\" (UniqueName: \"kubernetes.io/projected/92d8e1ab-19ef-4049-aaae-4b2f692730ca-kube-api-access-qp865\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.269875 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-config-data\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.269997 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.270117 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92d8e1ab-19ef-4049-aaae-4b2f692730ca-logs\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.270164 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.371985 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92d8e1ab-19ef-4049-aaae-4b2f692730ca-logs\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.372171 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.372316 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp865\" (UniqueName: \"kubernetes.io/projected/92d8e1ab-19ef-4049-aaae-4b2f692730ca-kube-api-access-qp865\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.372477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-config-data\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.372551 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92d8e1ab-19ef-4049-aaae-4b2f692730ca-logs\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.372670 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.376525 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 
12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.377365 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-config-data\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.377502 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.391961 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp865\" (UniqueName: \"kubernetes.io/projected/92d8e1ab-19ef-4049-aaae-4b2f692730ca-kube-api-access-qp865\") pod \"nova-metadata-0\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.511297 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.881515 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.965978 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50dc8a50-27b4-4160-af1a-f4b6507a0061","Type":"ContainerStarted","Data":"551497cafbe0e8b00775da80849f681feee943e9542ab21b88c86ec8b0172d18"} Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.966027 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50dc8a50-27b4-4160-af1a-f4b6507a0061","Type":"ContainerStarted","Data":"e519bafbac98ab3b2895afbba50aaa4d16cbef12bdf8ebee67bbb112d6284e31"} Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.966037 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50dc8a50-27b4-4160-af1a-f4b6507a0061","Type":"ContainerStarted","Data":"2fb0089ddaf02d837a8db34e101027b8699d171ba4b172dcd1253a80a8abd1f1"} Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.967802 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"96ad1416-d1af-42b2-8fae-68574044a5e6","Type":"ContainerStarted","Data":"35cf607e0f0946c15c7254cf5b8f0142ba4ca6fc2efb98d3025553d2836d7c31"} Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.967916 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"96ad1416-d1af-42b2-8fae-68574044a5e6","Type":"ContainerStarted","Data":"bc5f09f59f61d903437d875db9a59460bdf4191f7f2d92af6540385d0edd1bb9"} Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.969399 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.974984 4790 generic.go:334] "Generic (PLEG): container finished" podID="5737e9b4-7722-4a48-8f87-552403a32ebc" containerID="a17406adafe6b2ab98bbab9447ec1ed846598ee468f9fd99c25cc6fd404e0b0d" exitCode=0 Apr 
06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.975031 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5737e9b4-7722-4a48-8f87-552403a32ebc","Type":"ContainerDied","Data":"a17406adafe6b2ab98bbab9447ec1ed846598ee468f9fd99c25cc6fd404e0b0d"} Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.975057 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5737e9b4-7722-4a48-8f87-552403a32ebc","Type":"ContainerDied","Data":"4cc83bcfec1df8c3759b580d54d984031bce15646c0901a0b1193d3b3f3698aa"} Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.975094 4790 scope.go:117] "RemoveContainer" containerID="a17406adafe6b2ab98bbab9447ec1ed846598ee468f9fd99c25cc6fd404e0b0d" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.975244 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.995021 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5737e9b4-7722-4a48-8f87-552403a32ebc-combined-ca-bundle\") pod \"5737e9b4-7722-4a48-8f87-552403a32ebc\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.995193 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5737e9b4-7722-4a48-8f87-552403a32ebc-logs\") pod \"5737e9b4-7722-4a48-8f87-552403a32ebc\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " Apr 06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.995223 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnm85\" (UniqueName: \"kubernetes.io/projected/5737e9b4-7722-4a48-8f87-552403a32ebc-kube-api-access-cnm85\") pod \"5737e9b4-7722-4a48-8f87-552403a32ebc\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " Apr 
06 12:21:22 crc kubenswrapper[4790]: I0406 12:21:22.995371 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5737e9b4-7722-4a48-8f87-552403a32ebc-config-data\") pod \"5737e9b4-7722-4a48-8f87-552403a32ebc\" (UID: \"5737e9b4-7722-4a48-8f87-552403a32ebc\") " Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.009517 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5737e9b4-7722-4a48-8f87-552403a32ebc-logs" (OuterVolumeSpecName: "logs") pod "5737e9b4-7722-4a48-8f87-552403a32ebc" (UID: "5737e9b4-7722-4a48-8f87-552403a32ebc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.015203 4790 scope.go:117] "RemoveContainer" containerID="9bdecdeedd5f07db70950bfcd9598d89e4ca1ef4a1bcb7b60cf485f45954db7b" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.025055 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.024980221 podStartE2EDuration="3.024980221s" podCreationTimestamp="2026-04-06 12:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:21:23.009491241 +0000 UTC m=+1461.997234117" watchObservedRunningTime="2026-04-06 12:21:23.024980221 +0000 UTC m=+1462.012723097" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.032339 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5737e9b4-7722-4a48-8f87-552403a32ebc-kube-api-access-cnm85" (OuterVolumeSpecName: "kube-api-access-cnm85") pod "5737e9b4-7722-4a48-8f87-552403a32ebc" (UID: "5737e9b4-7722-4a48-8f87-552403a32ebc"). InnerVolumeSpecName "kube-api-access-cnm85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.059071 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.061352 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5737e9b4-7722-4a48-8f87-552403a32ebc-config-data" (OuterVolumeSpecName: "config-data") pod "5737e9b4-7722-4a48-8f87-552403a32ebc" (UID: "5737e9b4-7722-4a48-8f87-552403a32ebc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.079896 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5737e9b4-7722-4a48-8f87-552403a32ebc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5737e9b4-7722-4a48-8f87-552403a32ebc" (UID: "5737e9b4-7722-4a48-8f87-552403a32ebc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.082704 4790 scope.go:117] "RemoveContainer" containerID="a17406adafe6b2ab98bbab9447ec1ed846598ee468f9fd99c25cc6fd404e0b0d" Apr 06 12:21:23 crc kubenswrapper[4790]: E0406 12:21:23.083149 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a17406adafe6b2ab98bbab9447ec1ed846598ee468f9fd99c25cc6fd404e0b0d\": container with ID starting with a17406adafe6b2ab98bbab9447ec1ed846598ee468f9fd99c25cc6fd404e0b0d not found: ID does not exist" containerID="a17406adafe6b2ab98bbab9447ec1ed846598ee468f9fd99c25cc6fd404e0b0d" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.083192 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a17406adafe6b2ab98bbab9447ec1ed846598ee468f9fd99c25cc6fd404e0b0d"} err="failed to get container status \"a17406adafe6b2ab98bbab9447ec1ed846598ee468f9fd99c25cc6fd404e0b0d\": rpc error: code = NotFound desc = could not find container \"a17406adafe6b2ab98bbab9447ec1ed846598ee468f9fd99c25cc6fd404e0b0d\": container with ID starting with a17406adafe6b2ab98bbab9447ec1ed846598ee468f9fd99c25cc6fd404e0b0d not found: ID does not exist" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.083217 4790 scope.go:117] "RemoveContainer" containerID="9bdecdeedd5f07db70950bfcd9598d89e4ca1ef4a1bcb7b60cf485f45954db7b" Apr 06 12:21:23 crc kubenswrapper[4790]: E0406 12:21:23.083391 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bdecdeedd5f07db70950bfcd9598d89e4ca1ef4a1bcb7b60cf485f45954db7b\": container with ID starting with 9bdecdeedd5f07db70950bfcd9598d89e4ca1ef4a1bcb7b60cf485f45954db7b not found: ID does not exist" containerID="9bdecdeedd5f07db70950bfcd9598d89e4ca1ef4a1bcb7b60cf485f45954db7b" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.083416 
4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bdecdeedd5f07db70950bfcd9598d89e4ca1ef4a1bcb7b60cf485f45954db7b"} err="failed to get container status \"9bdecdeedd5f07db70950bfcd9598d89e4ca1ef4a1bcb7b60cf485f45954db7b\": rpc error: code = NotFound desc = could not find container \"9bdecdeedd5f07db70950bfcd9598d89e4ca1ef4a1bcb7b60cf485f45954db7b\": container with ID starting with 9bdecdeedd5f07db70950bfcd9598d89e4ca1ef4a1bcb7b60cf485f45954db7b not found: ID does not exist" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.098817 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5737e9b4-7722-4a48-8f87-552403a32ebc-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.098865 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnm85\" (UniqueName: \"kubernetes.io/projected/5737e9b4-7722-4a48-8f87-552403a32ebc-kube-api-access-cnm85\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.098877 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5737e9b4-7722-4a48-8f87-552403a32ebc-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.098887 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5737e9b4-7722-4a48-8f87-552403a32ebc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.335445 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.358479 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.376191 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-0"] Apr 06 12:21:23 crc kubenswrapper[4790]: E0406 12:21:23.376667 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5737e9b4-7722-4a48-8f87-552403a32ebc" containerName="nova-api-api" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.376684 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5737e9b4-7722-4a48-8f87-552403a32ebc" containerName="nova-api-api" Apr 06 12:21:23 crc kubenswrapper[4790]: E0406 12:21:23.376706 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5737e9b4-7722-4a48-8f87-552403a32ebc" containerName="nova-api-log" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.376712 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5737e9b4-7722-4a48-8f87-552403a32ebc" containerName="nova-api-log" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.376920 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5737e9b4-7722-4a48-8f87-552403a32ebc" containerName="nova-api-log" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.376949 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5737e9b4-7722-4a48-8f87-552403a32ebc" containerName="nova-api-api" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.377932 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.380729 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.383469 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.509136 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/937bfac1-c6d1-4e86-948f-a7c0144899bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " pod="openstack/nova-api-0" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.509194 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr792\" (UniqueName: \"kubernetes.io/projected/937bfac1-c6d1-4e86-948f-a7c0144899bb-kube-api-access-tr792\") pod \"nova-api-0\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " pod="openstack/nova-api-0" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.509403 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/937bfac1-c6d1-4e86-948f-a7c0144899bb-logs\") pod \"nova-api-0\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " pod="openstack/nova-api-0" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.509457 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/937bfac1-c6d1-4e86-948f-a7c0144899bb-config-data\") pod \"nova-api-0\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " pod="openstack/nova-api-0" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.610764 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/937bfac1-c6d1-4e86-948f-a7c0144899bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " pod="openstack/nova-api-0" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.610841 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr792\" (UniqueName: \"kubernetes.io/projected/937bfac1-c6d1-4e86-948f-a7c0144899bb-kube-api-access-tr792\") pod \"nova-api-0\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " pod="openstack/nova-api-0" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.610913 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/937bfac1-c6d1-4e86-948f-a7c0144899bb-logs\") pod \"nova-api-0\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " pod="openstack/nova-api-0" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.610972 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/937bfac1-c6d1-4e86-948f-a7c0144899bb-config-data\") pod \"nova-api-0\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " pod="openstack/nova-api-0" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.611350 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/937bfac1-c6d1-4e86-948f-a7c0144899bb-logs\") pod \"nova-api-0\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " pod="openstack/nova-api-0" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.615901 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/937bfac1-c6d1-4e86-948f-a7c0144899bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " pod="openstack/nova-api-0" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.620432 
4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/937bfac1-c6d1-4e86-948f-a7c0144899bb-config-data\") pod \"nova-api-0\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " pod="openstack/nova-api-0" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.631754 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr792\" (UniqueName: \"kubernetes.io/projected/937bfac1-c6d1-4e86-948f-a7c0144899bb-kube-api-access-tr792\") pod \"nova-api-0\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " pod="openstack/nova-api-0" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.687119 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5737e9b4-7722-4a48-8f87-552403a32ebc" path="/var/lib/kubelet/pods/5737e9b4-7722-4a48-8f87-552403a32ebc/volumes" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.688079 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64346632-5650-4f85-aaf8-8253c488d521" path="/var/lib/kubelet/pods/64346632-5650-4f85-aaf8-8253c488d521/volumes" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.699086 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.988421 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"92d8e1ab-19ef-4049-aaae-4b2f692730ca","Type":"ContainerStarted","Data":"67c7c7d336f65aa772faf066480da3739165e50b37d9cf98edd7609d5510eeda"} Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.988769 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"92d8e1ab-19ef-4049-aaae-4b2f692730ca","Type":"ContainerStarted","Data":"c5ce5510995f435a20f584f1d557f5ebc3cafcc7b662868d8ea1659a33e28e6c"} Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.988787 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"92d8e1ab-19ef-4049-aaae-4b2f692730ca","Type":"ContainerStarted","Data":"39d7b605634002dd238afa2ae5bb11262261c71cf47d1276d76da6cde6c91f22"} Apr 06 12:21:23 crc kubenswrapper[4790]: I0406 12:21:23.992096 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50dc8a50-27b4-4160-af1a-f4b6507a0061","Type":"ContainerStarted","Data":"2f07c7df2f95f9c584cd21d52c894b8f6d4d26967ab738e317caf87928d73d53"} Apr 06 12:21:24 crc kubenswrapper[4790]: I0406 12:21:24.010911 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.010891955 podStartE2EDuration="2.010891955s" podCreationTimestamp="2026-04-06 12:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:21:24.004305954 +0000 UTC m=+1462.992048820" watchObservedRunningTime="2026-04-06 12:21:24.010891955 +0000 UTC m=+1462.998634821" Apr 06 12:21:24 crc kubenswrapper[4790]: I0406 12:21:24.152608 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:21:24 crc 
kubenswrapper[4790]: I0406 12:21:24.753414 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Apr 06 12:21:24 crc kubenswrapper[4790]: I0406 12:21:24.839356 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bc1936-4c86-461a-a992-12927ef262e4-combined-ca-bundle\") pod \"76bc1936-4c86-461a-a992-12927ef262e4\" (UID: \"76bc1936-4c86-461a-a992-12927ef262e4\") " Apr 06 12:21:24 crc kubenswrapper[4790]: I0406 12:21:24.839479 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcbv6\" (UniqueName: \"kubernetes.io/projected/76bc1936-4c86-461a-a992-12927ef262e4-kube-api-access-bcbv6\") pod \"76bc1936-4c86-461a-a992-12927ef262e4\" (UID: \"76bc1936-4c86-461a-a992-12927ef262e4\") " Apr 06 12:21:24 crc kubenswrapper[4790]: I0406 12:21:24.839642 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bc1936-4c86-461a-a992-12927ef262e4-config-data\") pod \"76bc1936-4c86-461a-a992-12927ef262e4\" (UID: \"76bc1936-4c86-461a-a992-12927ef262e4\") " Apr 06 12:21:24 crc kubenswrapper[4790]: I0406 12:21:24.845101 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76bc1936-4c86-461a-a992-12927ef262e4-kube-api-access-bcbv6" (OuterVolumeSpecName: "kube-api-access-bcbv6") pod "76bc1936-4c86-461a-a992-12927ef262e4" (UID: "76bc1936-4c86-461a-a992-12927ef262e4"). InnerVolumeSpecName "kube-api-access-bcbv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:21:24 crc kubenswrapper[4790]: I0406 12:21:24.869892 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76bc1936-4c86-461a-a992-12927ef262e4-config-data" (OuterVolumeSpecName: "config-data") pod "76bc1936-4c86-461a-a992-12927ef262e4" (UID: "76bc1936-4c86-461a-a992-12927ef262e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:24 crc kubenswrapper[4790]: I0406 12:21:24.870448 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76bc1936-4c86-461a-a992-12927ef262e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76bc1936-4c86-461a-a992-12927ef262e4" (UID: "76bc1936-4c86-461a-a992-12927ef262e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:24 crc kubenswrapper[4790]: I0406 12:21:24.942267 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcbv6\" (UniqueName: \"kubernetes.io/projected/76bc1936-4c86-461a-a992-12927ef262e4-kube-api-access-bcbv6\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:24 crc kubenswrapper[4790]: I0406 12:21:24.942301 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76bc1936-4c86-461a-a992-12927ef262e4-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:24 crc kubenswrapper[4790]: I0406 12:21:24.942311 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76bc1936-4c86-461a-a992-12927ef262e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.011743 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"937bfac1-c6d1-4e86-948f-a7c0144899bb","Type":"ContainerStarted","Data":"3a865c3f552715b3c6ecd93114c9cb35fc5f993066cc23ffb594f3903b528213"} Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.011819 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"937bfac1-c6d1-4e86-948f-a7c0144899bb","Type":"ContainerStarted","Data":"6bec3ee22ed7abe2881cffebf2e206c83a30b2c40c957ee46b7d98a401726b51"} Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.011865 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"937bfac1-c6d1-4e86-948f-a7c0144899bb","Type":"ContainerStarted","Data":"f06ef39af4d8aaa3156bdf4401a43152b51f65f048c694fe6e5085c7563447ef"} Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.013857 4790 generic.go:334] "Generic (PLEG): container finished" podID="76bc1936-4c86-461a-a992-12927ef262e4" containerID="4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f" exitCode=0 Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.013961 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"76bc1936-4c86-461a-a992-12927ef262e4","Type":"ContainerDied","Data":"4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f"} Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.014028 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"76bc1936-4c86-461a-a992-12927ef262e4","Type":"ContainerDied","Data":"a966fe9544322bc7ab47a481beacfbc9a5b46978187914b9e5498b0b88cde670"} Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.014073 4790 scope.go:117] "RemoveContainer" containerID="4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.014086 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.030489 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.030471546 podStartE2EDuration="2.030471546s" podCreationTimestamp="2026-04-06 12:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:21:25.026395088 +0000 UTC m=+1464.014137954" watchObservedRunningTime="2026-04-06 12:21:25.030471546 +0000 UTC m=+1464.018214402" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.058478 4790 scope.go:117] "RemoveContainer" containerID="4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f" Apr 06 12:21:25 crc kubenswrapper[4790]: E0406 12:21:25.059534 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f\": container with ID starting with 4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f not found: ID does not exist" containerID="4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.059584 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f"} err="failed to get container status \"4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f\": rpc error: code = NotFound desc = could not find container \"4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f\": container with ID starting with 4198f696c2100a8198df16dc6b3e874d2c4ae60d6a276e343dc0bd64b28ab28f not found: ID does not exist" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.083031 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] 
Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.107646 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.122147 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Apr 06 12:21:25 crc kubenswrapper[4790]: E0406 12:21:25.122639 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76bc1936-4c86-461a-a992-12927ef262e4" containerName="nova-scheduler-scheduler" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.122656 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="76bc1936-4c86-461a-a992-12927ef262e4" containerName="nova-scheduler-scheduler" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.122869 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="76bc1936-4c86-461a-a992-12927ef262e4" containerName="nova-scheduler-scheduler" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.123675 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.126362 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.130032 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.247557 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd6x9\" (UniqueName: \"kubernetes.io/projected/c8fc7d56-62a8-407c-b535-d60c6e81feec-kube-api-access-sd6x9\") pod \"nova-scheduler-0\" (UID: \"c8fc7d56-62a8-407c-b535-d60c6e81feec\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.248107 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8fc7d56-62a8-407c-b535-d60c6e81feec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c8fc7d56-62a8-407c-b535-d60c6e81feec\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.248344 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8fc7d56-62a8-407c-b535-d60c6e81feec-config-data\") pod \"nova-scheduler-0\" (UID: \"c8fc7d56-62a8-407c-b535-d60c6e81feec\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.320675 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.350305 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8fc7d56-62a8-407c-b535-d60c6e81feec-config-data\") pod \"nova-scheduler-0\" (UID: 
\"c8fc7d56-62a8-407c-b535-d60c6e81feec\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.350407 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd6x9\" (UniqueName: \"kubernetes.io/projected/c8fc7d56-62a8-407c-b535-d60c6e81feec-kube-api-access-sd6x9\") pod \"nova-scheduler-0\" (UID: \"c8fc7d56-62a8-407c-b535-d60c6e81feec\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.350462 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8fc7d56-62a8-407c-b535-d60c6e81feec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c8fc7d56-62a8-407c-b535-d60c6e81feec\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.355530 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8fc7d56-62a8-407c-b535-d60c6e81feec-config-data\") pod \"nova-scheduler-0\" (UID: \"c8fc7d56-62a8-407c-b535-d60c6e81feec\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.356256 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8fc7d56-62a8-407c-b535-d60c6e81feec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c8fc7d56-62a8-407c-b535-d60c6e81feec\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.404326 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd6x9\" (UniqueName: \"kubernetes.io/projected/c8fc7d56-62a8-407c-b535-d60c6e81feec-kube-api-access-sd6x9\") pod \"nova-scheduler-0\" (UID: \"c8fc7d56-62a8-407c-b535-d60c6e81feec\") " pod="openstack/nova-scheduler-0" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.416588 4790 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bmjmx"] Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.421286 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.435358 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bmjmx"] Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.446319 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.556203 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-catalog-content\") pod \"redhat-operators-bmjmx\" (UID: \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\") " pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.556484 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75jzm\" (UniqueName: \"kubernetes.io/projected/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-kube-api-access-75jzm\") pod \"redhat-operators-bmjmx\" (UID: \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\") " pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.556593 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-utilities\") pod \"redhat-operators-bmjmx\" (UID: \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\") " pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.658022 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-75jzm\" (UniqueName: \"kubernetes.io/projected/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-kube-api-access-75jzm\") pod \"redhat-operators-bmjmx\" (UID: \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\") " pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.658352 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-utilities\") pod \"redhat-operators-bmjmx\" (UID: \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\") " pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.658480 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-catalog-content\") pod \"redhat-operators-bmjmx\" (UID: \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\") " pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.659005 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-utilities\") pod \"redhat-operators-bmjmx\" (UID: \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\") " pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.662969 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-catalog-content\") pod \"redhat-operators-bmjmx\" (UID: \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\") " pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.701808 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75jzm\" (UniqueName: 
\"kubernetes.io/projected/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-kube-api-access-75jzm\") pod \"redhat-operators-bmjmx\" (UID: \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\") " pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.705401 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76bc1936-4c86-461a-a992-12927ef262e4" path="/var/lib/kubelet/pods/76bc1936-4c86-461a-a992-12927ef262e4/volumes" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.758774 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:25 crc kubenswrapper[4790]: I0406 12:21:25.975183 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Apr 06 12:21:25 crc kubenswrapper[4790]: W0406 12:21:25.989510 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8fc7d56_62a8_407c_b535_d60c6e81feec.slice/crio-e976caf28b34152ab37553377a44d9c3df52574980c8f158fab0e098371caaed WatchSource:0}: Error finding container e976caf28b34152ab37553377a44d9c3df52574980c8f158fab0e098371caaed: Status 404 returned error can't find the container with id e976caf28b34152ab37553377a44d9c3df52574980c8f158fab0e098371caaed Apr 06 12:21:26 crc kubenswrapper[4790]: I0406 12:21:26.059121 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50dc8a50-27b4-4160-af1a-f4b6507a0061","Type":"ContainerStarted","Data":"a9db9525e577a2aca53c98b5863f36b80a8c2503c9d6f88651f749b6576321d9"} Apr 06 12:21:26 crc kubenswrapper[4790]: I0406 12:21:26.063083 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Apr 06 12:21:26 crc kubenswrapper[4790]: I0406 12:21:26.064264 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"c8fc7d56-62a8-407c-b535-d60c6e81feec","Type":"ContainerStarted","Data":"e976caf28b34152ab37553377a44d9c3df52574980c8f158fab0e098371caaed"} Apr 06 12:21:26 crc kubenswrapper[4790]: I0406 12:21:26.091787 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.232172034 podStartE2EDuration="5.091770142s" podCreationTimestamp="2026-04-06 12:21:21 +0000 UTC" firstStartedPulling="2026-04-06 12:21:22.112306307 +0000 UTC m=+1461.100049173" lastFinishedPulling="2026-04-06 12:21:24.971904415 +0000 UTC m=+1463.959647281" observedRunningTime="2026-04-06 12:21:26.081916865 +0000 UTC m=+1465.069659721" watchObservedRunningTime="2026-04-06 12:21:26.091770142 +0000 UTC m=+1465.079513008" Apr 06 12:21:26 crc kubenswrapper[4790]: W0406 12:21:26.230761 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8db8bab_8ce9_4fdc_b237_f3e1126f92ab.slice/crio-3ed88c19150e7a97b40fbb9492af2553d447bfa5f1b342542d7c219d1fca97d5 WatchSource:0}: Error finding container 3ed88c19150e7a97b40fbb9492af2553d447bfa5f1b342542d7c219d1fca97d5: Status 404 returned error can't find the container with id 3ed88c19150e7a97b40fbb9492af2553d447bfa5f1b342542d7c219d1fca97d5 Apr 06 12:21:26 crc kubenswrapper[4790]: I0406 12:21:26.232276 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bmjmx"] Apr 06 12:21:27 crc kubenswrapper[4790]: I0406 12:21:27.074665 4790 generic.go:334] "Generic (PLEG): container finished" podID="a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" containerID="530ec5eb7584131d02f075c19ad9573c7111211f21e74417bcf8d37e16de0f75" exitCode=0 Apr 06 12:21:27 crc kubenswrapper[4790]: I0406 12:21:27.074760 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bmjmx" 
event={"ID":"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab","Type":"ContainerDied","Data":"530ec5eb7584131d02f075c19ad9573c7111211f21e74417bcf8d37e16de0f75"} Apr 06 12:21:27 crc kubenswrapper[4790]: I0406 12:21:27.075129 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bmjmx" event={"ID":"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab","Type":"ContainerStarted","Data":"3ed88c19150e7a97b40fbb9492af2553d447bfa5f1b342542d7c219d1fca97d5"} Apr 06 12:21:27 crc kubenswrapper[4790]: I0406 12:21:27.077617 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c8fc7d56-62a8-407c-b535-d60c6e81feec","Type":"ContainerStarted","Data":"e26c6b50662b762b9ed32c0d48679d0213aa17324b3ff42f9151e93533835cd6"} Apr 06 12:21:27 crc kubenswrapper[4790]: I0406 12:21:27.124161 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.124145716 podStartE2EDuration="2.124145716s" podCreationTimestamp="2026-04-06 12:21:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:21:27.12185314 +0000 UTC m=+1466.109596016" watchObservedRunningTime="2026-04-06 12:21:27.124145716 +0000 UTC m=+1466.111888582" Apr 06 12:21:28 crc kubenswrapper[4790]: I0406 12:21:28.106064 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bmjmx" event={"ID":"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab","Type":"ContainerStarted","Data":"34cc097ca09840c06488489c5dfaf594d5345d66fa2001a458be42bc066b3df1"} Apr 06 12:21:29 crc kubenswrapper[4790]: I0406 12:21:29.122464 4790 generic.go:334] "Generic (PLEG): container finished" podID="a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" containerID="34cc097ca09840c06488489c5dfaf594d5345d66fa2001a458be42bc066b3df1" exitCode=0 Apr 06 12:21:29 crc kubenswrapper[4790]: I0406 12:21:29.122519 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bmjmx" event={"ID":"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab","Type":"ContainerDied","Data":"34cc097ca09840c06488489c5dfaf594d5345d66fa2001a458be42bc066b3df1"} Apr 06 12:21:30 crc kubenswrapper[4790]: I0406 12:21:30.134300 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bmjmx" event={"ID":"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab","Type":"ContainerStarted","Data":"eca3827520ec1135dfd0eb50c575661455582d1e2261b52ba8c132c879e42882"} Apr 06 12:21:30 crc kubenswrapper[4790]: I0406 12:21:30.152247 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bmjmx" podStartSLOduration=2.730633418 podStartE2EDuration="5.152224821s" podCreationTimestamp="2026-04-06 12:21:25 +0000 UTC" firstStartedPulling="2026-04-06 12:21:27.076332776 +0000 UTC m=+1466.064075642" lastFinishedPulling="2026-04-06 12:21:29.497924179 +0000 UTC m=+1468.485667045" observedRunningTime="2026-04-06 12:21:30.148721749 +0000 UTC m=+1469.136464645" watchObservedRunningTime="2026-04-06 12:21:30.152224821 +0000 UTC m=+1469.139967697" Apr 06 12:21:30 crc kubenswrapper[4790]: I0406 12:21:30.447350 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Apr 06 12:21:31 crc kubenswrapper[4790]: I0406 12:21:31.384118 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Apr 06 12:21:32 crc kubenswrapper[4790]: I0406 12:21:32.512418 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Apr 06 12:21:32 crc kubenswrapper[4790]: I0406 12:21:32.512756 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Apr 06 12:21:33 crc kubenswrapper[4790]: I0406 12:21:33.523975 4790 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="92d8e1ab-19ef-4049-aaae-4b2f692730ca" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.237:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Apr 06 12:21:33 crc kubenswrapper[4790]: I0406 12:21:33.524008 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="92d8e1ab-19ef-4049-aaae-4b2f692730ca" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.237:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Apr 06 12:21:33 crc kubenswrapper[4790]: I0406 12:21:33.699906 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Apr 06 12:21:33 crc kubenswrapper[4790]: I0406 12:21:33.699949 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Apr 06 12:21:34 crc kubenswrapper[4790]: I0406 12:21:34.782166 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="937bfac1-c6d1-4e86-948f-a7c0144899bb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.238:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 06 12:21:34 crc kubenswrapper[4790]: I0406 12:21:34.782393 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="937bfac1-c6d1-4e86-948f-a7c0144899bb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.238:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 06 12:21:35 crc kubenswrapper[4790]: I0406 12:21:35.447254 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Apr 06 12:21:35 crc kubenswrapper[4790]: I0406 12:21:35.477093 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Apr 06 12:21:35 crc kubenswrapper[4790]: I0406 12:21:35.759758 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:35 crc kubenswrapper[4790]: I0406 12:21:35.759800 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:36 crc kubenswrapper[4790]: I0406 12:21:36.279336 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Apr 06 12:21:36 crc kubenswrapper[4790]: I0406 12:21:36.830055 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bmjmx" podUID="a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" containerName="registry-server" probeResult="failure" output=< Apr 06 12:21:36 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 12:21:36 crc kubenswrapper[4790]: > Apr 06 12:21:39 crc kubenswrapper[4790]: I0406 12:21:39.753695 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:21:39 crc kubenswrapper[4790]: I0406 12:21:39.754057 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:21:40 crc kubenswrapper[4790]: I0406 12:21:40.511900 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Apr 06 12:21:40 crc kubenswrapper[4790]: I0406 12:21:40.511975 4790 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Apr 06 12:21:41 crc kubenswrapper[4790]: I0406 12:21:41.699800 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Apr 06 12:21:41 crc kubenswrapper[4790]: I0406 12:21:41.700145 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Apr 06 12:21:42 crc kubenswrapper[4790]: I0406 12:21:42.520400 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Apr 06 12:21:42 crc kubenswrapper[4790]: I0406 12:21:42.521353 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Apr 06 12:21:42 crc kubenswrapper[4790]: I0406 12:21:42.531646 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.230430 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.290072 4790 generic.go:334] "Generic (PLEG): container finished" podID="fad48575-e1aa-45f2-b8ba-ed7303d1d9e4" containerID="abfa804a0c5a95b4acf8fcd06d27adcf3710a0b36b603413d2f45fe3d09dea54" exitCode=137 Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.290379 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.290645 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4","Type":"ContainerDied","Data":"abfa804a0c5a95b4acf8fcd06d27adcf3710a0b36b603413d2f45fe3d09dea54"} Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.290700 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4","Type":"ContainerDied","Data":"3376922f262a60e682e630f456a1c9d9d2aa31270ca979c2430f91eca683d9c3"} Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.290718 4790 scope.go:117] "RemoveContainer" containerID="abfa804a0c5a95b4acf8fcd06d27adcf3710a0b36b603413d2f45fe3d09dea54" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.296292 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.313169 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-combined-ca-bundle\") pod \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\" (UID: \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\") " Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.313266 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xksj\" (UniqueName: \"kubernetes.io/projected/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-kube-api-access-4xksj\") pod \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\" (UID: \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\") " Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.313375 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-config-data\") pod \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\" (UID: \"fad48575-e1aa-45f2-b8ba-ed7303d1d9e4\") " Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.328367 4790 scope.go:117] "RemoveContainer" containerID="abfa804a0c5a95b4acf8fcd06d27adcf3710a0b36b603413d2f45fe3d09dea54" Apr 06 12:21:43 crc kubenswrapper[4790]: E0406 12:21:43.329273 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abfa804a0c5a95b4acf8fcd06d27adcf3710a0b36b603413d2f45fe3d09dea54\": container with ID starting with abfa804a0c5a95b4acf8fcd06d27adcf3710a0b36b603413d2f45fe3d09dea54 not found: ID does not exist" containerID="abfa804a0c5a95b4acf8fcd06d27adcf3710a0b36b603413d2f45fe3d09dea54" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.329313 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abfa804a0c5a95b4acf8fcd06d27adcf3710a0b36b603413d2f45fe3d09dea54"} err="failed to get container status \"abfa804a0c5a95b4acf8fcd06d27adcf3710a0b36b603413d2f45fe3d09dea54\": rpc error: code = NotFound desc = could not find container \"abfa804a0c5a95b4acf8fcd06d27adcf3710a0b36b603413d2f45fe3d09dea54\": container with ID starting with abfa804a0c5a95b4acf8fcd06d27adcf3710a0b36b603413d2f45fe3d09dea54 not found: ID does not exist" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.371663 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-kube-api-access-4xksj" (OuterVolumeSpecName: "kube-api-access-4xksj") pod "fad48575-e1aa-45f2-b8ba-ed7303d1d9e4" (UID: "fad48575-e1aa-45f2-b8ba-ed7303d1d9e4"). InnerVolumeSpecName "kube-api-access-4xksj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.375612 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-config-data" (OuterVolumeSpecName: "config-data") pod "fad48575-e1aa-45f2-b8ba-ed7303d1d9e4" (UID: "fad48575-e1aa-45f2-b8ba-ed7303d1d9e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.376217 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fad48575-e1aa-45f2-b8ba-ed7303d1d9e4" (UID: "fad48575-e1aa-45f2-b8ba-ed7303d1d9e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.416216 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.416548 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.416563 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xksj\" (UniqueName: \"kubernetes.io/projected/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4-kube-api-access-4xksj\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.622944 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.645039 4790 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.655266 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 06 12:21:43 crc kubenswrapper[4790]: E0406 12:21:43.655671 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad48575-e1aa-45f2-b8ba-ed7303d1d9e4" containerName="nova-cell1-novncproxy-novncproxy" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.655688 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad48575-e1aa-45f2-b8ba-ed7303d1d9e4" containerName="nova-cell1-novncproxy-novncproxy" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.655924 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad48575-e1aa-45f2-b8ba-ed7303d1d9e4" containerName="nova-cell1-novncproxy-novncproxy" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.656556 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.659385 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.659426 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.659551 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.670923 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.693993 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fad48575-e1aa-45f2-b8ba-ed7303d1d9e4" path="/var/lib/kubelet/pods/fad48575-e1aa-45f2-b8ba-ed7303d1d9e4/volumes" 
Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.710898 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.712083 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.720217 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.722259 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c07d38c-a3ad-48d8-948c-7659351eade5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.722360 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c07d38c-a3ad-48d8-948c-7659351eade5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.722420 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84vfr\" (UniqueName: \"kubernetes.io/projected/8c07d38c-a3ad-48d8-948c-7659351eade5-kube-api-access-84vfr\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.722684 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c07d38c-a3ad-48d8-948c-7659351eade5-vencrypt-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.722729 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c07d38c-a3ad-48d8-948c-7659351eade5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.824411 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c07d38c-a3ad-48d8-948c-7659351eade5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.824477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c07d38c-a3ad-48d8-948c-7659351eade5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.824553 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c07d38c-a3ad-48d8-948c-7659351eade5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.824629 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c07d38c-a3ad-48d8-948c-7659351eade5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.824675 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84vfr\" (UniqueName: \"kubernetes.io/projected/8c07d38c-a3ad-48d8-948c-7659351eade5-kube-api-access-84vfr\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.828836 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c07d38c-a3ad-48d8-948c-7659351eade5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.829109 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c07d38c-a3ad-48d8-948c-7659351eade5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.830083 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c07d38c-a3ad-48d8-948c-7659351eade5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.833708 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c07d38c-a3ad-48d8-948c-7659351eade5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 
crc kubenswrapper[4790]: I0406 12:21:43.841399 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84vfr\" (UniqueName: \"kubernetes.io/projected/8c07d38c-a3ad-48d8-948c-7659351eade5-kube-api-access-84vfr\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c07d38c-a3ad-48d8-948c-7659351eade5\") " pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:43 crc kubenswrapper[4790]: I0406 12:21:43.972602 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.309331 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Apr 06 12:21:44 crc kubenswrapper[4790]: W0406 12:21:44.510163 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c07d38c_a3ad_48d8_948c_7659351eade5.slice/crio-bc77b93f7fa1a4b7d0d47b7f173bfcdc04e6bab341ae1221d21551db8f3022bf WatchSource:0}: Error finding container bc77b93f7fa1a4b7d0d47b7f173bfcdc04e6bab341ae1221d21551db8f3022bf: Status 404 returned error can't find the container with id bc77b93f7fa1a4b7d0d47b7f173bfcdc04e6bab341ae1221d21551db8f3022bf Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.522040 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.561908 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77f587c5df-58dj2"] Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.563605 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.579071 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f587c5df-58dj2"] Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.656657 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-ovsdbserver-sb\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.656719 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-ovsdbserver-nb\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.656800 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-dns-svc\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.656905 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr7hm\" (UniqueName: \"kubernetes.io/projected/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-kube-api-access-hr7hm\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.656946 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-config\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.656977 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-dns-swift-storage-0\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.758347 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr7hm\" (UniqueName: \"kubernetes.io/projected/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-kube-api-access-hr7hm\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.758383 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-config\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.758413 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-dns-swift-storage-0\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.758456 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-ovsdbserver-sb\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.758489 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-ovsdbserver-nb\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.758549 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-dns-svc\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.759379 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-dns-svc\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.760790 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-dns-swift-storage-0\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.760991 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-ovsdbserver-sb\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.761145 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-config\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.761565 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-ovsdbserver-nb\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.780972 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr7hm\" (UniqueName: \"kubernetes.io/projected/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-kube-api-access-hr7hm\") pod \"dnsmasq-dns-77f587c5df-58dj2\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") " pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:44 crc kubenswrapper[4790]: I0406 12:21:44.940405 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.314586 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8c07d38c-a3ad-48d8-948c-7659351eade5","Type":"ContainerStarted","Data":"6a0c7fc85f2c93c76236f1d100dc6137ccc1cc09c42ae1c76a4b310f49c977bf"} Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.314922 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8c07d38c-a3ad-48d8-948c-7659351eade5","Type":"ContainerStarted","Data":"bc77b93f7fa1a4b7d0d47b7f173bfcdc04e6bab341ae1221d21551db8f3022bf"} Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.328000 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h5tg2"] Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.330895 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5tg2" Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.348639 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.3486210339999998 podStartE2EDuration="2.348621034s" podCreationTimestamp="2026-04-06 12:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:21:45.33196503 +0000 UTC m=+1484.319707916" watchObservedRunningTime="2026-04-06 12:21:45.348621034 +0000 UTC m=+1484.336363900" Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.348986 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5tg2"] Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.371446 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/305f1532-8547-4e7a-a6f3-2a86f8692112-catalog-content\") pod \"community-operators-h5tg2\" (UID: \"305f1532-8547-4e7a-a6f3-2a86f8692112\") " pod="openshift-marketplace/community-operators-h5tg2" Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.371904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305f1532-8547-4e7a-a6f3-2a86f8692112-utilities\") pod \"community-operators-h5tg2\" (UID: \"305f1532-8547-4e7a-a6f3-2a86f8692112\") " pod="openshift-marketplace/community-operators-h5tg2" Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.372292 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8lws\" (UniqueName: \"kubernetes.io/projected/305f1532-8547-4e7a-a6f3-2a86f8692112-kube-api-access-c8lws\") pod \"community-operators-h5tg2\" (UID: \"305f1532-8547-4e7a-a6f3-2a86f8692112\") " pod="openshift-marketplace/community-operators-h5tg2" Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.416082 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f587c5df-58dj2"] Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.473631 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305f1532-8547-4e7a-a6f3-2a86f8692112-utilities\") pod \"community-operators-h5tg2\" (UID: \"305f1532-8547-4e7a-a6f3-2a86f8692112\") " pod="openshift-marketplace/community-operators-h5tg2" Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.473672 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8lws\" (UniqueName: \"kubernetes.io/projected/305f1532-8547-4e7a-a6f3-2a86f8692112-kube-api-access-c8lws\") pod \"community-operators-h5tg2\" (UID: \"305f1532-8547-4e7a-a6f3-2a86f8692112\") " 
pod="openshift-marketplace/community-operators-h5tg2" Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.473743 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305f1532-8547-4e7a-a6f3-2a86f8692112-catalog-content\") pod \"community-operators-h5tg2\" (UID: \"305f1532-8547-4e7a-a6f3-2a86f8692112\") " pod="openshift-marketplace/community-operators-h5tg2" Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.474267 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305f1532-8547-4e7a-a6f3-2a86f8692112-catalog-content\") pod \"community-operators-h5tg2\" (UID: \"305f1532-8547-4e7a-a6f3-2a86f8692112\") " pod="openshift-marketplace/community-operators-h5tg2" Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.474365 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305f1532-8547-4e7a-a6f3-2a86f8692112-utilities\") pod \"community-operators-h5tg2\" (UID: \"305f1532-8547-4e7a-a6f3-2a86f8692112\") " pod="openshift-marketplace/community-operators-h5tg2" Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.497049 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8lws\" (UniqueName: \"kubernetes.io/projected/305f1532-8547-4e7a-a6f3-2a86f8692112-kube-api-access-c8lws\") pod \"community-operators-h5tg2\" (UID: \"305f1532-8547-4e7a-a6f3-2a86f8692112\") " pod="openshift-marketplace/community-operators-h5tg2" Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.651954 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5tg2" Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.831135 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:45 crc kubenswrapper[4790]: I0406 12:21:45.928141 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:46 crc kubenswrapper[4790]: I0406 12:21:46.207474 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5tg2"] Apr 06 12:21:46 crc kubenswrapper[4790]: I0406 12:21:46.344250 4790 generic.go:334] "Generic (PLEG): container finished" podID="5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" containerID="0729db78e9eb9eede9d8e89e0d3b3f9008e36c5a9e4e2cb5dc9cce29c8e28c26" exitCode=0 Apr 06 12:21:46 crc kubenswrapper[4790]: I0406 12:21:46.344324 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f587c5df-58dj2" event={"ID":"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6","Type":"ContainerDied","Data":"0729db78e9eb9eede9d8e89e0d3b3f9008e36c5a9e4e2cb5dc9cce29c8e28c26"} Apr 06 12:21:46 crc kubenswrapper[4790]: I0406 12:21:46.344353 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f587c5df-58dj2" event={"ID":"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6","Type":"ContainerStarted","Data":"c4d9fd32a4dcda9b291725318eee0caed5d53fc158463570f0b2102c2b33d41a"} Apr 06 12:21:46 crc kubenswrapper[4790]: I0406 12:21:46.354241 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5tg2" event={"ID":"305f1532-8547-4e7a-a6f3-2a86f8692112","Type":"ContainerStarted","Data":"3873ad880d3f0d0db7c3610f2f29f7666702fc043c29dea4fb6b9e2f371e4dd4"} Apr 06 12:21:47 crc kubenswrapper[4790]: I0406 12:21:47.099616 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:21:47 
crc kubenswrapper[4790]: I0406 12:21:47.365709 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f587c5df-58dj2" event={"ID":"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6","Type":"ContainerStarted","Data":"749399c9102c9ee5e729b14de23d3ef968b649c59a77e08645c4bd8dca18a1d0"} Apr 06 12:21:47 crc kubenswrapper[4790]: I0406 12:21:47.366185 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77f587c5df-58dj2" Apr 06 12:21:47 crc kubenswrapper[4790]: I0406 12:21:47.367918 4790 generic.go:334] "Generic (PLEG): container finished" podID="305f1532-8547-4e7a-a6f3-2a86f8692112" containerID="fef680459caf13803b145599e1530645c1b0694975c0a971192e7d0638361741" exitCode=0 Apr 06 12:21:47 crc kubenswrapper[4790]: I0406 12:21:47.368007 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5tg2" event={"ID":"305f1532-8547-4e7a-a6f3-2a86f8692112","Type":"ContainerDied","Data":"fef680459caf13803b145599e1530645c1b0694975c0a971192e7d0638361741"} Apr 06 12:21:47 crc kubenswrapper[4790]: I0406 12:21:47.368136 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="937bfac1-c6d1-4e86-948f-a7c0144899bb" containerName="nova-api-log" containerID="cri-o://6bec3ee22ed7abe2881cffebf2e206c83a30b2c40c957ee46b7d98a401726b51" gracePeriod=30 Apr 06 12:21:47 crc kubenswrapper[4790]: I0406 12:21:47.368150 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="937bfac1-c6d1-4e86-948f-a7c0144899bb" containerName="nova-api-api" containerID="cri-o://3a865c3f552715b3c6ecd93114c9cb35fc5f993066cc23ffb594f3903b528213" gracePeriod=30 Apr 06 12:21:47 crc kubenswrapper[4790]: I0406 12:21:47.397632 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77f587c5df-58dj2" podStartSLOduration=3.397608684 podStartE2EDuration="3.397608684s" 
podCreationTimestamp="2026-04-06 12:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:21:47.386149771 +0000 UTC m=+1486.373892627" watchObservedRunningTime="2026-04-06 12:21:47.397608684 +0000 UTC m=+1486.385351550" Apr 06 12:21:47 crc kubenswrapper[4790]: I0406 12:21:47.471247 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Apr 06 12:21:47 crc kubenswrapper[4790]: I0406 12:21:47.471563 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="ceilometer-central-agent" containerID="cri-o://e519bafbac98ab3b2895afbba50aaa4d16cbef12bdf8ebee67bbb112d6284e31" gracePeriod=30 Apr 06 12:21:47 crc kubenswrapper[4790]: I0406 12:21:47.471756 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="proxy-httpd" containerID="cri-o://a9db9525e577a2aca53c98b5863f36b80a8c2503c9d6f88651f749b6576321d9" gracePeriod=30 Apr 06 12:21:47 crc kubenswrapper[4790]: I0406 12:21:47.471816 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="sg-core" containerID="cri-o://2f07c7df2f95f9c584cd21d52c894b8f6d4d26967ab738e317caf87928d73d53" gracePeriod=30 Apr 06 12:21:47 crc kubenswrapper[4790]: I0406 12:21:47.471759 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="ceilometer-notification-agent" containerID="cri-o://551497cafbe0e8b00775da80849f681feee943e9542ab21b88c86ec8b0172d18" gracePeriod=30 Apr 06 12:21:47 crc kubenswrapper[4790]: I0406 12:21:47.498304 4790 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ceilometer-0" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.236:3000/\": EOF" Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.093470 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bmjmx"] Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.095207 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bmjmx" podUID="a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" containerName="registry-server" containerID="cri-o://eca3827520ec1135dfd0eb50c575661455582d1e2261b52ba8c132c879e42882" gracePeriod=2 Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.380051 4790 generic.go:334] "Generic (PLEG): container finished" podID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerID="a9db9525e577a2aca53c98b5863f36b80a8c2503c9d6f88651f749b6576321d9" exitCode=0 Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.380089 4790 generic.go:334] "Generic (PLEG): container finished" podID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerID="2f07c7df2f95f9c584cd21d52c894b8f6d4d26967ab738e317caf87928d73d53" exitCode=2 Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.380098 4790 generic.go:334] "Generic (PLEG): container finished" podID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerID="e519bafbac98ab3b2895afbba50aaa4d16cbef12bdf8ebee67bbb112d6284e31" exitCode=0 Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.380143 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50dc8a50-27b4-4160-af1a-f4b6507a0061","Type":"ContainerDied","Data":"a9db9525e577a2aca53c98b5863f36b80a8c2503c9d6f88651f749b6576321d9"} Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.380184 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"50dc8a50-27b4-4160-af1a-f4b6507a0061","Type":"ContainerDied","Data":"2f07c7df2f95f9c584cd21d52c894b8f6d4d26967ab738e317caf87928d73d53"} Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.380201 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50dc8a50-27b4-4160-af1a-f4b6507a0061","Type":"ContainerDied","Data":"e519bafbac98ab3b2895afbba50aaa4d16cbef12bdf8ebee67bbb112d6284e31"} Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.383171 4790 generic.go:334] "Generic (PLEG): container finished" podID="a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" containerID="eca3827520ec1135dfd0eb50c575661455582d1e2261b52ba8c132c879e42882" exitCode=0 Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.383225 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bmjmx" event={"ID":"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab","Type":"ContainerDied","Data":"eca3827520ec1135dfd0eb50c575661455582d1e2261b52ba8c132c879e42882"} Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.396981 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5tg2" event={"ID":"305f1532-8547-4e7a-a6f3-2a86f8692112","Type":"ContainerStarted","Data":"d33e0e734836b93b7903591715d96b80743c7848c7c6882a8e9f416386b9d332"} Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.404696 4790 generic.go:334] "Generic (PLEG): container finished" podID="937bfac1-c6d1-4e86-948f-a7c0144899bb" containerID="3a865c3f552715b3c6ecd93114c9cb35fc5f993066cc23ffb594f3903b528213" exitCode=0 Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.404718 4790 generic.go:334] "Generic (PLEG): container finished" podID="937bfac1-c6d1-4e86-948f-a7c0144899bb" containerID="6bec3ee22ed7abe2881cffebf2e206c83a30b2c40c957ee46b7d98a401726b51" exitCode=143 Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.405397 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"937bfac1-c6d1-4e86-948f-a7c0144899bb","Type":"ContainerDied","Data":"3a865c3f552715b3c6ecd93114c9cb35fc5f993066cc23ffb594f3903b528213"} Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.405420 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"937bfac1-c6d1-4e86-948f-a7c0144899bb","Type":"ContainerDied","Data":"6bec3ee22ed7abe2881cffebf2e206c83a30b2c40c957ee46b7d98a401726b51"} Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.672255 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.769628 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.848235 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75jzm\" (UniqueName: \"kubernetes.io/projected/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-kube-api-access-75jzm\") pod \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\" (UID: \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\") " Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.848353 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-catalog-content\") pod \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\" (UID: \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\") " Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.848463 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-utilities\") pod \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\" (UID: \"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab\") " Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.849622 4790 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-utilities" (OuterVolumeSpecName: "utilities") pod "a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" (UID: "a8db8bab-8ce9-4fdc-b237-f3e1126f92ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.853996 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-kube-api-access-75jzm" (OuterVolumeSpecName: "kube-api-access-75jzm") pod "a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" (UID: "a8db8bab-8ce9-4fdc-b237-f3e1126f92ab"). InnerVolumeSpecName "kube-api-access-75jzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.949722 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/937bfac1-c6d1-4e86-948f-a7c0144899bb-logs\") pod \"937bfac1-c6d1-4e86-948f-a7c0144899bb\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.949858 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/937bfac1-c6d1-4e86-948f-a7c0144899bb-config-data\") pod \"937bfac1-c6d1-4e86-948f-a7c0144899bb\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.949937 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/937bfac1-c6d1-4e86-948f-a7c0144899bb-combined-ca-bundle\") pod \"937bfac1-c6d1-4e86-948f-a7c0144899bb\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.949978 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-tr792\" (UniqueName: \"kubernetes.io/projected/937bfac1-c6d1-4e86-948f-a7c0144899bb-kube-api-access-tr792\") pod \"937bfac1-c6d1-4e86-948f-a7c0144899bb\" (UID: \"937bfac1-c6d1-4e86-948f-a7c0144899bb\") " Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.950091 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/937bfac1-c6d1-4e86-948f-a7c0144899bb-logs" (OuterVolumeSpecName: "logs") pod "937bfac1-c6d1-4e86-948f-a7c0144899bb" (UID: "937bfac1-c6d1-4e86-948f-a7c0144899bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.950561 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/937bfac1-c6d1-4e86-948f-a7c0144899bb-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.950585 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.950596 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75jzm\" (UniqueName: \"kubernetes.io/projected/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-kube-api-access-75jzm\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.966795 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/937bfac1-c6d1-4e86-948f-a7c0144899bb-kube-api-access-tr792" (OuterVolumeSpecName: "kube-api-access-tr792") pod "937bfac1-c6d1-4e86-948f-a7c0144899bb" (UID: "937bfac1-c6d1-4e86-948f-a7c0144899bb"). InnerVolumeSpecName "kube-api-access-tr792". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.969501 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" (UID: "a8db8bab-8ce9-4fdc-b237-f3e1126f92ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.972935 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.979882 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/937bfac1-c6d1-4e86-948f-a7c0144899bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "937bfac1-c6d1-4e86-948f-a7c0144899bb" (UID: "937bfac1-c6d1-4e86-948f-a7c0144899bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:48 crc kubenswrapper[4790]: I0406 12:21:48.983729 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/937bfac1-c6d1-4e86-948f-a7c0144899bb-config-data" (OuterVolumeSpecName: "config-data") pod "937bfac1-c6d1-4e86-948f-a7c0144899bb" (UID: "937bfac1-c6d1-4e86-948f-a7c0144899bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.053043 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/937bfac1-c6d1-4e86-948f-a7c0144899bb-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.053075 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/937bfac1-c6d1-4e86-948f-a7c0144899bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.053085 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr792\" (UniqueName: \"kubernetes.io/projected/937bfac1-c6d1-4e86-948f-a7c0144899bb-kube-api-access-tr792\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.053094 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.239668 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.357621 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50dc8a50-27b4-4160-af1a-f4b6507a0061-run-httpd\") pod \"50dc8a50-27b4-4160-af1a-f4b6507a0061\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.357725 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-config-data\") pod \"50dc8a50-27b4-4160-af1a-f4b6507a0061\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.357741 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-combined-ca-bundle\") pod \"50dc8a50-27b4-4160-af1a-f4b6507a0061\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.357778 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50dc8a50-27b4-4160-af1a-f4b6507a0061-log-httpd\") pod \"50dc8a50-27b4-4160-af1a-f4b6507a0061\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.357817 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-ceilometer-tls-certs\") pod \"50dc8a50-27b4-4160-af1a-f4b6507a0061\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.357906 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-sg-core-conf-yaml\") pod \"50dc8a50-27b4-4160-af1a-f4b6507a0061\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.358179 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm449\" (UniqueName: \"kubernetes.io/projected/50dc8a50-27b4-4160-af1a-f4b6507a0061-kube-api-access-pm449\") pod \"50dc8a50-27b4-4160-af1a-f4b6507a0061\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.358224 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-scripts\") pod \"50dc8a50-27b4-4160-af1a-f4b6507a0061\" (UID: \"50dc8a50-27b4-4160-af1a-f4b6507a0061\") " Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.359222 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50dc8a50-27b4-4160-af1a-f4b6507a0061-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "50dc8a50-27b4-4160-af1a-f4b6507a0061" (UID: "50dc8a50-27b4-4160-af1a-f4b6507a0061"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.359537 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50dc8a50-27b4-4160-af1a-f4b6507a0061-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "50dc8a50-27b4-4160-af1a-f4b6507a0061" (UID: "50dc8a50-27b4-4160-af1a-f4b6507a0061"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.364568 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50dc8a50-27b4-4160-af1a-f4b6507a0061-kube-api-access-pm449" (OuterVolumeSpecName: "kube-api-access-pm449") pod "50dc8a50-27b4-4160-af1a-f4b6507a0061" (UID: "50dc8a50-27b4-4160-af1a-f4b6507a0061"). InnerVolumeSpecName "kube-api-access-pm449". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.365140 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-scripts" (OuterVolumeSpecName: "scripts") pod "50dc8a50-27b4-4160-af1a-f4b6507a0061" (UID: "50dc8a50-27b4-4160-af1a-f4b6507a0061"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.388678 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "50dc8a50-27b4-4160-af1a-f4b6507a0061" (UID: "50dc8a50-27b4-4160-af1a-f4b6507a0061"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.420901 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.420907 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"937bfac1-c6d1-4e86-948f-a7c0144899bb","Type":"ContainerDied","Data":"f06ef39af4d8aaa3156bdf4401a43152b51f65f048c694fe6e5085c7563447ef"} Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.420970 4790 scope.go:117] "RemoveContainer" containerID="3a865c3f552715b3c6ecd93114c9cb35fc5f993066cc23ffb594f3903b528213" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.421821 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "50dc8a50-27b4-4160-af1a-f4b6507a0061" (UID: "50dc8a50-27b4-4160-af1a-f4b6507a0061"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.433706 4790 generic.go:334] "Generic (PLEG): container finished" podID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerID="551497cafbe0e8b00775da80849f681feee943e9542ab21b88c86ec8b0172d18" exitCode=0 Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.433928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50dc8a50-27b4-4160-af1a-f4b6507a0061","Type":"ContainerDied","Data":"551497cafbe0e8b00775da80849f681feee943e9542ab21b88c86ec8b0172d18"} Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.434201 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50dc8a50-27b4-4160-af1a-f4b6507a0061","Type":"ContainerDied","Data":"2fb0089ddaf02d837a8db34e101027b8699d171ba4b172dcd1253a80a8abd1f1"} Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.433979 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.449940 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bmjmx" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.450489 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bmjmx" event={"ID":"a8db8bab-8ce9-4fdc-b237-f3e1126f92ab","Type":"ContainerDied","Data":"3ed88c19150e7a97b40fbb9492af2553d447bfa5f1b342542d7c219d1fca97d5"} Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.463223 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm449\" (UniqueName: \"kubernetes.io/projected/50dc8a50-27b4-4160-af1a-f4b6507a0061-kube-api-access-pm449\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.463259 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.463270 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50dc8a50-27b4-4160-af1a-f4b6507a0061-run-httpd\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.463279 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50dc8a50-27b4-4160-af1a-f4b6507a0061-log-httpd\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.463289 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.463297 4790 reconciler_common.go:293] "Volume 
detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.463595 4790 scope.go:117] "RemoveContainer" containerID="6bec3ee22ed7abe2881cffebf2e206c83a30b2c40c957ee46b7d98a401726b51" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.463756 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50dc8a50-27b4-4160-af1a-f4b6507a0061" (UID: "50dc8a50-27b4-4160-af1a-f4b6507a0061"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.470806 4790 generic.go:334] "Generic (PLEG): container finished" podID="305f1532-8547-4e7a-a6f3-2a86f8692112" containerID="d33e0e734836b93b7903591715d96b80743c7848c7c6882a8e9f416386b9d332" exitCode=0 Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.470866 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5tg2" event={"ID":"305f1532-8547-4e7a-a6f3-2a86f8692112","Type":"ContainerDied","Data":"d33e0e734836b93b7903591715d96b80743c7848c7c6882a8e9f416386b9d332"} Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.476552 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.492280 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.505605 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Apr 06 12:21:49 crc kubenswrapper[4790]: E0406 12:21:49.506019 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" 
containerName="extract-utilities" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506032 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" containerName="extract-utilities" Apr 06 12:21:49 crc kubenswrapper[4790]: E0406 12:21:49.506047 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="proxy-httpd" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506054 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="proxy-httpd" Apr 06 12:21:49 crc kubenswrapper[4790]: E0406 12:21:49.506071 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="sg-core" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506076 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="sg-core" Apr 06 12:21:49 crc kubenswrapper[4790]: E0406 12:21:49.506091 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="937bfac1-c6d1-4e86-948f-a7c0144899bb" containerName="nova-api-log" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506096 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="937bfac1-c6d1-4e86-948f-a7c0144899bb" containerName="nova-api-log" Apr 06 12:21:49 crc kubenswrapper[4790]: E0406 12:21:49.506108 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="937bfac1-c6d1-4e86-948f-a7c0144899bb" containerName="nova-api-api" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506113 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="937bfac1-c6d1-4e86-948f-a7c0144899bb" containerName="nova-api-api" Apr 06 12:21:49 crc kubenswrapper[4790]: E0406 12:21:49.506121 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" containerName="extract-content" Apr 06 
12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506126 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" containerName="extract-content" Apr 06 12:21:49 crc kubenswrapper[4790]: E0406 12:21:49.506140 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="ceilometer-central-agent" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506146 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="ceilometer-central-agent" Apr 06 12:21:49 crc kubenswrapper[4790]: E0406 12:21:49.506158 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" containerName="registry-server" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506165 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" containerName="registry-server" Apr 06 12:21:49 crc kubenswrapper[4790]: E0406 12:21:49.506181 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="ceilometer-notification-agent" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506187 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="ceilometer-notification-agent" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506346 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" containerName="registry-server" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506354 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="937bfac1-c6d1-4e86-948f-a7c0144899bb" containerName="nova-api-log" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506367 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" 
containerName="sg-core" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506375 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="ceilometer-central-agent" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506392 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="proxy-httpd" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506400 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" containerName="ceilometer-notification-agent" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.506415 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="937bfac1-c6d1-4e86-948f-a7c0144899bb" containerName="nova-api-api" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.507621 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.517944 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bmjmx"] Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.528714 4790 scope.go:117] "RemoveContainer" containerID="a9db9525e577a2aca53c98b5863f36b80a8c2503c9d6f88651f749b6576321d9" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.530069 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.530204 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.530353 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.533297 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-bmjmx"] Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.535145 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-config-data" (OuterVolumeSpecName: "config-data") pod "50dc8a50-27b4-4160-af1a-f4b6507a0061" (UID: "50dc8a50-27b4-4160-af1a-f4b6507a0061"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.551190 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.565796 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.566045 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50dc8a50-27b4-4160-af1a-f4b6507a0061-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.651937 4790 scope.go:117] "RemoveContainer" containerID="2f07c7df2f95f9c584cd21d52c894b8f6d4d26967ab738e317caf87928d73d53" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.668034 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-public-tls-certs\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.668170 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.668279 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-config-data\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.668415 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr5sf\" (UniqueName: \"kubernetes.io/projected/ecf07ee5-7448-4df6-9329-e2fc9b7989de-kube-api-access-sr5sf\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.668449 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.668469 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf07ee5-7448-4df6-9329-e2fc9b7989de-logs\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.673742 4790 scope.go:117] "RemoveContainer" containerID="551497cafbe0e8b00775da80849f681feee943e9542ab21b88c86ec8b0172d18" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.685574 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="937bfac1-c6d1-4e86-948f-a7c0144899bb" path="/var/lib/kubelet/pods/937bfac1-c6d1-4e86-948f-a7c0144899bb/volumes" 
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.686173 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8db8bab-8ce9-4fdc-b237-f3e1126f92ab" path="/var/lib/kubelet/pods/a8db8bab-8ce9-4fdc-b237-f3e1126f92ab/volumes" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.692736 4790 scope.go:117] "RemoveContainer" containerID="e519bafbac98ab3b2895afbba50aaa4d16cbef12bdf8ebee67bbb112d6284e31" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.715498 4790 scope.go:117] "RemoveContainer" containerID="a9db9525e577a2aca53c98b5863f36b80a8c2503c9d6f88651f749b6576321d9" Apr 06 12:21:49 crc kubenswrapper[4790]: E0406 12:21:49.715883 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9db9525e577a2aca53c98b5863f36b80a8c2503c9d6f88651f749b6576321d9\": container with ID starting with a9db9525e577a2aca53c98b5863f36b80a8c2503c9d6f88651f749b6576321d9 not found: ID does not exist" containerID="a9db9525e577a2aca53c98b5863f36b80a8c2503c9d6f88651f749b6576321d9" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.715921 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9db9525e577a2aca53c98b5863f36b80a8c2503c9d6f88651f749b6576321d9"} err="failed to get container status \"a9db9525e577a2aca53c98b5863f36b80a8c2503c9d6f88651f749b6576321d9\": rpc error: code = NotFound desc = could not find container \"a9db9525e577a2aca53c98b5863f36b80a8c2503c9d6f88651f749b6576321d9\": container with ID starting with a9db9525e577a2aca53c98b5863f36b80a8c2503c9d6f88651f749b6576321d9 not found: ID does not exist" Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.715943 4790 scope.go:117] "RemoveContainer" containerID="2f07c7df2f95f9c584cd21d52c894b8f6d4d26967ab738e317caf87928d73d53" Apr 06 12:21:49 crc kubenswrapper[4790]: E0406 12:21:49.716282 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"2f07c7df2f95f9c584cd21d52c894b8f6d4d26967ab738e317caf87928d73d53\": container with ID starting with 2f07c7df2f95f9c584cd21d52c894b8f6d4d26967ab738e317caf87928d73d53 not found: ID does not exist" containerID="2f07c7df2f95f9c584cd21d52c894b8f6d4d26967ab738e317caf87928d73d53"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.716319 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f07c7df2f95f9c584cd21d52c894b8f6d4d26967ab738e317caf87928d73d53"} err="failed to get container status \"2f07c7df2f95f9c584cd21d52c894b8f6d4d26967ab738e317caf87928d73d53\": rpc error: code = NotFound desc = could not find container \"2f07c7df2f95f9c584cd21d52c894b8f6d4d26967ab738e317caf87928d73d53\": container with ID starting with 2f07c7df2f95f9c584cd21d52c894b8f6d4d26967ab738e317caf87928d73d53 not found: ID does not exist"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.716332 4790 scope.go:117] "RemoveContainer" containerID="551497cafbe0e8b00775da80849f681feee943e9542ab21b88c86ec8b0172d18"
Apr 06 12:21:49 crc kubenswrapper[4790]: E0406 12:21:49.716760 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"551497cafbe0e8b00775da80849f681feee943e9542ab21b88c86ec8b0172d18\": container with ID starting with 551497cafbe0e8b00775da80849f681feee943e9542ab21b88c86ec8b0172d18 not found: ID does not exist" containerID="551497cafbe0e8b00775da80849f681feee943e9542ab21b88c86ec8b0172d18"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.716805 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551497cafbe0e8b00775da80849f681feee943e9542ab21b88c86ec8b0172d18"} err="failed to get container status \"551497cafbe0e8b00775da80849f681feee943e9542ab21b88c86ec8b0172d18\": rpc error: code = NotFound desc = could not find container \"551497cafbe0e8b00775da80849f681feee943e9542ab21b88c86ec8b0172d18\": container with ID starting with 551497cafbe0e8b00775da80849f681feee943e9542ab21b88c86ec8b0172d18 not found: ID does not exist"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.716837 4790 scope.go:117] "RemoveContainer" containerID="e519bafbac98ab3b2895afbba50aaa4d16cbef12bdf8ebee67bbb112d6284e31"
Apr 06 12:21:49 crc kubenswrapper[4790]: E0406 12:21:49.717158 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e519bafbac98ab3b2895afbba50aaa4d16cbef12bdf8ebee67bbb112d6284e31\": container with ID starting with e519bafbac98ab3b2895afbba50aaa4d16cbef12bdf8ebee67bbb112d6284e31 not found: ID does not exist" containerID="e519bafbac98ab3b2895afbba50aaa4d16cbef12bdf8ebee67bbb112d6284e31"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.717190 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e519bafbac98ab3b2895afbba50aaa4d16cbef12bdf8ebee67bbb112d6284e31"} err="failed to get container status \"e519bafbac98ab3b2895afbba50aaa4d16cbef12bdf8ebee67bbb112d6284e31\": rpc error: code = NotFound desc = could not find container \"e519bafbac98ab3b2895afbba50aaa4d16cbef12bdf8ebee67bbb112d6284e31\": container with ID starting with e519bafbac98ab3b2895afbba50aaa4d16cbef12bdf8ebee67bbb112d6284e31 not found: ID does not exist"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.717205 4790 scope.go:117] "RemoveContainer" containerID="eca3827520ec1135dfd0eb50c575661455582d1e2261b52ba8c132c879e42882"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.736369 4790 scope.go:117] "RemoveContainer" containerID="34cc097ca09840c06488489c5dfaf594d5345d66fa2001a458be42bc066b3df1"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.763005 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.769916 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-public-tls-certs\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.770028 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.770064 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-config-data\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.770118 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr5sf\" (UniqueName: \"kubernetes.io/projected/ecf07ee5-7448-4df6-9329-e2fc9b7989de-kube-api-access-sr5sf\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.770145 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.770163 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf07ee5-7448-4df6-9329-e2fc9b7989de-logs\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.771516 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf07ee5-7448-4df6-9329-e2fc9b7989de-logs\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.774111 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-public-tls-certs\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.775020 4790 scope.go:117] "RemoveContainer" containerID="530ec5eb7584131d02f075c19ad9573c7111211f21e74417bcf8d37e16de0f75"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.775252 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.775608 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.775930 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-config-data\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.782386 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.798903 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr5sf\" (UniqueName: \"kubernetes.io/projected/ecf07ee5-7448-4df6-9329-e2fc9b7989de-kube-api-access-sr5sf\") pod \"nova-api-0\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " pod="openstack/nova-api-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.805656 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.808068 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.812245 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.812438 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.812554 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.822600 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.951731 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.975272 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.975351 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6da6352b-82da-47b7-afe2-44baa8d546d3-log-httpd\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.975399 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.975435 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6da6352b-82da-47b7-afe2-44baa8d546d3-run-httpd\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.975822 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-config-data\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.976007 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.976077 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsfl7\" (UniqueName: \"kubernetes.io/projected/6da6352b-82da-47b7-afe2-44baa8d546d3-kube-api-access-xsfl7\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:49 crc kubenswrapper[4790]: I0406 12:21:49.976224 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-scripts\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.079176 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-config-data\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.079519 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.079548 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsfl7\" (UniqueName: \"kubernetes.io/projected/6da6352b-82da-47b7-afe2-44baa8d546d3-kube-api-access-xsfl7\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.079587 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-scripts\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.079629 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.079646 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6da6352b-82da-47b7-afe2-44baa8d546d3-log-httpd\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.079674 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.079702 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6da6352b-82da-47b7-afe2-44baa8d546d3-run-httpd\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.080186 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6da6352b-82da-47b7-afe2-44baa8d546d3-run-httpd\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.080681 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6da6352b-82da-47b7-afe2-44baa8d546d3-log-httpd\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.092285 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.093232 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-config-data\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.093249 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.098452 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsfl7\" (UniqueName: \"kubernetes.io/projected/6da6352b-82da-47b7-afe2-44baa8d546d3-kube-api-access-xsfl7\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.098830 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-scripts\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.100998 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da6352b-82da-47b7-afe2-44baa8d546d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6da6352b-82da-47b7-afe2-44baa8d546d3\") " pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.213949 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.417667 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.494982 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5tg2" event={"ID":"305f1532-8547-4e7a-a6f3-2a86f8692112","Type":"ContainerStarted","Data":"c0046ede47952b323e9a036805bd0f127ec134f2c825c4df5d4bfda827c5fc7f"}
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.508758 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecf07ee5-7448-4df6-9329-e2fc9b7989de","Type":"ContainerStarted","Data":"bdbed44fa21784456c1ba6d19d07f58421f7c6ee9fa5f6f24f3cd65edadd201d"}
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.516963 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h5tg2" podStartSLOduration=2.949574706 podStartE2EDuration="5.516944757s" podCreationTimestamp="2026-04-06 12:21:45 +0000 UTC" firstStartedPulling="2026-04-06 12:21:47.373015039 +0000 UTC m=+1486.360757905" lastFinishedPulling="2026-04-06 12:21:49.94038509 +0000 UTC m=+1488.928127956" observedRunningTime="2026-04-06 12:21:50.51113426 +0000 UTC m=+1489.498877126" watchObservedRunningTime="2026-04-06 12:21:50.516944757 +0000 UTC m=+1489.504687643"
Apr 06 12:21:50 crc kubenswrapper[4790]: W0406 12:21:50.749461 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6da6352b_82da_47b7_afe2_44baa8d546d3.slice/crio-f7f7c1b2f9aab173d045857947ae7568d8cc8f33d3561f4591acba8657b17d63 WatchSource:0}: Error finding container f7f7c1b2f9aab173d045857947ae7568d8cc8f33d3561f4591acba8657b17d63: Status 404 returned error can't find the container with id f7f7c1b2f9aab173d045857947ae7568d8cc8f33d3561f4591acba8657b17d63
Apr 06 12:21:50 crc kubenswrapper[4790]: I0406 12:21:50.750983 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Apr 06 12:21:51 crc kubenswrapper[4790]: I0406 12:21:51.522925 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecf07ee5-7448-4df6-9329-e2fc9b7989de","Type":"ContainerStarted","Data":"02977d49416571762ad7a9f2c23b339a9e12ea2900ba25fc6a7336fc6f26058c"}
Apr 06 12:21:51 crc kubenswrapper[4790]: I0406 12:21:51.523239 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecf07ee5-7448-4df6-9329-e2fc9b7989de","Type":"ContainerStarted","Data":"bccc11ce4afe9e5f9f616b9ff217db5fd729e56431c992e7f68dfbeb14995cf1"}
Apr 06 12:21:51 crc kubenswrapper[4790]: I0406 12:21:51.527154 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6da6352b-82da-47b7-afe2-44baa8d546d3","Type":"ContainerStarted","Data":"3cced9f6b66afcbfe70d5a56b3888588deff871adcf2b405cf48bd07be0ea1e3"}
Apr 06 12:21:51 crc kubenswrapper[4790]: I0406 12:21:51.527197 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6da6352b-82da-47b7-afe2-44baa8d546d3","Type":"ContainerStarted","Data":"7fb354e5c832acca84dbd82b1aee4e767380f3dc41428c3be7d7ac2a164bbd8c"}
Apr 06 12:21:51 crc kubenswrapper[4790]: I0406 12:21:51.527210 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6da6352b-82da-47b7-afe2-44baa8d546d3","Type":"ContainerStarted","Data":"f7f7c1b2f9aab173d045857947ae7568d8cc8f33d3561f4591acba8657b17d63"}
Apr 06 12:21:51 crc kubenswrapper[4790]: I0406 12:21:51.556538 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.556522697 podStartE2EDuration="2.556522697s" podCreationTimestamp="2026-04-06 12:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:21:51.540612871 +0000 UTC m=+1490.528355747" watchObservedRunningTime="2026-04-06 12:21:51.556522697 +0000 UTC m=+1490.544265563"
Apr 06 12:21:51 crc kubenswrapper[4790]: I0406 12:21:51.690655 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50dc8a50-27b4-4160-af1a-f4b6507a0061" path="/var/lib/kubelet/pods/50dc8a50-27b4-4160-af1a-f4b6507a0061/volumes"
Apr 06 12:21:52 crc kubenswrapper[4790]: I0406 12:21:52.501136 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7rgpj"]
Apr 06 12:21:52 crc kubenswrapper[4790]: I0406 12:21:52.503528 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7rgpj"
Apr 06 12:21:52 crc kubenswrapper[4790]: I0406 12:21:52.517536 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7rgpj"]
Apr 06 12:21:52 crc kubenswrapper[4790]: I0406 12:21:52.553324 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6da6352b-82da-47b7-afe2-44baa8d546d3","Type":"ContainerStarted","Data":"f003052989179592f05b63e7405e817551cffc7d7eeae7410f3e19eba648e2a8"}
Apr 06 12:21:52 crc kubenswrapper[4790]: I0406 12:21:52.633458 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-utilities\") pod \"certified-operators-7rgpj\" (UID: \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\") " pod="openshift-marketplace/certified-operators-7rgpj"
Apr 06 12:21:52 crc kubenswrapper[4790]: I0406 12:21:52.633533 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rbdg\" (UniqueName: \"kubernetes.io/projected/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-kube-api-access-2rbdg\") pod \"certified-operators-7rgpj\" (UID: \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\") " pod="openshift-marketplace/certified-operators-7rgpj"
Apr 06 12:21:52 crc kubenswrapper[4790]: I0406 12:21:52.633560 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-catalog-content\") pod \"certified-operators-7rgpj\" (UID: \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\") " pod="openshift-marketplace/certified-operators-7rgpj"
Apr 06 12:21:52 crc kubenswrapper[4790]: I0406 12:21:52.735477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-utilities\") pod \"certified-operators-7rgpj\" (UID: \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\") " pod="openshift-marketplace/certified-operators-7rgpj"
Apr 06 12:21:52 crc kubenswrapper[4790]: I0406 12:21:52.735581 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rbdg\" (UniqueName: \"kubernetes.io/projected/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-kube-api-access-2rbdg\") pod \"certified-operators-7rgpj\" (UID: \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\") " pod="openshift-marketplace/certified-operators-7rgpj"
Apr 06 12:21:52 crc kubenswrapper[4790]: I0406 12:21:52.735615 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-catalog-content\") pod \"certified-operators-7rgpj\" (UID: \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\") " pod="openshift-marketplace/certified-operators-7rgpj"
Apr 06 12:21:52 crc kubenswrapper[4790]: I0406 12:21:52.736136 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-catalog-content\") pod \"certified-operators-7rgpj\" (UID: \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\") " pod="openshift-marketplace/certified-operators-7rgpj"
Apr 06 12:21:52 crc kubenswrapper[4790]: I0406 12:21:52.736355 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-utilities\") pod \"certified-operators-7rgpj\" (UID: \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\") " pod="openshift-marketplace/certified-operators-7rgpj"
Apr 06 12:21:52 crc kubenswrapper[4790]: I0406 12:21:52.756519 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rbdg\" (UniqueName: \"kubernetes.io/projected/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-kube-api-access-2rbdg\") pod \"certified-operators-7rgpj\" (UID: \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\") " pod="openshift-marketplace/certified-operators-7rgpj"
Apr 06 12:21:52 crc kubenswrapper[4790]: I0406 12:21:52.840347 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7rgpj"
Apr 06 12:21:53 crc kubenswrapper[4790]: I0406 12:21:53.283323 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7rgpj"]
Apr 06 12:21:53 crc kubenswrapper[4790]: W0406 12:21:53.285079 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32885ce5_2f9b_4e9b_9850_8a03bfc26c41.slice/crio-1f4c6e5c5bfc3be289aaa5b2489131e804cfa8a8f260acb0b9742118b486d6e7 WatchSource:0}: Error finding container 1f4c6e5c5bfc3be289aaa5b2489131e804cfa8a8f260acb0b9742118b486d6e7: Status 404 returned error can't find the container with id 1f4c6e5c5bfc3be289aaa5b2489131e804cfa8a8f260acb0b9742118b486d6e7
Apr 06 12:21:53 crc kubenswrapper[4790]: I0406 12:21:53.566070 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rgpj" event={"ID":"32885ce5-2f9b-4e9b-9850-8a03bfc26c41","Type":"ContainerStarted","Data":"26d2543f1dfb752908476bda3ecf737d802f0b222e7bd6e8297a0c7ac76dc724"}
Apr 06 12:21:53 crc kubenswrapper[4790]: I0406 12:21:53.566112 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rgpj" event={"ID":"32885ce5-2f9b-4e9b-9850-8a03bfc26c41","Type":"ContainerStarted","Data":"1f4c6e5c5bfc3be289aaa5b2489131e804cfa8a8f260acb0b9742118b486d6e7"}
Apr 06 12:21:53 crc kubenswrapper[4790]: I0406 12:21:53.972734 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.014920 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.577916 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6da6352b-82da-47b7-afe2-44baa8d546d3","Type":"ContainerStarted","Data":"e8d7a89a8303bb73662f5f9220962685a7263553d8c64a81530e30244e1efedf"}
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.578502 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.580772 4790 generic.go:334] "Generic (PLEG): container finished" podID="32885ce5-2f9b-4e9b-9850-8a03bfc26c41" containerID="26d2543f1dfb752908476bda3ecf737d802f0b222e7bd6e8297a0c7ac76dc724" exitCode=0
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.581196 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rgpj" event={"ID":"32885ce5-2f9b-4e9b-9850-8a03bfc26c41","Type":"ContainerDied","Data":"26d2543f1dfb752908476bda3ecf737d802f0b222e7bd6e8297a0c7ac76dc724"}
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.605509 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.611400 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.435218987 podStartE2EDuration="5.611383556s" podCreationTimestamp="2026-04-06 12:21:49 +0000 UTC" firstStartedPulling="2026-04-06 12:21:50.751645329 +0000 UTC m=+1489.739388195" lastFinishedPulling="2026-04-06 12:21:53.927809898 +0000 UTC m=+1492.915552764" observedRunningTime="2026-04-06 12:21:54.599281979 +0000 UTC m=+1493.587024845" watchObservedRunningTime="2026-04-06 12:21:54.611383556 +0000 UTC m=+1493.599126422"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.785684 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-lsjv7"]
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.787581 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lsjv7"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.791003 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.791013 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.798766 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-lsjv7"]
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.812102 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwk97\" (UniqueName: \"kubernetes.io/projected/f08c5050-9091-47f9-8135-daee3777de99-kube-api-access-pwk97\") pod \"nova-cell1-cell-mapping-lsjv7\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " pod="openstack/nova-cell1-cell-mapping-lsjv7"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.812175 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-config-data\") pod \"nova-cell1-cell-mapping-lsjv7\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " pod="openstack/nova-cell1-cell-mapping-lsjv7"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.812485 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lsjv7\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " pod="openstack/nova-cell1-cell-mapping-lsjv7"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.812666 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-scripts\") pod \"nova-cell1-cell-mapping-lsjv7\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " pod="openstack/nova-cell1-cell-mapping-lsjv7"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.914363 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lsjv7\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " pod="openstack/nova-cell1-cell-mapping-lsjv7"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.914445 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-scripts\") pod \"nova-cell1-cell-mapping-lsjv7\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " pod="openstack/nova-cell1-cell-mapping-lsjv7"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.914511 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwk97\" (UniqueName: \"kubernetes.io/projected/f08c5050-9091-47f9-8135-daee3777de99-kube-api-access-pwk97\") pod \"nova-cell1-cell-mapping-lsjv7\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " pod="openstack/nova-cell1-cell-mapping-lsjv7"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.914579 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-config-data\") pod \"nova-cell1-cell-mapping-lsjv7\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " pod="openstack/nova-cell1-cell-mapping-lsjv7"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.920342 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-config-data\") pod \"nova-cell1-cell-mapping-lsjv7\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " pod="openstack/nova-cell1-cell-mapping-lsjv7"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.920444 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lsjv7\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " pod="openstack/nova-cell1-cell-mapping-lsjv7"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.933014 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwk97\" (UniqueName: \"kubernetes.io/projected/f08c5050-9091-47f9-8135-daee3777de99-kube-api-access-pwk97\") pod \"nova-cell1-cell-mapping-lsjv7\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " pod="openstack/nova-cell1-cell-mapping-lsjv7"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.934402 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-scripts\") pod \"nova-cell1-cell-mapping-lsjv7\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " pod="openstack/nova-cell1-cell-mapping-lsjv7"
Apr 06 12:21:54 crc kubenswrapper[4790]: I0406 12:21:54.945169 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77f587c5df-58dj2"
Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.022649 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b5ff8cc-fpp68"]
Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.022913 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" podUID="5ec42c47-2162-48fc-abfc-03537b0f2e1a" containerName="dnsmasq-dns" containerID="cri-o://147f43dc636c6b37d86785519d3fcf698b9667dc90f96241a2733289c2e4b9c2" gracePeriod=10
Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.113362 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lsjv7"
Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.601780 4790 generic.go:334] "Generic (PLEG): container finished" podID="5ec42c47-2162-48fc-abfc-03537b0f2e1a" containerID="147f43dc636c6b37d86785519d3fcf698b9667dc90f96241a2733289c2e4b9c2" exitCode=0
Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.602196 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" event={"ID":"5ec42c47-2162-48fc-abfc-03537b0f2e1a","Type":"ContainerDied","Data":"147f43dc636c6b37d86785519d3fcf698b9667dc90f96241a2733289c2e4b9c2"}
Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.653153 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h5tg2"
Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.653479 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h5tg2"
Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.701820 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68"
Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.717415 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h5tg2"
Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.838907 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-dns-swift-storage-0\") pod \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") "
Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.838947 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-dns-svc\") pod \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") "
Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.839000 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfr2s\" (UniqueName: \"kubernetes.io/projected/5ec42c47-2162-48fc-abfc-03537b0f2e1a-kube-api-access-vfr2s\") pod \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") "
Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.839081 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-ovsdbserver-sb\") pod \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") "
Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.839133 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-config\") pod \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\" (UID:
\"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.839244 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-ovsdbserver-nb\") pod \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\" (UID: \"5ec42c47-2162-48fc-abfc-03537b0f2e1a\") " Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.853015 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec42c47-2162-48fc-abfc-03537b0f2e1a-kube-api-access-vfr2s" (OuterVolumeSpecName: "kube-api-access-vfr2s") pod "5ec42c47-2162-48fc-abfc-03537b0f2e1a" (UID: "5ec42c47-2162-48fc-abfc-03537b0f2e1a"). InnerVolumeSpecName "kube-api-access-vfr2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.908587 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ec42c47-2162-48fc-abfc-03537b0f2e1a" (UID: "5ec42c47-2162-48fc-abfc-03537b0f2e1a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.916542 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-lsjv7"] Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.919511 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-config" (OuterVolumeSpecName: "config") pod "5ec42c47-2162-48fc-abfc-03537b0f2e1a" (UID: "5ec42c47-2162-48fc-abfc-03537b0f2e1a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.934186 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5ec42c47-2162-48fc-abfc-03537b0f2e1a" (UID: "5ec42c47-2162-48fc-abfc-03537b0f2e1a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.941745 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5ec42c47-2162-48fc-abfc-03537b0f2e1a" (UID: "5ec42c47-2162-48fc-abfc-03537b0f2e1a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.942027 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfr2s\" (UniqueName: \"kubernetes.io/projected/5ec42c47-2162-48fc-abfc-03537b0f2e1a-kube-api-access-vfr2s\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.942047 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.942058 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.942068 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.942077 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:55 crc kubenswrapper[4790]: I0406 12:21:55.942561 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5ec42c47-2162-48fc-abfc-03537b0f2e1a" (UID: "5ec42c47-2162-48fc-abfc-03537b0f2e1a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:21:55 crc kubenswrapper[4790]: W0406 12:21:55.953438 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf08c5050_9091_47f9_8135_daee3777de99.slice/crio-3e8efb1962ac5b50dc4beaf880f90d11aa9db4e85ced0aea04024e03b1bf28eb WatchSource:0}: Error finding container 3e8efb1962ac5b50dc4beaf880f90d11aa9db4e85ced0aea04024e03b1bf28eb: Status 404 returned error can't find the container with id 3e8efb1962ac5b50dc4beaf880f90d11aa9db4e85ced0aea04024e03b1bf28eb Apr 06 12:21:56 crc kubenswrapper[4790]: I0406 12:21:56.043845 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ec42c47-2162-48fc-abfc-03537b0f2e1a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:21:56 crc kubenswrapper[4790]: I0406 12:21:56.632646 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" event={"ID":"5ec42c47-2162-48fc-abfc-03537b0f2e1a","Type":"ContainerDied","Data":"e81eddd4ec8db71f3e580d87cade25660e08eec7ef6796d245d5fde2f942cb51"} Apr 06 12:21:56 crc kubenswrapper[4790]: I0406 12:21:56.632691 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b5ff8cc-fpp68" Apr 06 12:21:56 crc kubenswrapper[4790]: I0406 12:21:56.632702 4790 scope.go:117] "RemoveContainer" containerID="147f43dc636c6b37d86785519d3fcf698b9667dc90f96241a2733289c2e4b9c2" Apr 06 12:21:56 crc kubenswrapper[4790]: I0406 12:21:56.660699 4790 generic.go:334] "Generic (PLEG): container finished" podID="32885ce5-2f9b-4e9b-9850-8a03bfc26c41" containerID="906ca36f527475384d0309b52ae3c99e41c7d0daa4c698bc74453c48eee4ecb9" exitCode=0 Apr 06 12:21:56 crc kubenswrapper[4790]: I0406 12:21:56.661235 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rgpj" event={"ID":"32885ce5-2f9b-4e9b-9850-8a03bfc26c41","Type":"ContainerDied","Data":"906ca36f527475384d0309b52ae3c99e41c7d0daa4c698bc74453c48eee4ecb9"} Apr 06 12:21:56 crc kubenswrapper[4790]: I0406 12:21:56.672186 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lsjv7" event={"ID":"f08c5050-9091-47f9-8135-daee3777de99","Type":"ContainerStarted","Data":"8bb6d004bb3e2a1d714b20f740824914099e2ff479a2ae3efb76a449c125e02a"} Apr 06 12:21:56 crc kubenswrapper[4790]: I0406 12:21:56.672273 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lsjv7" event={"ID":"f08c5050-9091-47f9-8135-daee3777de99","Type":"ContainerStarted","Data":"3e8efb1962ac5b50dc4beaf880f90d11aa9db4e85ced0aea04024e03b1bf28eb"} Apr 06 12:21:56 crc kubenswrapper[4790]: I0406 12:21:56.679194 4790 scope.go:117] "RemoveContainer" containerID="5205ca765bf34eb54da7c711160ed115c542923289970ad2157eb7217a693170" Apr 06 12:21:56 crc kubenswrapper[4790]: I0406 12:21:56.727417 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-lsjv7" podStartSLOduration=2.727397935 podStartE2EDuration="2.727397935s" podCreationTimestamp="2026-04-06 12:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:21:56.723296637 +0000 UTC m=+1495.711039503" watchObservedRunningTime="2026-04-06 12:21:56.727397935 +0000 UTC m=+1495.715140801" Apr 06 12:21:56 crc kubenswrapper[4790]: I0406 12:21:56.776812 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h5tg2" Apr 06 12:21:56 crc kubenswrapper[4790]: I0406 12:21:56.827932 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b5ff8cc-fpp68"] Apr 06 12:21:56 crc kubenswrapper[4790]: I0406 12:21:56.840411 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b5ff8cc-fpp68"] Apr 06 12:21:57 crc kubenswrapper[4790]: I0406 12:21:57.698880 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec42c47-2162-48fc-abfc-03537b0f2e1a" path="/var/lib/kubelet/pods/5ec42c47-2162-48fc-abfc-03537b0f2e1a/volumes" Apr 06 12:21:57 crc kubenswrapper[4790]: I0406 12:21:57.700149 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rgpj" event={"ID":"32885ce5-2f9b-4e9b-9850-8a03bfc26c41","Type":"ContainerStarted","Data":"be8234ad036c125ec2bd3658dc0633002a9ea9c74bde639903dfc7437102c0c9"} Apr 06 12:21:58 crc kubenswrapper[4790]: I0406 12:21:58.720141 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7rgpj" podStartSLOduration=4.113892297 podStartE2EDuration="6.720112326s" podCreationTimestamp="2026-04-06 12:21:52 +0000 UTC" firstStartedPulling="2026-04-06 12:21:54.582475167 +0000 UTC m=+1493.570218033" lastFinishedPulling="2026-04-06 12:21:57.188695196 +0000 UTC m=+1496.176438062" observedRunningTime="2026-04-06 12:21:57.726766471 +0000 UTC m=+1496.714509347" watchObservedRunningTime="2026-04-06 12:21:58.720112326 +0000 UTC m=+1497.707855232" Apr 06 12:21:58 crc kubenswrapper[4790]: I0406 
12:21:58.725361 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5tg2"] Apr 06 12:21:59 crc kubenswrapper[4790]: I0406 12:21:59.719633 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h5tg2" podUID="305f1532-8547-4e7a-a6f3-2a86f8692112" containerName="registry-server" containerID="cri-o://c0046ede47952b323e9a036805bd0f127ec134f2c825c4df5d4bfda827c5fc7f" gracePeriod=2 Apr 06 12:21:59 crc kubenswrapper[4790]: I0406 12:21:59.952603 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Apr 06 12:21:59 crc kubenswrapper[4790]: I0406 12:21:59.952985 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.145706 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591302-pkz6k"] Apr 06 12:22:00 crc kubenswrapper[4790]: E0406 12:22:00.146131 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec42c47-2162-48fc-abfc-03537b0f2e1a" containerName="init" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.146142 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec42c47-2162-48fc-abfc-03537b0f2e1a" containerName="init" Apr 06 12:22:00 crc kubenswrapper[4790]: E0406 12:22:00.146151 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec42c47-2162-48fc-abfc-03537b0f2e1a" containerName="dnsmasq-dns" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.146157 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec42c47-2162-48fc-abfc-03537b0f2e1a" containerName="dnsmasq-dns" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.146338 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec42c47-2162-48fc-abfc-03537b0f2e1a" containerName="dnsmasq-dns" Apr 06 12:22:00 crc 
kubenswrapper[4790]: I0406 12:22:00.146956 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591302-pkz6k" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.149319 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.149469 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.149608 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.153803 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptwkh\" (UniqueName: \"kubernetes.io/projected/36fcc986-c077-4dee-a97a-7c3a92bd31d5-kube-api-access-ptwkh\") pod \"auto-csr-approver-29591302-pkz6k\" (UID: \"36fcc986-c077-4dee-a97a-7c3a92bd31d5\") " pod="openshift-infra/auto-csr-approver-29591302-pkz6k" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.170270 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591302-pkz6k"] Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.241173 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5tg2" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.258745 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305f1532-8547-4e7a-a6f3-2a86f8692112-utilities\") pod \"305f1532-8547-4e7a-a6f3-2a86f8692112\" (UID: \"305f1532-8547-4e7a-a6f3-2a86f8692112\") " Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.258860 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305f1532-8547-4e7a-a6f3-2a86f8692112-catalog-content\") pod \"305f1532-8547-4e7a-a6f3-2a86f8692112\" (UID: \"305f1532-8547-4e7a-a6f3-2a86f8692112\") " Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.259003 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8lws\" (UniqueName: \"kubernetes.io/projected/305f1532-8547-4e7a-a6f3-2a86f8692112-kube-api-access-c8lws\") pod \"305f1532-8547-4e7a-a6f3-2a86f8692112\" (UID: \"305f1532-8547-4e7a-a6f3-2a86f8692112\") " Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.259556 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptwkh\" (UniqueName: \"kubernetes.io/projected/36fcc986-c077-4dee-a97a-7c3a92bd31d5-kube-api-access-ptwkh\") pod \"auto-csr-approver-29591302-pkz6k\" (UID: \"36fcc986-c077-4dee-a97a-7c3a92bd31d5\") " pod="openshift-infra/auto-csr-approver-29591302-pkz6k" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.262209 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/305f1532-8547-4e7a-a6f3-2a86f8692112-utilities" (OuterVolumeSpecName: "utilities") pod "305f1532-8547-4e7a-a6f3-2a86f8692112" (UID: "305f1532-8547-4e7a-a6f3-2a86f8692112"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.267645 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305f1532-8547-4e7a-a6f3-2a86f8692112-kube-api-access-c8lws" (OuterVolumeSpecName: "kube-api-access-c8lws") pod "305f1532-8547-4e7a-a6f3-2a86f8692112" (UID: "305f1532-8547-4e7a-a6f3-2a86f8692112"). InnerVolumeSpecName "kube-api-access-c8lws". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.274498 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptwkh\" (UniqueName: \"kubernetes.io/projected/36fcc986-c077-4dee-a97a-7c3a92bd31d5-kube-api-access-ptwkh\") pod \"auto-csr-approver-29591302-pkz6k\" (UID: \"36fcc986-c077-4dee-a97a-7c3a92bd31d5\") " pod="openshift-infra/auto-csr-approver-29591302-pkz6k" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.328929 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/305f1532-8547-4e7a-a6f3-2a86f8692112-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "305f1532-8547-4e7a-a6f3-2a86f8692112" (UID: "305f1532-8547-4e7a-a6f3-2a86f8692112"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.363046 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305f1532-8547-4e7a-a6f3-2a86f8692112-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.363082 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305f1532-8547-4e7a-a6f3-2a86f8692112-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.363097 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8lws\" (UniqueName: \"kubernetes.io/projected/305f1532-8547-4e7a-a6f3-2a86f8692112-kube-api-access-c8lws\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.551769 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591302-pkz6k" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.739040 4790 generic.go:334] "Generic (PLEG): container finished" podID="305f1532-8547-4e7a-a6f3-2a86f8692112" containerID="c0046ede47952b323e9a036805bd0f127ec134f2c825c4df5d4bfda827c5fc7f" exitCode=0 Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.739095 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5tg2" event={"ID":"305f1532-8547-4e7a-a6f3-2a86f8692112","Type":"ContainerDied","Data":"c0046ede47952b323e9a036805bd0f127ec134f2c825c4df5d4bfda827c5fc7f"} Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.739131 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5tg2" event={"ID":"305f1532-8547-4e7a-a6f3-2a86f8692112","Type":"ContainerDied","Data":"3873ad880d3f0d0db7c3610f2f29f7666702fc043c29dea4fb6b9e2f371e4dd4"} Apr 06 12:22:00 crc kubenswrapper[4790]: 
I0406 12:22:00.739305 4790 scope.go:117] "RemoveContainer" containerID="c0046ede47952b323e9a036805bd0f127ec134f2c825c4df5d4bfda827c5fc7f" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.739534 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5tg2" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.784094 4790 scope.go:117] "RemoveContainer" containerID="d33e0e734836b93b7903591715d96b80743c7848c7c6882a8e9f416386b9d332" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.795001 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5tg2"] Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.807361 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h5tg2"] Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.812962 4790 scope.go:117] "RemoveContainer" containerID="fef680459caf13803b145599e1530645c1b0694975c0a971192e7d0638361741" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.835604 4790 scope.go:117] "RemoveContainer" containerID="c0046ede47952b323e9a036805bd0f127ec134f2c825c4df5d4bfda827c5fc7f" Apr 06 12:22:00 crc kubenswrapper[4790]: E0406 12:22:00.836132 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0046ede47952b323e9a036805bd0f127ec134f2c825c4df5d4bfda827c5fc7f\": container with ID starting with c0046ede47952b323e9a036805bd0f127ec134f2c825c4df5d4bfda827c5fc7f not found: ID does not exist" containerID="c0046ede47952b323e9a036805bd0f127ec134f2c825c4df5d4bfda827c5fc7f" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.836166 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0046ede47952b323e9a036805bd0f127ec134f2c825c4df5d4bfda827c5fc7f"} err="failed to get container status 
\"c0046ede47952b323e9a036805bd0f127ec134f2c825c4df5d4bfda827c5fc7f\": rpc error: code = NotFound desc = could not find container \"c0046ede47952b323e9a036805bd0f127ec134f2c825c4df5d4bfda827c5fc7f\": container with ID starting with c0046ede47952b323e9a036805bd0f127ec134f2c825c4df5d4bfda827c5fc7f not found: ID does not exist" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.836190 4790 scope.go:117] "RemoveContainer" containerID="d33e0e734836b93b7903591715d96b80743c7848c7c6882a8e9f416386b9d332" Apr 06 12:22:00 crc kubenswrapper[4790]: E0406 12:22:00.836538 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d33e0e734836b93b7903591715d96b80743c7848c7c6882a8e9f416386b9d332\": container with ID starting with d33e0e734836b93b7903591715d96b80743c7848c7c6882a8e9f416386b9d332 not found: ID does not exist" containerID="d33e0e734836b93b7903591715d96b80743c7848c7c6882a8e9f416386b9d332" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.836580 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d33e0e734836b93b7903591715d96b80743c7848c7c6882a8e9f416386b9d332"} err="failed to get container status \"d33e0e734836b93b7903591715d96b80743c7848c7c6882a8e9f416386b9d332\": rpc error: code = NotFound desc = could not find container \"d33e0e734836b93b7903591715d96b80743c7848c7c6882a8e9f416386b9d332\": container with ID starting with d33e0e734836b93b7903591715d96b80743c7848c7c6882a8e9f416386b9d332 not found: ID does not exist" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.836603 4790 scope.go:117] "RemoveContainer" containerID="fef680459caf13803b145599e1530645c1b0694975c0a971192e7d0638361741" Apr 06 12:22:00 crc kubenswrapper[4790]: E0406 12:22:00.837039 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fef680459caf13803b145599e1530645c1b0694975c0a971192e7d0638361741\": container with ID starting with fef680459caf13803b145599e1530645c1b0694975c0a971192e7d0638361741 not found: ID does not exist" containerID="fef680459caf13803b145599e1530645c1b0694975c0a971192e7d0638361741" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.837112 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fef680459caf13803b145599e1530645c1b0694975c0a971192e7d0638361741"} err="failed to get container status \"fef680459caf13803b145599e1530645c1b0694975c0a971192e7d0638361741\": rpc error: code = NotFound desc = could not find container \"fef680459caf13803b145599e1530645c1b0694975c0a971192e7d0638361741\": container with ID starting with fef680459caf13803b145599e1530645c1b0694975c0a971192e7d0638361741 not found: ID does not exist" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.965963 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ecf07ee5-7448-4df6-9329-e2fc9b7989de" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.244:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Apr 06 12:22:00 crc kubenswrapper[4790]: I0406 12:22:00.965963 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ecf07ee5-7448-4df6-9329-e2fc9b7989de" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.244:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Apr 06 12:22:01 crc kubenswrapper[4790]: W0406 12:22:01.011846 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36fcc986_c077_4dee_a97a_7c3a92bd31d5.slice/crio-be4ac52a6bb80356cde74ae9289e699ec03da747652b8d7e0866a3f37b889fcc WatchSource:0}: Error finding container 
be4ac52a6bb80356cde74ae9289e699ec03da747652b8d7e0866a3f37b889fcc: Status 404 returned error can't find the container with id be4ac52a6bb80356cde74ae9289e699ec03da747652b8d7e0866a3f37b889fcc Apr 06 12:22:01 crc kubenswrapper[4790]: I0406 12:22:01.012367 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591302-pkz6k"] Apr 06 12:22:01 crc kubenswrapper[4790]: I0406 12:22:01.694249 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="305f1532-8547-4e7a-a6f3-2a86f8692112" path="/var/lib/kubelet/pods/305f1532-8547-4e7a-a6f3-2a86f8692112/volumes" Apr 06 12:22:01 crc kubenswrapper[4790]: I0406 12:22:01.752069 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591302-pkz6k" event={"ID":"36fcc986-c077-4dee-a97a-7c3a92bd31d5","Type":"ContainerStarted","Data":"be4ac52a6bb80356cde74ae9289e699ec03da747652b8d7e0866a3f37b889fcc"} Apr 06 12:22:02 crc kubenswrapper[4790]: I0406 12:22:02.763145 4790 generic.go:334] "Generic (PLEG): container finished" podID="f08c5050-9091-47f9-8135-daee3777de99" containerID="8bb6d004bb3e2a1d714b20f740824914099e2ff479a2ae3efb76a449c125e02a" exitCode=0 Apr 06 12:22:02 crc kubenswrapper[4790]: I0406 12:22:02.763346 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lsjv7" event={"ID":"f08c5050-9091-47f9-8135-daee3777de99","Type":"ContainerDied","Data":"8bb6d004bb3e2a1d714b20f740824914099e2ff479a2ae3efb76a449c125e02a"} Apr 06 12:22:02 crc kubenswrapper[4790]: I0406 12:22:02.766479 4790 generic.go:334] "Generic (PLEG): container finished" podID="36fcc986-c077-4dee-a97a-7c3a92bd31d5" containerID="22553c5d47b4b299883bcd9557efa9e0b15016e7dcac44635dc86036de097a54" exitCode=0 Apr 06 12:22:02 crc kubenswrapper[4790]: I0406 12:22:02.766608 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591302-pkz6k" 
event={"ID":"36fcc986-c077-4dee-a97a-7c3a92bd31d5","Type":"ContainerDied","Data":"22553c5d47b4b299883bcd9557efa9e0b15016e7dcac44635dc86036de097a54"} Apr 06 12:22:02 crc kubenswrapper[4790]: I0406 12:22:02.841073 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7rgpj" Apr 06 12:22:02 crc kubenswrapper[4790]: I0406 12:22:02.841136 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7rgpj" Apr 06 12:22:02 crc kubenswrapper[4790]: I0406 12:22:02.895795 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7rgpj" Apr 06 12:22:03 crc kubenswrapper[4790]: I0406 12:22:03.858090 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7rgpj" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.253447 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591302-pkz6k" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.259292 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lsjv7" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.298714 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7rgpj"] Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.364695 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-scripts\") pod \"f08c5050-9091-47f9-8135-daee3777de99\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.364899 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptwkh\" (UniqueName: \"kubernetes.io/projected/36fcc986-c077-4dee-a97a-7c3a92bd31d5-kube-api-access-ptwkh\") pod \"36fcc986-c077-4dee-a97a-7c3a92bd31d5\" (UID: \"36fcc986-c077-4dee-a97a-7c3a92bd31d5\") " Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.364948 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-config-data\") pod \"f08c5050-9091-47f9-8135-daee3777de99\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.365023 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-combined-ca-bundle\") pod \"f08c5050-9091-47f9-8135-daee3777de99\" (UID: \"f08c5050-9091-47f9-8135-daee3777de99\") " Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.365077 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwk97\" (UniqueName: \"kubernetes.io/projected/f08c5050-9091-47f9-8135-daee3777de99-kube-api-access-pwk97\") pod \"f08c5050-9091-47f9-8135-daee3777de99\" (UID: 
\"f08c5050-9091-47f9-8135-daee3777de99\") " Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.370753 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-scripts" (OuterVolumeSpecName: "scripts") pod "f08c5050-9091-47f9-8135-daee3777de99" (UID: "f08c5050-9091-47f9-8135-daee3777de99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.371190 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36fcc986-c077-4dee-a97a-7c3a92bd31d5-kube-api-access-ptwkh" (OuterVolumeSpecName: "kube-api-access-ptwkh") pod "36fcc986-c077-4dee-a97a-7c3a92bd31d5" (UID: "36fcc986-c077-4dee-a97a-7c3a92bd31d5"). InnerVolumeSpecName "kube-api-access-ptwkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.371225 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f08c5050-9091-47f9-8135-daee3777de99-kube-api-access-pwk97" (OuterVolumeSpecName: "kube-api-access-pwk97") pod "f08c5050-9091-47f9-8135-daee3777de99" (UID: "f08c5050-9091-47f9-8135-daee3777de99"). InnerVolumeSpecName "kube-api-access-pwk97". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.403673 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-config-data" (OuterVolumeSpecName: "config-data") pod "f08c5050-9091-47f9-8135-daee3777de99" (UID: "f08c5050-9091-47f9-8135-daee3777de99"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.404727 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f08c5050-9091-47f9-8135-daee3777de99" (UID: "f08c5050-9091-47f9-8135-daee3777de99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.467956 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwk97\" (UniqueName: \"kubernetes.io/projected/f08c5050-9091-47f9-8135-daee3777de99-kube-api-access-pwk97\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.467989 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-scripts\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.468000 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptwkh\" (UniqueName: \"kubernetes.io/projected/36fcc986-c077-4dee-a97a-7c3a92bd31d5-kube-api-access-ptwkh\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.468009 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.468018 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c5050-9091-47f9-8135-daee3777de99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.786363 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29591302-pkz6k" event={"ID":"36fcc986-c077-4dee-a97a-7c3a92bd31d5","Type":"ContainerDied","Data":"be4ac52a6bb80356cde74ae9289e699ec03da747652b8d7e0866a3f37b889fcc"} Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.786430 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be4ac52a6bb80356cde74ae9289e699ec03da747652b8d7e0866a3f37b889fcc" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.786379 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591302-pkz6k" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.788436 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lsjv7" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.788426 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lsjv7" event={"ID":"f08c5050-9091-47f9-8135-daee3777de99","Type":"ContainerDied","Data":"3e8efb1962ac5b50dc4beaf880f90d11aa9db4e85ced0aea04024e03b1bf28eb"} Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.788506 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e8efb1962ac5b50dc4beaf880f90d11aa9db4e85ced0aea04024e03b1bf28eb" Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.989868 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.990185 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ecf07ee5-7448-4df6-9329-e2fc9b7989de" containerName="nova-api-log" containerID="cri-o://bccc11ce4afe9e5f9f616b9ff217db5fd729e56431c992e7f68dfbeb14995cf1" gracePeriod=30 Apr 06 12:22:04 crc kubenswrapper[4790]: I0406 12:22:04.990224 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="ecf07ee5-7448-4df6-9329-e2fc9b7989de" containerName="nova-api-api" containerID="cri-o://02977d49416571762ad7a9f2c23b339a9e12ea2900ba25fc6a7336fc6f26058c" gracePeriod=30 Apr 06 12:22:05 crc kubenswrapper[4790]: I0406 12:22:05.008100 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Apr 06 12:22:05 crc kubenswrapper[4790]: I0406 12:22:05.008497 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c8fc7d56-62a8-407c-b535-d60c6e81feec" containerName="nova-scheduler-scheduler" containerID="cri-o://e26c6b50662b762b9ed32c0d48679d0213aa17324b3ff42f9151e93533835cd6" gracePeriod=30 Apr 06 12:22:05 crc kubenswrapper[4790]: I0406 12:22:05.025197 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:22:05 crc kubenswrapper[4790]: I0406 12:22:05.025484 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="92d8e1ab-19ef-4049-aaae-4b2f692730ca" containerName="nova-metadata-log" containerID="cri-o://c5ce5510995f435a20f584f1d557f5ebc3cafcc7b662868d8ea1659a33e28e6c" gracePeriod=30 Apr 06 12:22:05 crc kubenswrapper[4790]: I0406 12:22:05.026348 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="92d8e1ab-19ef-4049-aaae-4b2f692730ca" containerName="nova-metadata-metadata" containerID="cri-o://67c7c7d336f65aa772faf066480da3739165e50b37d9cf98edd7609d5510eeda" gracePeriod=30 Apr 06 12:22:05 crc kubenswrapper[4790]: I0406 12:22:05.327256 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591296-klcbk"] Apr 06 12:22:05 crc kubenswrapper[4790]: I0406 12:22:05.336309 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591296-klcbk"] Apr 06 12:22:05 crc kubenswrapper[4790]: E0406 12:22:05.449401 4790 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e26c6b50662b762b9ed32c0d48679d0213aa17324b3ff42f9151e93533835cd6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Apr 06 12:22:05 crc kubenswrapper[4790]: E0406 12:22:05.451433 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e26c6b50662b762b9ed32c0d48679d0213aa17324b3ff42f9151e93533835cd6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Apr 06 12:22:05 crc kubenswrapper[4790]: E0406 12:22:05.453009 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e26c6b50662b762b9ed32c0d48679d0213aa17324b3ff42f9151e93533835cd6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Apr 06 12:22:05 crc kubenswrapper[4790]: E0406 12:22:05.453051 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c8fc7d56-62a8-407c-b535-d60c6e81feec" containerName="nova-scheduler-scheduler" Apr 06 12:22:05 crc kubenswrapper[4790]: I0406 12:22:05.685880 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7e2691a-d43d-4851-89f9-8d4aefbaa5cf" path="/var/lib/kubelet/pods/c7e2691a-d43d-4851-89f9-8d4aefbaa5cf/volumes" Apr 06 12:22:05 crc kubenswrapper[4790]: I0406 12:22:05.802665 4790 generic.go:334] "Generic (PLEG): container finished" podID="92d8e1ab-19ef-4049-aaae-4b2f692730ca" containerID="c5ce5510995f435a20f584f1d557f5ebc3cafcc7b662868d8ea1659a33e28e6c" exitCode=143 Apr 06 12:22:05 crc 
kubenswrapper[4790]: I0406 12:22:05.802747 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"92d8e1ab-19ef-4049-aaae-4b2f692730ca","Type":"ContainerDied","Data":"c5ce5510995f435a20f584f1d557f5ebc3cafcc7b662868d8ea1659a33e28e6c"} Apr 06 12:22:05 crc kubenswrapper[4790]: I0406 12:22:05.804453 4790 generic.go:334] "Generic (PLEG): container finished" podID="ecf07ee5-7448-4df6-9329-e2fc9b7989de" containerID="bccc11ce4afe9e5f9f616b9ff217db5fd729e56431c992e7f68dfbeb14995cf1" exitCode=143 Apr 06 12:22:05 crc kubenswrapper[4790]: I0406 12:22:05.804504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecf07ee5-7448-4df6-9329-e2fc9b7989de","Type":"ContainerDied","Data":"bccc11ce4afe9e5f9f616b9ff217db5fd729e56431c992e7f68dfbeb14995cf1"} Apr 06 12:22:05 crc kubenswrapper[4790]: I0406 12:22:05.804813 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7rgpj" podUID="32885ce5-2f9b-4e9b-9850-8a03bfc26c41" containerName="registry-server" containerID="cri-o://be8234ad036c125ec2bd3658dc0633002a9ea9c74bde639903dfc7437102c0c9" gracePeriod=2 Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.447656 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.562101 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.569546 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7rgpj" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.624622 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr5sf\" (UniqueName: \"kubernetes.io/projected/ecf07ee5-7448-4df6-9329-e2fc9b7989de-kube-api-access-sr5sf\") pod \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.624765 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-public-tls-certs\") pod \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.624874 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-internal-tls-certs\") pod \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.624913 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf07ee5-7448-4df6-9329-e2fc9b7989de-logs\") pod \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.624942 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-config-data\") pod \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.625016 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-combined-ca-bundle\") pod \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\" (UID: \"ecf07ee5-7448-4df6-9329-e2fc9b7989de\") " Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.625706 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecf07ee5-7448-4df6-9329-e2fc9b7989de-logs" (OuterVolumeSpecName: "logs") pod "ecf07ee5-7448-4df6-9329-e2fc9b7989de" (UID: "ecf07ee5-7448-4df6-9329-e2fc9b7989de"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.646033 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecf07ee5-7448-4df6-9329-e2fc9b7989de-kube-api-access-sr5sf" (OuterVolumeSpecName: "kube-api-access-sr5sf") pod "ecf07ee5-7448-4df6-9329-e2fc9b7989de" (UID: "ecf07ee5-7448-4df6-9329-e2fc9b7989de"). InnerVolumeSpecName "kube-api-access-sr5sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.656499 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecf07ee5-7448-4df6-9329-e2fc9b7989de" (UID: "ecf07ee5-7448-4df6-9329-e2fc9b7989de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.662437 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-config-data" (OuterVolumeSpecName: "config-data") pod "ecf07ee5-7448-4df6-9329-e2fc9b7989de" (UID: "ecf07ee5-7448-4df6-9329-e2fc9b7989de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.681530 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ecf07ee5-7448-4df6-9329-e2fc9b7989de" (UID: "ecf07ee5-7448-4df6-9329-e2fc9b7989de"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.692721 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ecf07ee5-7448-4df6-9329-e2fc9b7989de" (UID: "ecf07ee5-7448-4df6-9329-e2fc9b7989de"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.726259 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-combined-ca-bundle\") pod \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.726312 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp865\" (UniqueName: \"kubernetes.io/projected/92d8e1ab-19ef-4049-aaae-4b2f692730ca-kube-api-access-qp865\") pod \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.726337 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-nova-metadata-tls-certs\") pod \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\" (UID: 
\"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.726402 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-utilities\") pod \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\" (UID: \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\") " Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.726428 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92d8e1ab-19ef-4049-aaae-4b2f692730ca-logs\") pod \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.726544 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rbdg\" (UniqueName: \"kubernetes.io/projected/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-kube-api-access-2rbdg\") pod \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\" (UID: \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\") " Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.726595 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-catalog-content\") pod \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\" (UID: \"32885ce5-2f9b-4e9b-9850-8a03bfc26c41\") " Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.726647 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-config-data\") pod \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\" (UID: \"92d8e1ab-19ef-4049-aaae-4b2f692730ca\") " Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.727135 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-public-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.727162 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.727177 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf07ee5-7448-4df6-9329-e2fc9b7989de-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.727190 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.727201 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf07ee5-7448-4df6-9329-e2fc9b7989de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.727213 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr5sf\" (UniqueName: \"kubernetes.io/projected/ecf07ee5-7448-4df6-9329-e2fc9b7989de-kube-api-access-sr5sf\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.727886 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92d8e1ab-19ef-4049-aaae-4b2f692730ca-logs" (OuterVolumeSpecName: "logs") pod "92d8e1ab-19ef-4049-aaae-4b2f692730ca" (UID: "92d8e1ab-19ef-4049-aaae-4b2f692730ca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.728428 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-utilities" (OuterVolumeSpecName: "utilities") pod "32885ce5-2f9b-4e9b-9850-8a03bfc26c41" (UID: "32885ce5-2f9b-4e9b-9850-8a03bfc26c41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.729845 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d8e1ab-19ef-4049-aaae-4b2f692730ca-kube-api-access-qp865" (OuterVolumeSpecName: "kube-api-access-qp865") pod "92d8e1ab-19ef-4049-aaae-4b2f692730ca" (UID: "92d8e1ab-19ef-4049-aaae-4b2f692730ca"). InnerVolumeSpecName "kube-api-access-qp865". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.732160 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-kube-api-access-2rbdg" (OuterVolumeSpecName: "kube-api-access-2rbdg") pod "32885ce5-2f9b-4e9b-9850-8a03bfc26c41" (UID: "32885ce5-2f9b-4e9b-9850-8a03bfc26c41"). InnerVolumeSpecName "kube-api-access-2rbdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.763028 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-config-data" (OuterVolumeSpecName: "config-data") pod "92d8e1ab-19ef-4049-aaae-4b2f692730ca" (UID: "92d8e1ab-19ef-4049-aaae-4b2f692730ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.765412 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92d8e1ab-19ef-4049-aaae-4b2f692730ca" (UID: "92d8e1ab-19ef-4049-aaae-4b2f692730ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.774725 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "92d8e1ab-19ef-4049-aaae-4b2f692730ca" (UID: "92d8e1ab-19ef-4049-aaae-4b2f692730ca"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.789212 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32885ce5-2f9b-4e9b-9850-8a03bfc26c41" (UID: "32885ce5-2f9b-4e9b-9850-8a03bfc26c41"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.816847 4790 generic.go:334] "Generic (PLEG): container finished" podID="32885ce5-2f9b-4e9b-9850-8a03bfc26c41" containerID="be8234ad036c125ec2bd3658dc0633002a9ea9c74bde639903dfc7437102c0c9" exitCode=0 Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.816899 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rgpj" event={"ID":"32885ce5-2f9b-4e9b-9850-8a03bfc26c41","Type":"ContainerDied","Data":"be8234ad036c125ec2bd3658dc0633002a9ea9c74bde639903dfc7437102c0c9"} Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.816923 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rgpj" event={"ID":"32885ce5-2f9b-4e9b-9850-8a03bfc26c41","Type":"ContainerDied","Data":"1f4c6e5c5bfc3be289aaa5b2489131e804cfa8a8f260acb0b9742118b486d6e7"} Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.816939 4790 scope.go:117] "RemoveContainer" containerID="be8234ad036c125ec2bd3658dc0633002a9ea9c74bde639903dfc7437102c0c9" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.817060 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7rgpj" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.822319 4790 generic.go:334] "Generic (PLEG): container finished" podID="ecf07ee5-7448-4df6-9329-e2fc9b7989de" containerID="02977d49416571762ad7a9f2c23b339a9e12ea2900ba25fc6a7336fc6f26058c" exitCode=0 Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.822369 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecf07ee5-7448-4df6-9329-e2fc9b7989de","Type":"ContainerDied","Data":"02977d49416571762ad7a9f2c23b339a9e12ea2900ba25fc6a7336fc6f26058c"} Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.822393 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecf07ee5-7448-4df6-9329-e2fc9b7989de","Type":"ContainerDied","Data":"bdbed44fa21784456c1ba6d19d07f58421f7c6ee9fa5f6f24f3cd65edadd201d"} Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.822446 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.826479 4790 generic.go:334] "Generic (PLEG): container finished" podID="92d8e1ab-19ef-4049-aaae-4b2f692730ca" containerID="67c7c7d336f65aa772faf066480da3739165e50b37d9cf98edd7609d5510eeda" exitCode=0 Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.826531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"92d8e1ab-19ef-4049-aaae-4b2f692730ca","Type":"ContainerDied","Data":"67c7c7d336f65aa772faf066480da3739165e50b37d9cf98edd7609d5510eeda"} Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.826553 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"92d8e1ab-19ef-4049-aaae-4b2f692730ca","Type":"ContainerDied","Data":"39d7b605634002dd238afa2ae5bb11262261c71cf47d1276d76da6cde6c91f22"} Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.826619 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.828980 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.829004 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.829020 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.829032 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp865\" (UniqueName: \"kubernetes.io/projected/92d8e1ab-19ef-4049-aaae-4b2f692730ca-kube-api-access-qp865\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.829042 4790 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/92d8e1ab-19ef-4049-aaae-4b2f692730ca-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.829051 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.829058 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92d8e1ab-19ef-4049-aaae-4b2f692730ca-logs\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.829067 4790 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rbdg\" (UniqueName: \"kubernetes.io/projected/32885ce5-2f9b-4e9b-9850-8a03bfc26c41-kube-api-access-2rbdg\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.859959 4790 scope.go:117] "RemoveContainer" containerID="906ca36f527475384d0309b52ae3c99e41c7d0daa4c698bc74453c48eee4ecb9" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.866111 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7rgpj"] Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.878435 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7rgpj"] Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.889437 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.902066 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.922476 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.949155 4790 scope.go:117] "RemoveContainer" containerID="26d2543f1dfb752908476bda3ecf737d802f0b222e7bd6e8297a0c7ac76dc724" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.961094 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.973283 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.973748 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305f1532-8547-4e7a-a6f3-2a86f8692112" containerName="extract-utilities" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.973769 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="305f1532-8547-4e7a-a6f3-2a86f8692112" containerName="extract-utilities" Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.973780 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08c5050-9091-47f9-8135-daee3777de99" containerName="nova-manage" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.973822 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08c5050-9091-47f9-8135-daee3777de99" containerName="nova-manage" Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.973861 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d8e1ab-19ef-4049-aaae-4b2f692730ca" containerName="nova-metadata-log" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.973867 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d8e1ab-19ef-4049-aaae-4b2f692730ca" containerName="nova-metadata-log" Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.973902 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecf07ee5-7448-4df6-9329-e2fc9b7989de" containerName="nova-api-api" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.973908 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecf07ee5-7448-4df6-9329-e2fc9b7989de" containerName="nova-api-api" Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.973920 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305f1532-8547-4e7a-a6f3-2a86f8692112" containerName="extract-content" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.973926 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="305f1532-8547-4e7a-a6f3-2a86f8692112" containerName="extract-content" Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.973936 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d8e1ab-19ef-4049-aaae-4b2f692730ca" containerName="nova-metadata-metadata" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.973942 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="92d8e1ab-19ef-4049-aaae-4b2f692730ca" containerName="nova-metadata-metadata" Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.973952 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32885ce5-2f9b-4e9b-9850-8a03bfc26c41" containerName="registry-server" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.973977 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="32885ce5-2f9b-4e9b-9850-8a03bfc26c41" containerName="registry-server" Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.973991 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecf07ee5-7448-4df6-9329-e2fc9b7989de" containerName="nova-api-log" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.973999 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecf07ee5-7448-4df6-9329-e2fc9b7989de" containerName="nova-api-log" Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.974009 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32885ce5-2f9b-4e9b-9850-8a03bfc26c41" containerName="extract-utilities" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.974016 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="32885ce5-2f9b-4e9b-9850-8a03bfc26c41" containerName="extract-utilities" Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.974031 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32885ce5-2f9b-4e9b-9850-8a03bfc26c41" containerName="extract-content" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.974056 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="32885ce5-2f9b-4e9b-9850-8a03bfc26c41" containerName="extract-content" Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.974069 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305f1532-8547-4e7a-a6f3-2a86f8692112" containerName="registry-server" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.974075 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="305f1532-8547-4e7a-a6f3-2a86f8692112" containerName="registry-server" Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.974084 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fcc986-c077-4dee-a97a-7c3a92bd31d5" containerName="oc" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.974089 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fcc986-c077-4dee-a97a-7c3a92bd31d5" containerName="oc" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.974337 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d8e1ab-19ef-4049-aaae-4b2f692730ca" containerName="nova-metadata-log" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.974376 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="32885ce5-2f9b-4e9b-9850-8a03bfc26c41" containerName="registry-server" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.974388 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecf07ee5-7448-4df6-9329-e2fc9b7989de" containerName="nova-api-api" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.974403 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="305f1532-8547-4e7a-a6f3-2a86f8692112" containerName="registry-server" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.974411 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08c5050-9091-47f9-8135-daee3777de99" containerName="nova-manage" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.974417 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d8e1ab-19ef-4049-aaae-4b2f692730ca" containerName="nova-metadata-metadata" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.974424 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="36fcc986-c077-4dee-a97a-7c3a92bd31d5" containerName="oc" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.974432 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ecf07ee5-7448-4df6-9329-e2fc9b7989de" containerName="nova-api-log" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.975941 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.978130 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.978362 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.978591 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.979550 4790 scope.go:117] "RemoveContainer" containerID="be8234ad036c125ec2bd3658dc0633002a9ea9c74bde639903dfc7437102c0c9" Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.980073 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be8234ad036c125ec2bd3658dc0633002a9ea9c74bde639903dfc7437102c0c9\": container with ID starting with be8234ad036c125ec2bd3658dc0633002a9ea9c74bde639903dfc7437102c0c9 not found: ID does not exist" containerID="be8234ad036c125ec2bd3658dc0633002a9ea9c74bde639903dfc7437102c0c9" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.980101 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be8234ad036c125ec2bd3658dc0633002a9ea9c74bde639903dfc7437102c0c9"} err="failed to get container status \"be8234ad036c125ec2bd3658dc0633002a9ea9c74bde639903dfc7437102c0c9\": rpc error: code = NotFound desc = could not find container \"be8234ad036c125ec2bd3658dc0633002a9ea9c74bde639903dfc7437102c0c9\": container with ID starting with be8234ad036c125ec2bd3658dc0633002a9ea9c74bde639903dfc7437102c0c9 not found: ID does not exist" Apr 06 12:22:06 crc 
kubenswrapper[4790]: I0406 12:22:06.980124 4790 scope.go:117] "RemoveContainer" containerID="906ca36f527475384d0309b52ae3c99e41c7d0daa4c698bc74453c48eee4ecb9" Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.980489 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906ca36f527475384d0309b52ae3c99e41c7d0daa4c698bc74453c48eee4ecb9\": container with ID starting with 906ca36f527475384d0309b52ae3c99e41c7d0daa4c698bc74453c48eee4ecb9 not found: ID does not exist" containerID="906ca36f527475384d0309b52ae3c99e41c7d0daa4c698bc74453c48eee4ecb9" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.980532 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906ca36f527475384d0309b52ae3c99e41c7d0daa4c698bc74453c48eee4ecb9"} err="failed to get container status \"906ca36f527475384d0309b52ae3c99e41c7d0daa4c698bc74453c48eee4ecb9\": rpc error: code = NotFound desc = could not find container \"906ca36f527475384d0309b52ae3c99e41c7d0daa4c698bc74453c48eee4ecb9\": container with ID starting with 906ca36f527475384d0309b52ae3c99e41c7d0daa4c698bc74453c48eee4ecb9 not found: ID does not exist" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.980561 4790 scope.go:117] "RemoveContainer" containerID="26d2543f1dfb752908476bda3ecf737d802f0b222e7bd6e8297a0c7ac76dc724" Apr 06 12:22:06 crc kubenswrapper[4790]: E0406 12:22:06.980954 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d2543f1dfb752908476bda3ecf737d802f0b222e7bd6e8297a0c7ac76dc724\": container with ID starting with 26d2543f1dfb752908476bda3ecf737d802f0b222e7bd6e8297a0c7ac76dc724 not found: ID does not exist" containerID="26d2543f1dfb752908476bda3ecf737d802f0b222e7bd6e8297a0c7ac76dc724" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.980982 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"26d2543f1dfb752908476bda3ecf737d802f0b222e7bd6e8297a0c7ac76dc724"} err="failed to get container status \"26d2543f1dfb752908476bda3ecf737d802f0b222e7bd6e8297a0c7ac76dc724\": rpc error: code = NotFound desc = could not find container \"26d2543f1dfb752908476bda3ecf737d802f0b222e7bd6e8297a0c7ac76dc724\": container with ID starting with 26d2543f1dfb752908476bda3ecf737d802f0b222e7bd6e8297a0c7ac76dc724 not found: ID does not exist" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.980997 4790 scope.go:117] "RemoveContainer" containerID="02977d49416571762ad7a9f2c23b339a9e12ea2900ba25fc6a7336fc6f26058c" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.985489 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.988028 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.989326 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Apr 06 12:22:06 crc kubenswrapper[4790]: I0406 12:22:06.990316 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.005527 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.008190 4790 scope.go:117] "RemoveContainer" containerID="bccc11ce4afe9e5f9f616b9ff217db5fd729e56431c992e7f68dfbeb14995cf1" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.019343 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.026983 4790 scope.go:117] "RemoveContainer" containerID="02977d49416571762ad7a9f2c23b339a9e12ea2900ba25fc6a7336fc6f26058c" Apr 06 12:22:07 crc 
kubenswrapper[4790]: E0406 12:22:07.027450 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02977d49416571762ad7a9f2c23b339a9e12ea2900ba25fc6a7336fc6f26058c\": container with ID starting with 02977d49416571762ad7a9f2c23b339a9e12ea2900ba25fc6a7336fc6f26058c not found: ID does not exist" containerID="02977d49416571762ad7a9f2c23b339a9e12ea2900ba25fc6a7336fc6f26058c" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.027569 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02977d49416571762ad7a9f2c23b339a9e12ea2900ba25fc6a7336fc6f26058c"} err="failed to get container status \"02977d49416571762ad7a9f2c23b339a9e12ea2900ba25fc6a7336fc6f26058c\": rpc error: code = NotFound desc = could not find container \"02977d49416571762ad7a9f2c23b339a9e12ea2900ba25fc6a7336fc6f26058c\": container with ID starting with 02977d49416571762ad7a9f2c23b339a9e12ea2900ba25fc6a7336fc6f26058c not found: ID does not exist" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.027594 4790 scope.go:117] "RemoveContainer" containerID="bccc11ce4afe9e5f9f616b9ff217db5fd729e56431c992e7f68dfbeb14995cf1" Apr 06 12:22:07 crc kubenswrapper[4790]: E0406 12:22:07.028020 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bccc11ce4afe9e5f9f616b9ff217db5fd729e56431c992e7f68dfbeb14995cf1\": container with ID starting with bccc11ce4afe9e5f9f616b9ff217db5fd729e56431c992e7f68dfbeb14995cf1 not found: ID does not exist" containerID="bccc11ce4afe9e5f9f616b9ff217db5fd729e56431c992e7f68dfbeb14995cf1" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.028047 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bccc11ce4afe9e5f9f616b9ff217db5fd729e56431c992e7f68dfbeb14995cf1"} err="failed to get container status 
\"bccc11ce4afe9e5f9f616b9ff217db5fd729e56431c992e7f68dfbeb14995cf1\": rpc error: code = NotFound desc = could not find container \"bccc11ce4afe9e5f9f616b9ff217db5fd729e56431c992e7f68dfbeb14995cf1\": container with ID starting with bccc11ce4afe9e5f9f616b9ff217db5fd729e56431c992e7f68dfbeb14995cf1 not found: ID does not exist" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.028065 4790 scope.go:117] "RemoveContainer" containerID="67c7c7d336f65aa772faf066480da3739165e50b37d9cf98edd7609d5510eeda" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.052539 4790 scope.go:117] "RemoveContainer" containerID="c5ce5510995f435a20f584f1d557f5ebc3cafcc7b662868d8ea1659a33e28e6c" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.074810 4790 scope.go:117] "RemoveContainer" containerID="67c7c7d336f65aa772faf066480da3739165e50b37d9cf98edd7609d5510eeda" Apr 06 12:22:07 crc kubenswrapper[4790]: E0406 12:22:07.075718 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c7c7d336f65aa772faf066480da3739165e50b37d9cf98edd7609d5510eeda\": container with ID starting with 67c7c7d336f65aa772faf066480da3739165e50b37d9cf98edd7609d5510eeda not found: ID does not exist" containerID="67c7c7d336f65aa772faf066480da3739165e50b37d9cf98edd7609d5510eeda" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.075763 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c7c7d336f65aa772faf066480da3739165e50b37d9cf98edd7609d5510eeda"} err="failed to get container status \"67c7c7d336f65aa772faf066480da3739165e50b37d9cf98edd7609d5510eeda\": rpc error: code = NotFound desc = could not find container \"67c7c7d336f65aa772faf066480da3739165e50b37d9cf98edd7609d5510eeda\": container with ID starting with 67c7c7d336f65aa772faf066480da3739165e50b37d9cf98edd7609d5510eeda not found: ID does not exist" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.075793 4790 
scope.go:117] "RemoveContainer" containerID="c5ce5510995f435a20f584f1d557f5ebc3cafcc7b662868d8ea1659a33e28e6c" Apr 06 12:22:07 crc kubenswrapper[4790]: E0406 12:22:07.076205 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5ce5510995f435a20f584f1d557f5ebc3cafcc7b662868d8ea1659a33e28e6c\": container with ID starting with c5ce5510995f435a20f584f1d557f5ebc3cafcc7b662868d8ea1659a33e28e6c not found: ID does not exist" containerID="c5ce5510995f435a20f584f1d557f5ebc3cafcc7b662868d8ea1659a33e28e6c" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.076311 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5ce5510995f435a20f584f1d557f5ebc3cafcc7b662868d8ea1659a33e28e6c"} err="failed to get container status \"c5ce5510995f435a20f584f1d557f5ebc3cafcc7b662868d8ea1659a33e28e6c\": rpc error: code = NotFound desc = could not find container \"c5ce5510995f435a20f584f1d557f5ebc3cafcc7b662868d8ea1659a33e28e6c\": container with ID starting with c5ce5510995f435a20f584f1d557f5ebc3cafcc7b662868d8ea1659a33e28e6c not found: ID does not exist" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.136579 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd7a2b3-4c64-4d09-9865-cd55277fd369-config-data\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.136847 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd7a2b3-4c64-4d09-9865-cd55277fd369-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.137047 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa33d66-ad99-4650-bc60-a97e16cbd064-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.137131 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t5ts\" (UniqueName: \"kubernetes.io/projected/8fa33d66-ad99-4650-bc60-a97e16cbd064-kube-api-access-4t5ts\") pod \"nova-metadata-0\" (UID: \"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.137158 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fa33d66-ad99-4650-bc60-a97e16cbd064-logs\") pod \"nova-metadata-0\" (UID: \"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.137198 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd7a2b3-4c64-4d09-9865-cd55277fd369-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.137233 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5q7m\" (UniqueName: \"kubernetes.io/projected/2cd7a2b3-4c64-4d09-9865-cd55277fd369-kube-api-access-m5q7m\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.137307 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa33d66-ad99-4650-bc60-a97e16cbd064-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.137704 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd7a2b3-4c64-4d09-9865-cd55277fd369-logs\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.137878 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd7a2b3-4c64-4d09-9865-cd55277fd369-public-tls-certs\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.137958 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fa33d66-ad99-4650-bc60-a97e16cbd064-config-data\") pod \"nova-metadata-0\" (UID: \"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.239432 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa33d66-ad99-4650-bc60-a97e16cbd064-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.239569 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd7a2b3-4c64-4d09-9865-cd55277fd369-logs\") pod \"nova-api-0\" (UID: 
\"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.239640 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd7a2b3-4c64-4d09-9865-cd55277fd369-public-tls-certs\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.239668 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fa33d66-ad99-4650-bc60-a97e16cbd064-config-data\") pod \"nova-metadata-0\" (UID: \"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.239722 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd7a2b3-4c64-4d09-9865-cd55277fd369-config-data\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.239749 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd7a2b3-4c64-4d09-9865-cd55277fd369-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.239777 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa33d66-ad99-4650-bc60-a97e16cbd064-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.239815 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-4t5ts\" (UniqueName: \"kubernetes.io/projected/8fa33d66-ad99-4650-bc60-a97e16cbd064-kube-api-access-4t5ts\") pod \"nova-metadata-0\" (UID: \"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.239859 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fa33d66-ad99-4650-bc60-a97e16cbd064-logs\") pod \"nova-metadata-0\" (UID: \"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.239889 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd7a2b3-4c64-4d09-9865-cd55277fd369-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.239916 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5q7m\" (UniqueName: \"kubernetes.io/projected/2cd7a2b3-4c64-4d09-9865-cd55277fd369-kube-api-access-m5q7m\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.241137 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fa33d66-ad99-4650-bc60-a97e16cbd064-logs\") pod \"nova-metadata-0\" (UID: \"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.241970 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd7a2b3-4c64-4d09-9865-cd55277fd369-logs\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc 
kubenswrapper[4790]: I0406 12:22:07.243781 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd7a2b3-4c64-4d09-9865-cd55277fd369-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.244431 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd7a2b3-4c64-4d09-9865-cd55277fd369-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.244649 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd7a2b3-4c64-4d09-9865-cd55277fd369-public-tls-certs\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.245535 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fa33d66-ad99-4650-bc60-a97e16cbd064-config-data\") pod \"nova-metadata-0\" (UID: \"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.246061 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd7a2b3-4c64-4d09-9865-cd55277fd369-config-data\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.247977 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa33d66-ad99-4650-bc60-a97e16cbd064-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.251491 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa33d66-ad99-4650-bc60-a97e16cbd064-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.256399 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t5ts\" (UniqueName: \"kubernetes.io/projected/8fa33d66-ad99-4650-bc60-a97e16cbd064-kube-api-access-4t5ts\") pod \"nova-metadata-0\" (UID: \"8fa33d66-ad99-4650-bc60-a97e16cbd064\") " pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.260266 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5q7m\" (UniqueName: \"kubernetes.io/projected/2cd7a2b3-4c64-4d09-9865-cd55277fd369-kube-api-access-m5q7m\") pod \"nova-api-0\" (UID: \"2cd7a2b3-4c64-4d09-9865-cd55277fd369\") " pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.300070 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.308542 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.687517 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32885ce5-2f9b-4e9b-9850-8a03bfc26c41" path="/var/lib/kubelet/pods/32885ce5-2f9b-4e9b-9850-8a03bfc26c41/volumes" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.688624 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d8e1ab-19ef-4049-aaae-4b2f692730ca" path="/var/lib/kubelet/pods/92d8e1ab-19ef-4049-aaae-4b2f692730ca/volumes" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.689459 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecf07ee5-7448-4df6-9329-e2fc9b7989de" path="/var/lib/kubelet/pods/ecf07ee5-7448-4df6-9329-e2fc9b7989de/volumes" Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.773605 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Apr 06 12:22:07 crc kubenswrapper[4790]: W0406 12:22:07.788160 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fa33d66_ad99_4650_bc60_a97e16cbd064.slice/crio-65e6517a52363a495a4df6cbaa48b1d6d4f3ea38dc7b22271133e1b2e735c1da WatchSource:0}: Error finding container 65e6517a52363a495a4df6cbaa48b1d6d4f3ea38dc7b22271133e1b2e735c1da: Status 404 returned error can't find the container with id 65e6517a52363a495a4df6cbaa48b1d6d4f3ea38dc7b22271133e1b2e735c1da Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.794067 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 12:22:07.840125 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8fa33d66-ad99-4650-bc60-a97e16cbd064","Type":"ContainerStarted","Data":"65e6517a52363a495a4df6cbaa48b1d6d4f3ea38dc7b22271133e1b2e735c1da"} Apr 06 12:22:07 crc kubenswrapper[4790]: I0406 
12:22:07.842331 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cd7a2b3-4c64-4d09-9865-cd55277fd369","Type":"ContainerStarted","Data":"4a4d19cb8ec9d29def82330dac1cd811b7a61ca308203e64aca110b51b411f07"}
Apr 06 12:22:08 crc kubenswrapper[4790]: I0406 12:22:08.860208 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cd7a2b3-4c64-4d09-9865-cd55277fd369","Type":"ContainerStarted","Data":"4270c2e410e027a08dbc62412bd429ebcf9973231d78a1d588e9ff58c7080e6f"}
Apr 06 12:22:08 crc kubenswrapper[4790]: I0406 12:22:08.860604 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cd7a2b3-4c64-4d09-9865-cd55277fd369","Type":"ContainerStarted","Data":"bfdd20ff818d1483882237f963ffc87363a788270790b377e90476600a7a3838"}
Apr 06 12:22:08 crc kubenswrapper[4790]: I0406 12:22:08.862806 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8fa33d66-ad99-4650-bc60-a97e16cbd064","Type":"ContainerStarted","Data":"a8864e5e247eec6f9c590758ae35afd1531f5338e42b5dfa599e680920febc82"}
Apr 06 12:22:08 crc kubenswrapper[4790]: I0406 12:22:08.862854 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8fa33d66-ad99-4650-bc60-a97e16cbd064","Type":"ContainerStarted","Data":"ee25885d636578d3ac05d3e8c54e7f37d69d353d0acc0c8af488d1aaa01952ef"}
Apr 06 12:22:08 crc kubenswrapper[4790]: I0406 12:22:08.900437 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.900412239 podStartE2EDuration="2.900412239s" podCreationTimestamp="2026-04-06 12:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:22:08.883967687 +0000 UTC m=+1507.871710583" watchObservedRunningTime="2026-04-06 12:22:08.900412239 +0000 UTC m=+1507.888155145"
Apr 06 12:22:08 crc kubenswrapper[4790]: I0406 12:22:08.920669 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.920622908 podStartE2EDuration="2.920622908s" podCreationTimestamp="2026-04-06 12:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:22:08.906265586 +0000 UTC m=+1507.894008452" watchObservedRunningTime="2026-04-06 12:22:08.920622908 +0000 UTC m=+1507.908365774"
Apr 06 12:22:09 crc kubenswrapper[4790]: I0406 12:22:09.753648 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 06 12:22:09 crc kubenswrapper[4790]: I0406 12:22:09.754035 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 06 12:22:09 crc kubenswrapper[4790]: I0406 12:22:09.754082 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t"
Apr 06 12:22:09 crc kubenswrapper[4790]: I0406 12:22:09.754893 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b2372d9f6a301cd0dfdb4fc8fc1d01c6ccf4e782967660b9a51b1c3386dff103"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Apr 06 12:22:09 crc kubenswrapper[4790]: I0406 12:22:09.754949 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://b2372d9f6a301cd0dfdb4fc8fc1d01c6ccf4e782967660b9a51b1c3386dff103" gracePeriod=600
Apr 06 12:22:09 crc kubenswrapper[4790]: I0406 12:22:09.895158 4790 generic.go:334] "Generic (PLEG): container finished" podID="c8fc7d56-62a8-407c-b535-d60c6e81feec" containerID="e26c6b50662b762b9ed32c0d48679d0213aa17324b3ff42f9151e93533835cd6" exitCode=0
Apr 06 12:22:09 crc kubenswrapper[4790]: I0406 12:22:09.895211 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c8fc7d56-62a8-407c-b535-d60c6e81feec","Type":"ContainerDied","Data":"e26c6b50662b762b9ed32c0d48679d0213aa17324b3ff42f9151e93533835cd6"}
Apr 06 12:22:09 crc kubenswrapper[4790]: I0406 12:22:09.898250 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="b2372d9f6a301cd0dfdb4fc8fc1d01c6ccf4e782967660b9a51b1c3386dff103" exitCode=0
Apr 06 12:22:09 crc kubenswrapper[4790]: I0406 12:22:09.898336 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"b2372d9f6a301cd0dfdb4fc8fc1d01c6ccf4e782967660b9a51b1c3386dff103"}
Apr 06 12:22:09 crc kubenswrapper[4790]: I0406 12:22:09.898388 4790 scope.go:117] "RemoveContainer" containerID="cff05230f5546b5cc2b0d23c73b2cbf6f7260f5baa2011041047c1d9270e1dc6"
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.041838 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.195490 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8fc7d56-62a8-407c-b535-d60c6e81feec-config-data\") pod \"c8fc7d56-62a8-407c-b535-d60c6e81feec\" (UID: \"c8fc7d56-62a8-407c-b535-d60c6e81feec\") "
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.195581 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8fc7d56-62a8-407c-b535-d60c6e81feec-combined-ca-bundle\") pod \"c8fc7d56-62a8-407c-b535-d60c6e81feec\" (UID: \"c8fc7d56-62a8-407c-b535-d60c6e81feec\") "
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.195702 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd6x9\" (UniqueName: \"kubernetes.io/projected/c8fc7d56-62a8-407c-b535-d60c6e81feec-kube-api-access-sd6x9\") pod \"c8fc7d56-62a8-407c-b535-d60c6e81feec\" (UID: \"c8fc7d56-62a8-407c-b535-d60c6e81feec\") "
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.204410 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8fc7d56-62a8-407c-b535-d60c6e81feec-kube-api-access-sd6x9" (OuterVolumeSpecName: "kube-api-access-sd6x9") pod "c8fc7d56-62a8-407c-b535-d60c6e81feec" (UID: "c8fc7d56-62a8-407c-b535-d60c6e81feec"). InnerVolumeSpecName "kube-api-access-sd6x9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.229558 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8fc7d56-62a8-407c-b535-d60c6e81feec-config-data" (OuterVolumeSpecName: "config-data") pod "c8fc7d56-62a8-407c-b535-d60c6e81feec" (UID: "c8fc7d56-62a8-407c-b535-d60c6e81feec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.234092 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8fc7d56-62a8-407c-b535-d60c6e81feec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8fc7d56-62a8-407c-b535-d60c6e81feec" (UID: "c8fc7d56-62a8-407c-b535-d60c6e81feec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.297897 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8fc7d56-62a8-407c-b535-d60c6e81feec-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.297925 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd6x9\" (UniqueName: \"kubernetes.io/projected/c8fc7d56-62a8-407c-b535-d60c6e81feec-kube-api-access-sd6x9\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.297938 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8fc7d56-62a8-407c-b535-d60c6e81feec-config-data\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.910065 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c8fc7d56-62a8-407c-b535-d60c6e81feec","Type":"ContainerDied","Data":"e976caf28b34152ab37553377a44d9c3df52574980c8f158fab0e098371caaed"}
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.910143 4790 scope.go:117] "RemoveContainer" containerID="e26c6b50662b762b9ed32c0d48679d0213aa17324b3ff42f9151e93533835cd6"
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.910142 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.913385 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435"}
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.967203 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.981104 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.992116 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Apr 06 12:22:10 crc kubenswrapper[4790]: E0406 12:22:10.992578 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8fc7d56-62a8-407c-b535-d60c6e81feec" containerName="nova-scheduler-scheduler"
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.992592 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8fc7d56-62a8-407c-b535-d60c6e81feec" containerName="nova-scheduler-scheduler"
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.992786 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8fc7d56-62a8-407c-b535-d60c6e81feec" containerName="nova-scheduler-scheduler"
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.993488 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Apr 06 12:22:10 crc kubenswrapper[4790]: I0406 12:22:10.995582 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.003325 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.123292 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73b707d-e57e-4f4c-a253-38e55128a1b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a73b707d-e57e-4f4c-a253-38e55128a1b2\") " pod="openstack/nova-scheduler-0"
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.123580 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73b707d-e57e-4f4c-a253-38e55128a1b2-config-data\") pod \"nova-scheduler-0\" (UID: \"a73b707d-e57e-4f4c-a253-38e55128a1b2\") " pod="openstack/nova-scheduler-0"
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.123655 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66jr6\" (UniqueName: \"kubernetes.io/projected/a73b707d-e57e-4f4c-a253-38e55128a1b2-kube-api-access-66jr6\") pod \"nova-scheduler-0\" (UID: \"a73b707d-e57e-4f4c-a253-38e55128a1b2\") " pod="openstack/nova-scheduler-0"
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.225687 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73b707d-e57e-4f4c-a253-38e55128a1b2-config-data\") pod \"nova-scheduler-0\" (UID: \"a73b707d-e57e-4f4c-a253-38e55128a1b2\") " pod="openstack/nova-scheduler-0"
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.225852 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66jr6\" (UniqueName: \"kubernetes.io/projected/a73b707d-e57e-4f4c-a253-38e55128a1b2-kube-api-access-66jr6\") pod \"nova-scheduler-0\" (UID: \"a73b707d-e57e-4f4c-a253-38e55128a1b2\") " pod="openstack/nova-scheduler-0"
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.225929 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73b707d-e57e-4f4c-a253-38e55128a1b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a73b707d-e57e-4f4c-a253-38e55128a1b2\") " pod="openstack/nova-scheduler-0"
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.231459 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a73b707d-e57e-4f4c-a253-38e55128a1b2-config-data\") pod \"nova-scheduler-0\" (UID: \"a73b707d-e57e-4f4c-a253-38e55128a1b2\") " pod="openstack/nova-scheduler-0"
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.231672 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a73b707d-e57e-4f4c-a253-38e55128a1b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a73b707d-e57e-4f4c-a253-38e55128a1b2\") " pod="openstack/nova-scheduler-0"
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.247187 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66jr6\" (UniqueName: \"kubernetes.io/projected/a73b707d-e57e-4f4c-a253-38e55128a1b2-kube-api-access-66jr6\") pod \"nova-scheduler-0\" (UID: \"a73b707d-e57e-4f4c-a253-38e55128a1b2\") " pod="openstack/nova-scheduler-0"
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.340701 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.384542 4790 scope.go:117] "RemoveContainer" containerID="6bc7ebb49b088808465084372b5a7d7304c3f8ced1fab0a7415fd96383ec1aae"
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.691221 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8fc7d56-62a8-407c-b535-d60c6e81feec" path="/var/lib/kubelet/pods/c8fc7d56-62a8-407c-b535-d60c6e81feec/volumes"
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.809451 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Apr 06 12:22:11 crc kubenswrapper[4790]: W0406 12:22:11.812578 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda73b707d_e57e_4f4c_a253_38e55128a1b2.slice/crio-627277ec83f1cbccb121de970bbd76f0892425b397e42a8f72fc3f42377ec74b WatchSource:0}: Error finding container 627277ec83f1cbccb121de970bbd76f0892425b397e42a8f72fc3f42377ec74b: Status 404 returned error can't find the container with id 627277ec83f1cbccb121de970bbd76f0892425b397e42a8f72fc3f42377ec74b
Apr 06 12:22:11 crc kubenswrapper[4790]: I0406 12:22:11.942967 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a73b707d-e57e-4f4c-a253-38e55128a1b2","Type":"ContainerStarted","Data":"627277ec83f1cbccb121de970bbd76f0892425b397e42a8f72fc3f42377ec74b"}
Apr 06 12:22:12 crc kubenswrapper[4790]: I0406 12:22:12.967474 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a73b707d-e57e-4f4c-a253-38e55128a1b2","Type":"ContainerStarted","Data":"fb741eb64b5fd3d52fdf5af2fbb504f045c2fa8ec62ad79d3402871a0e162d8d"}
Apr 06 12:22:13 crc kubenswrapper[4790]: I0406 12:22:13.000366 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.000344054 podStartE2EDuration="3.000344054s" podCreationTimestamp="2026-04-06 12:22:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:22:12.994550007 +0000 UTC m=+1511.982292883" watchObservedRunningTime="2026-04-06 12:22:13.000344054 +0000 UTC m=+1511.988086920"
Apr 06 12:22:16 crc kubenswrapper[4790]: I0406 12:22:16.341781 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Apr 06 12:22:17 crc kubenswrapper[4790]: I0406 12:22:17.301053 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Apr 06 12:22:17 crc kubenswrapper[4790]: I0406 12:22:17.301111 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Apr 06 12:22:17 crc kubenswrapper[4790]: I0406 12:22:17.308786 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Apr 06 12:22:17 crc kubenswrapper[4790]: I0406 12:22:17.308854 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Apr 06 12:22:18 crc kubenswrapper[4790]: I0406 12:22:18.307423 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2cd7a2b3-4c64-4d09-9865-cd55277fd369" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.249:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Apr 06 12:22:18 crc kubenswrapper[4790]: I0406 12:22:18.326974 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2cd7a2b3-4c64-4d09-9865-cd55277fd369" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.249:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Apr 06 12:22:18 crc kubenswrapper[4790]: I0406 12:22:18.327240 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8fa33d66-ad99-4650-bc60-a97e16cbd064" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.250:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Apr 06 12:22:18 crc kubenswrapper[4790]: I0406 12:22:18.327279 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8fa33d66-ad99-4650-bc60-a97e16cbd064" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.250:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Apr 06 12:22:20 crc kubenswrapper[4790]: I0406 12:22:20.231495 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Apr 06 12:22:21 crc kubenswrapper[4790]: I0406 12:22:21.341754 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Apr 06 12:22:21 crc kubenswrapper[4790]: I0406 12:22:21.373240 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Apr 06 12:22:22 crc kubenswrapper[4790]: I0406 12:22:22.106569 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Apr 06 12:22:25 crc kubenswrapper[4790]: I0406 12:22:25.300132 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Apr 06 12:22:25 crc kubenswrapper[4790]: I0406 12:22:25.300608 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Apr 06 12:22:25 crc kubenswrapper[4790]: I0406 12:22:25.309326 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Apr 06 12:22:25 crc kubenswrapper[4790]: I0406 12:22:25.309736 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Apr 06 12:22:27 crc kubenswrapper[4790]: I0406 12:22:27.311377 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Apr 06 12:22:27 crc kubenswrapper[4790]: I0406 12:22:27.316104 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Apr 06 12:22:27 crc kubenswrapper[4790]: I0406 12:22:27.319269 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Apr 06 12:22:27 crc kubenswrapper[4790]: I0406 12:22:27.323145 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Apr 06 12:22:27 crc kubenswrapper[4790]: I0406 12:22:27.325191 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Apr 06 12:22:27 crc kubenswrapper[4790]: I0406 12:22:27.341359 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Apr 06 12:22:28 crc kubenswrapper[4790]: I0406 12:22:28.144889 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Apr 06 12:22:28 crc kubenswrapper[4790]: I0406 12:22:28.153163 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Apr 06 12:22:36 crc kubenswrapper[4790]: I0406 12:22:36.273462 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Apr 06 12:22:37 crc kubenswrapper[4790]: I0406 12:22:37.114602 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Apr 06 12:22:39 crc kubenswrapper[4790]: I0406 12:22:39.427762 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="75bcb75a-939b-4fda-b4ed-a66707bb16d7" containerName="rabbitmq" containerID="cri-o://8013629b1001f9420c6bf4ffc54cd7f90424750be6e4cdd08476384803c1f248" gracePeriod=604797
Apr 06 12:22:40 crc kubenswrapper[4790]: I0406 12:22:40.395260 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c8b8a855-8cd2-4a9a-b804-c78641506883" containerName="rabbitmq" containerID="cri-o://4bf49d8619032bb410034cb6212688a8dd2963d6cf0bf5adb9c5af6e7e03e9eb" gracePeriod=604797
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.073199 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.221050 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-config-data\") pod \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") "
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.221119 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-tls\") pod \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") "
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.221933 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75bcb75a-939b-4fda-b4ed-a66707bb16d7-erlang-cookie-secret\") pod \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") "
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.221985 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-server-conf\") pod \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") "
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.222291 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75bcb75a-939b-4fda-b4ed-a66707bb16d7-pod-info\") pod \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") "
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.222377 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-plugins\") pod \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") "
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.222772 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "75bcb75a-939b-4fda-b4ed-a66707bb16d7" (UID: "75bcb75a-939b-4fda-b4ed-a66707bb16d7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.222958 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") "
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.222998 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-plugins-conf\") pod \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") "
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.223343 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "75bcb75a-939b-4fda-b4ed-a66707bb16d7" (UID: "75bcb75a-939b-4fda-b4ed-a66707bb16d7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.223368 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-confd\") pod \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") "
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.223399 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-erlang-cookie\") pod \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") "
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.223436 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx8kl\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-kube-api-access-rx8kl\") pod \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\" (UID: \"75bcb75a-939b-4fda-b4ed-a66707bb16d7\") "
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.225364 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "75bcb75a-939b-4fda-b4ed-a66707bb16d7" (UID: "75bcb75a-939b-4fda-b4ed-a66707bb16d7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.226247 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.226267 4790 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-plugins-conf\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.226276 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.226789 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "75bcb75a-939b-4fda-b4ed-a66707bb16d7" (UID: "75bcb75a-939b-4fda-b4ed-a66707bb16d7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.227304 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75bcb75a-939b-4fda-b4ed-a66707bb16d7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "75bcb75a-939b-4fda-b4ed-a66707bb16d7" (UID: "75bcb75a-939b-4fda-b4ed-a66707bb16d7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.228356 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/75bcb75a-939b-4fda-b4ed-a66707bb16d7-pod-info" (OuterVolumeSpecName: "pod-info") pod "75bcb75a-939b-4fda-b4ed-a66707bb16d7" (UID: "75bcb75a-939b-4fda-b4ed-a66707bb16d7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.241377 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-kube-api-access-rx8kl" (OuterVolumeSpecName: "kube-api-access-rx8kl") pod "75bcb75a-939b-4fda-b4ed-a66707bb16d7" (UID: "75bcb75a-939b-4fda-b4ed-a66707bb16d7"). InnerVolumeSpecName "kube-api-access-rx8kl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.241545 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "75bcb75a-939b-4fda-b4ed-a66707bb16d7" (UID: "75bcb75a-939b-4fda-b4ed-a66707bb16d7"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.251285 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-config-data" (OuterVolumeSpecName: "config-data") pod "75bcb75a-939b-4fda-b4ed-a66707bb16d7" (UID: "75bcb75a-939b-4fda-b4ed-a66707bb16d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.277919 4790 generic.go:334] "Generic (PLEG): container finished" podID="75bcb75a-939b-4fda-b4ed-a66707bb16d7" containerID="8013629b1001f9420c6bf4ffc54cd7f90424750be6e4cdd08476384803c1f248" exitCode=0
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.277973 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75bcb75a-939b-4fda-b4ed-a66707bb16d7","Type":"ContainerDied","Data":"8013629b1001f9420c6bf4ffc54cd7f90424750be6e4cdd08476384803c1f248"}
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.277983 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.278006 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75bcb75a-939b-4fda-b4ed-a66707bb16d7","Type":"ContainerDied","Data":"41b5be6ba4b634574a3925e82cf926675350310d9e5119ffcf7f54a9441850f5"}
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.278028 4790 scope.go:117] "RemoveContainer" containerID="8013629b1001f9420c6bf4ffc54cd7f90424750be6e4cdd08476384803c1f248"
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.315927 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-server-conf" (OuterVolumeSpecName: "server-conf") pod "75bcb75a-939b-4fda-b4ed-a66707bb16d7" (UID: "75bcb75a-939b-4fda-b4ed-a66707bb16d7"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.333936 4790 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75bcb75a-939b-4fda-b4ed-a66707bb16d7-pod-info\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.333990 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.334006 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx8kl\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-kube-api-access-rx8kl\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.334020 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-config-data\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.334032 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.334044 4790 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75bcb75a-939b-4fda-b4ed-a66707bb16d7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.334055 4790 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75bcb75a-939b-4fda-b4ed-a66707bb16d7-server-conf\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.362333 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.366893 4790 scope.go:117] "RemoveContainer" containerID="89e7c2013a112850fef605fc76ff072ff42360b8046fdd873c4a3e7e29e22b9d"
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.390921 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "75bcb75a-939b-4fda-b4ed-a66707bb16d7" (UID: "75bcb75a-939b-4fda-b4ed-a66707bb16d7"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.403426 4790 scope.go:117] "RemoveContainer" containerID="8013629b1001f9420c6bf4ffc54cd7f90424750be6e4cdd08476384803c1f248"
Apr 06 12:22:41 crc kubenswrapper[4790]: E0406 12:22:41.403968 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8013629b1001f9420c6bf4ffc54cd7f90424750be6e4cdd08476384803c1f248\": container with ID starting with 8013629b1001f9420c6bf4ffc54cd7f90424750be6e4cdd08476384803c1f248 not found: ID does not exist" containerID="8013629b1001f9420c6bf4ffc54cd7f90424750be6e4cdd08476384803c1f248"
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.404020 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8013629b1001f9420c6bf4ffc54cd7f90424750be6e4cdd08476384803c1f248"} err="failed to get container status \"8013629b1001f9420c6bf4ffc54cd7f90424750be6e4cdd08476384803c1f248\": rpc error: code = NotFound desc = could not find container \"8013629b1001f9420c6bf4ffc54cd7f90424750be6e4cdd08476384803c1f248\": container with ID starting with 8013629b1001f9420c6bf4ffc54cd7f90424750be6e4cdd08476384803c1f248 not found: ID does not exist"
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.404049 4790 scope.go:117] "RemoveContainer" containerID="89e7c2013a112850fef605fc76ff072ff42360b8046fdd873c4a3e7e29e22b9d"
Apr 06 12:22:41 crc kubenswrapper[4790]: E0406 12:22:41.404565 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89e7c2013a112850fef605fc76ff072ff42360b8046fdd873c4a3e7e29e22b9d\": container with ID starting with 89e7c2013a112850fef605fc76ff072ff42360b8046fdd873c4a3e7e29e22b9d not found: ID does not exist" containerID="89e7c2013a112850fef605fc76ff072ff42360b8046fdd873c4a3e7e29e22b9d"
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.404609 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e7c2013a112850fef605fc76ff072ff42360b8046fdd873c4a3e7e29e22b9d"} err="failed to get container status \"89e7c2013a112850fef605fc76ff072ff42360b8046fdd873c4a3e7e29e22b9d\": rpc error: code = NotFound desc = could not find container \"89e7c2013a112850fef605fc76ff072ff42360b8046fdd873c4a3e7e29e22b9d\": container with ID starting with 89e7c2013a112850fef605fc76ff072ff42360b8046fdd873c4a3e7e29e22b9d not found: ID does not exist"
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.435432 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.435467 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75bcb75a-939b-4fda-b4ed-a66707bb16d7-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.630352 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.656038 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.666646 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Apr 06 12:22:41 crc kubenswrapper[4790]: E0406 12:22:41.667067 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75bcb75a-939b-4fda-b4ed-a66707bb16d7" containerName="rabbitmq"
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.667087 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="75bcb75a-939b-4fda-b4ed-a66707bb16d7" containerName="rabbitmq"
Apr 06 12:22:41 crc kubenswrapper[4790]: E0406 12:22:41.667119 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75bcb75a-939b-4fda-b4ed-a66707bb16d7" containerName="setup-container"
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.667130 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="75bcb75a-939b-4fda-b4ed-a66707bb16d7" containerName="setup-container"
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.667334 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="75bcb75a-939b-4fda-b4ed-a66707bb16d7" containerName="rabbitmq"
Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.668354 4790 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.675458 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.675482 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.675542 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.675676 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.675682 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-m6qf8" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.675952 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.676201 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.692040 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75bcb75a-939b-4fda-b4ed-a66707bb16d7" path="/var/lib/kubelet/pods/75bcb75a-939b-4fda-b4ed-a66707bb16d7/volumes" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.709650 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.741778 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d5c35395-30bf-42d7-89e4-d306b4e4cc37-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.741908 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d5c35395-30bf-42d7-89e4-d306b4e4cc37-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.741937 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d5c35395-30bf-42d7-89e4-d306b4e4cc37-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.741979 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d5c35395-30bf-42d7-89e4-d306b4e4cc37-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.742033 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d5c35395-30bf-42d7-89e4-d306b4e4cc37-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.742120 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d5c35395-30bf-42d7-89e4-d306b4e4cc37-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " 
pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.742157 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x22tv\" (UniqueName: \"kubernetes.io/projected/d5c35395-30bf-42d7-89e4-d306b4e4cc37-kube-api-access-x22tv\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.742205 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5c35395-30bf-42d7-89e4-d306b4e4cc37-config-data\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.742233 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.742301 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5c35395-30bf-42d7-89e4-d306b4e4cc37-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.742321 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d5c35395-30bf-42d7-89e4-d306b4e4cc37-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 
12:22:41.844863 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5c35395-30bf-42d7-89e4-d306b4e4cc37-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.844915 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d5c35395-30bf-42d7-89e4-d306b4e4cc37-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.844955 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d5c35395-30bf-42d7-89e4-d306b4e4cc37-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.845009 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d5c35395-30bf-42d7-89e4-d306b4e4cc37-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.845031 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d5c35395-30bf-42d7-89e4-d306b4e4cc37-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.845065 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/d5c35395-30bf-42d7-89e4-d306b4e4cc37-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.845107 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d5c35395-30bf-42d7-89e4-d306b4e4cc37-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.845164 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d5c35395-30bf-42d7-89e4-d306b4e4cc37-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.845208 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x22tv\" (UniqueName: \"kubernetes.io/projected/d5c35395-30bf-42d7-89e4-d306b4e4cc37-kube-api-access-x22tv\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.845240 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5c35395-30bf-42d7-89e4-d306b4e4cc37-config-data\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.845265 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " 
pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.845463 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.846527 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d5c35395-30bf-42d7-89e4-d306b4e4cc37-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.846956 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d5c35395-30bf-42d7-89e4-d306b4e4cc37-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.846985 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d5c35395-30bf-42d7-89e4-d306b4e4cc37-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.847557 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d5c35395-30bf-42d7-89e4-d306b4e4cc37-config-data\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.850946 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/d5c35395-30bf-42d7-89e4-d306b4e4cc37-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.851192 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d5c35395-30bf-42d7-89e4-d306b4e4cc37-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.851916 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d5c35395-30bf-42d7-89e4-d306b4e4cc37-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.865338 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x22tv\" (UniqueName: \"kubernetes.io/projected/d5c35395-30bf-42d7-89e4-d306b4e4cc37-kube-api-access-x22tv\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.875568 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d5c35395-30bf-42d7-89e4-d306b4e4cc37-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.921799 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d5c35395-30bf-42d7-89e4-d306b4e4cc37-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " 
pod="openstack/rabbitmq-server-0" Apr 06 12:22:41 crc kubenswrapper[4790]: I0406 12:22:41.932141 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d5c35395-30bf-42d7-89e4-d306b4e4cc37\") " pod="openstack/rabbitmq-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.027082 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.053989 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.151164 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-config-data\") pod \"c8b8a855-8cd2-4a9a-b804-c78641506883\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.151230 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-plugins-conf\") pod \"c8b8a855-8cd2-4a9a-b804-c78641506883\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.151252 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-server-conf\") pod \"c8b8a855-8cd2-4a9a-b804-c78641506883\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.151317 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2dwm\" (UniqueName: 
\"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-kube-api-access-s2dwm\") pod \"c8b8a855-8cd2-4a9a-b804-c78641506883\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.151356 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8b8a855-8cd2-4a9a-b804-c78641506883-erlang-cookie-secret\") pod \"c8b8a855-8cd2-4a9a-b804-c78641506883\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.151377 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8b8a855-8cd2-4a9a-b804-c78641506883-pod-info\") pod \"c8b8a855-8cd2-4a9a-b804-c78641506883\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.151414 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-erlang-cookie\") pod \"c8b8a855-8cd2-4a9a-b804-c78641506883\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.151434 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-confd\") pod \"c8b8a855-8cd2-4a9a-b804-c78641506883\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.151479 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-tls\") pod \"c8b8a855-8cd2-4a9a-b804-c78641506883\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " Apr 06 12:22:42 crc 
kubenswrapper[4790]: I0406 12:22:42.151507 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-plugins\") pod \"c8b8a855-8cd2-4a9a-b804-c78641506883\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.151594 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"c8b8a855-8cd2-4a9a-b804-c78641506883\" (UID: \"c8b8a855-8cd2-4a9a-b804-c78641506883\") " Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.153845 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c8b8a855-8cd2-4a9a-b804-c78641506883" (UID: "c8b8a855-8cd2-4a9a-b804-c78641506883"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.153886 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c8b8a855-8cd2-4a9a-b804-c78641506883" (UID: "c8b8a855-8cd2-4a9a-b804-c78641506883"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.154570 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c8b8a855-8cd2-4a9a-b804-c78641506883" (UID: "c8b8a855-8cd2-4a9a-b804-c78641506883"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.156754 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c8b8a855-8cd2-4a9a-b804-c78641506883" (UID: "c8b8a855-8cd2-4a9a-b804-c78641506883"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.165035 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c8b8a855-8cd2-4a9a-b804-c78641506883-pod-info" (OuterVolumeSpecName: "pod-info") pod "c8b8a855-8cd2-4a9a-b804-c78641506883" (UID: "c8b8a855-8cd2-4a9a-b804-c78641506883"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.167489 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b8a855-8cd2-4a9a-b804-c78641506883-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c8b8a855-8cd2-4a9a-b804-c78641506883" (UID: "c8b8a855-8cd2-4a9a-b804-c78641506883"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.168435 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-kube-api-access-s2dwm" (OuterVolumeSpecName: "kube-api-access-s2dwm") pod "c8b8a855-8cd2-4a9a-b804-c78641506883" (UID: "c8b8a855-8cd2-4a9a-b804-c78641506883"). InnerVolumeSpecName "kube-api-access-s2dwm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.175497 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "c8b8a855-8cd2-4a9a-b804-c78641506883" (UID: "c8b8a855-8cd2-4a9a-b804-c78641506883"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.201250 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-config-data" (OuterVolumeSpecName: "config-data") pod "c8b8a855-8cd2-4a9a-b804-c78641506883" (UID: "c8b8a855-8cd2-4a9a-b804-c78641506883"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.253728 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-server-conf" (OuterVolumeSpecName: "server-conf") pod "c8b8a855-8cd2-4a9a-b804-c78641506883" (UID: "c8b8a855-8cd2-4a9a-b804-c78641506883"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.253935 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.253960 4790 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-plugins-conf\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.253969 4790 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c8b8a855-8cd2-4a9a-b804-c78641506883-server-conf\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.253978 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2dwm\" (UniqueName: \"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-kube-api-access-s2dwm\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.253986 4790 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c8b8a855-8cd2-4a9a-b804-c78641506883-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.253994 4790 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c8b8a855-8cd2-4a9a-b804-c78641506883-pod-info\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.254003 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 
12:22:42.254012 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.254020 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.254045 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.277214 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c8b8a855-8cd2-4a9a-b804-c78641506883" (UID: "c8b8a855-8cd2-4a9a-b804-c78641506883"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.288706 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.314111 4790 generic.go:334] "Generic (PLEG): container finished" podID="c8b8a855-8cd2-4a9a-b804-c78641506883" containerID="4bf49d8619032bb410034cb6212688a8dd2963d6cf0bf5adb9c5af6e7e03e9eb" exitCode=0 Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.314196 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.314219 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8b8a855-8cd2-4a9a-b804-c78641506883","Type":"ContainerDied","Data":"4bf49d8619032bb410034cb6212688a8dd2963d6cf0bf5adb9c5af6e7e03e9eb"} Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.314436 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c8b8a855-8cd2-4a9a-b804-c78641506883","Type":"ContainerDied","Data":"234ca7ad56e805696dfb4d0628670f15c2961765c587a343becbd710eb66ddd6"} Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.314458 4790 scope.go:117] "RemoveContainer" containerID="4bf49d8619032bb410034cb6212688a8dd2963d6cf0bf5adb9c5af6e7e03e9eb" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.355816 4790 scope.go:117] "RemoveContainer" containerID="78b7293a7efb77c91c086543685b34cfc61985d3c200c40f264a9855e2d510f4" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.356294 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.356372 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c8b8a855-8cd2-4a9a-b804-c78641506883-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.387989 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.402128 4790 scope.go:117] "RemoveContainer" containerID="4bf49d8619032bb410034cb6212688a8dd2963d6cf0bf5adb9c5af6e7e03e9eb" Apr 06 12:22:42 crc kubenswrapper[4790]: E0406 12:22:42.406770 4790 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"4bf49d8619032bb410034cb6212688a8dd2963d6cf0bf5adb9c5af6e7e03e9eb\": container with ID starting with 4bf49d8619032bb410034cb6212688a8dd2963d6cf0bf5adb9c5af6e7e03e9eb not found: ID does not exist" containerID="4bf49d8619032bb410034cb6212688a8dd2963d6cf0bf5adb9c5af6e7e03e9eb" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.406842 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf49d8619032bb410034cb6212688a8dd2963d6cf0bf5adb9c5af6e7e03e9eb"} err="failed to get container status \"4bf49d8619032bb410034cb6212688a8dd2963d6cf0bf5adb9c5af6e7e03e9eb\": rpc error: code = NotFound desc = could not find container \"4bf49d8619032bb410034cb6212688a8dd2963d6cf0bf5adb9c5af6e7e03e9eb\": container with ID starting with 4bf49d8619032bb410034cb6212688a8dd2963d6cf0bf5adb9c5af6e7e03e9eb not found: ID does not exist" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.406876 4790 scope.go:117] "RemoveContainer" containerID="78b7293a7efb77c91c086543685b34cfc61985d3c200c40f264a9855e2d510f4" Apr 06 12:22:42 crc kubenswrapper[4790]: E0406 12:22:42.407710 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78b7293a7efb77c91c086543685b34cfc61985d3c200c40f264a9855e2d510f4\": container with ID starting with 78b7293a7efb77c91c086543685b34cfc61985d3c200c40f264a9855e2d510f4 not found: ID does not exist" containerID="78b7293a7efb77c91c086543685b34cfc61985d3c200c40f264a9855e2d510f4" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.407759 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b7293a7efb77c91c086543685b34cfc61985d3c200c40f264a9855e2d510f4"} err="failed to get container status \"78b7293a7efb77c91c086543685b34cfc61985d3c200c40f264a9855e2d510f4\": rpc error: code = NotFound desc = could not find container 
\"78b7293a7efb77c91c086543685b34cfc61985d3c200c40f264a9855e2d510f4\": container with ID starting with 78b7293a7efb77c91c086543685b34cfc61985d3c200c40f264a9855e2d510f4 not found: ID does not exist" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.408019 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.429695 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Apr 06 12:22:42 crc kubenswrapper[4790]: E0406 12:22:42.430155 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b8a855-8cd2-4a9a-b804-c78641506883" containerName="setup-container" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.430170 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b8a855-8cd2-4a9a-b804-c78641506883" containerName="setup-container" Apr 06 12:22:42 crc kubenswrapper[4790]: E0406 12:22:42.430196 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b8a855-8cd2-4a9a-b804-c78641506883" containerName="rabbitmq" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.430202 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b8a855-8cd2-4a9a-b804-c78641506883" containerName="rabbitmq" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.430372 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8b8a855-8cd2-4a9a-b804-c78641506883" containerName="rabbitmq" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.431430 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.439518 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.439563 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.439571 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.440141 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.440275 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f9sl4" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.440418 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.440519 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.441514 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.563078 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st8sw\" (UniqueName: \"kubernetes.io/projected/ecd623d8-83f3-46b0-b566-b9801c44dfc8-kube-api-access-st8sw\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.563208 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ecd623d8-83f3-46b0-b566-b9801c44dfc8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.563251 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ecd623d8-83f3-46b0-b566-b9801c44dfc8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.563322 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.563457 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ecd623d8-83f3-46b0-b566-b9801c44dfc8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.563495 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ecd623d8-83f3-46b0-b566-b9801c44dfc8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.563545 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ecd623d8-83f3-46b0-b566-b9801c44dfc8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.563695 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ecd623d8-83f3-46b0-b566-b9801c44dfc8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.563999 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ecd623d8-83f3-46b0-b566-b9801c44dfc8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.564100 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ecd623d8-83f3-46b0-b566-b9801c44dfc8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.564149 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ecd623d8-83f3-46b0-b566-b9801c44dfc8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.614799 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Apr 06 12:22:42 
crc kubenswrapper[4790]: I0406 12:22:42.666373 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ecd623d8-83f3-46b0-b566-b9801c44dfc8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.666445 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ecd623d8-83f3-46b0-b566-b9801c44dfc8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.666470 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ecd623d8-83f3-46b0-b566-b9801c44dfc8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.666493 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ecd623d8-83f3-46b0-b566-b9801c44dfc8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.666540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st8sw\" (UniqueName: \"kubernetes.io/projected/ecd623d8-83f3-46b0-b566-b9801c44dfc8-kube-api-access-st8sw\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.666569 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ecd623d8-83f3-46b0-b566-b9801c44dfc8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.666583 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ecd623d8-83f3-46b0-b566-b9801c44dfc8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.666615 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.666666 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ecd623d8-83f3-46b0-b566-b9801c44dfc8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.666686 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ecd623d8-83f3-46b0-b566-b9801c44dfc8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.666708 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/ecd623d8-83f3-46b0-b566-b9801c44dfc8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.667480 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ecd623d8-83f3-46b0-b566-b9801c44dfc8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.668089 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ecd623d8-83f3-46b0-b566-b9801c44dfc8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.668336 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ecd623d8-83f3-46b0-b566-b9801c44dfc8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.668871 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ecd623d8-83f3-46b0-b566-b9801c44dfc8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.669218 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ecd623d8-83f3-46b0-b566-b9801c44dfc8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.669297 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.672038 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ecd623d8-83f3-46b0-b566-b9801c44dfc8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.676668 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ecd623d8-83f3-46b0-b566-b9801c44dfc8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.676738 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ecd623d8-83f3-46b0-b566-b9801c44dfc8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.677779 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ecd623d8-83f3-46b0-b566-b9801c44dfc8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: 
I0406 12:22:42.687179 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st8sw\" (UniqueName: \"kubernetes.io/projected/ecd623d8-83f3-46b0-b566-b9801c44dfc8-kube-api-access-st8sw\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.711866 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ecd623d8-83f3-46b0-b566-b9801c44dfc8\") " pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:42 crc kubenswrapper[4790]: I0406 12:22:42.760167 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:22:43 crc kubenswrapper[4790]: I0406 12:22:43.276920 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Apr 06 12:22:43 crc kubenswrapper[4790]: W0406 12:22:43.284648 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd623d8_83f3_46b0_b566_b9801c44dfc8.slice/crio-0f133cfc8e61d4b829750823a72cd28680ad56334d254eafb31314d18663d96b WatchSource:0}: Error finding container 0f133cfc8e61d4b829750823a72cd28680ad56334d254eafb31314d18663d96b: Status 404 returned error can't find the container with id 0f133cfc8e61d4b829750823a72cd28680ad56334d254eafb31314d18663d96b Apr 06 12:22:43 crc kubenswrapper[4790]: I0406 12:22:43.326634 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d5c35395-30bf-42d7-89e4-d306b4e4cc37","Type":"ContainerStarted","Data":"5c87a781703e4a900fd97b850e8db49e5c93bde4815da351f1db120272cd7dbf"} Apr 06 12:22:43 crc kubenswrapper[4790]: I0406 12:22:43.327845 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ecd623d8-83f3-46b0-b566-b9801c44dfc8","Type":"ContainerStarted","Data":"0f133cfc8e61d4b829750823a72cd28680ad56334d254eafb31314d18663d96b"} Apr 06 12:22:43 crc kubenswrapper[4790]: I0406 12:22:43.687514 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8b8a855-8cd2-4a9a-b804-c78641506883" path="/var/lib/kubelet/pods/c8b8a855-8cd2-4a9a-b804-c78641506883/volumes" Apr 06 12:22:45 crc kubenswrapper[4790]: I0406 12:22:45.349457 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d5c35395-30bf-42d7-89e4-d306b4e4cc37","Type":"ContainerStarted","Data":"379e2b4c11338ea318f45eb865246a0d870887c8d231338b73efb2a9699ad3e8"} Apr 06 12:22:45 crc kubenswrapper[4790]: I0406 12:22:45.353867 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ecd623d8-83f3-46b0-b566-b9801c44dfc8","Type":"ContainerStarted","Data":"f1e5323924411460f97acf14111e5dd16628db0a2c97371238050aa855b94351"} Apr 06 12:22:50 crc kubenswrapper[4790]: I0406 12:22:50.863692 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f76f6d955-vfhkc"] Apr 06 12:22:50 crc kubenswrapper[4790]: I0406 12:22:50.865903 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:50 crc kubenswrapper[4790]: I0406 12:22:50.867905 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Apr 06 12:22:50 crc kubenswrapper[4790]: I0406 12:22:50.873582 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f76f6d955-vfhkc"] Apr 06 12:22:50 crc kubenswrapper[4790]: I0406 12:22:50.936668 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-dns-svc\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:50 crc kubenswrapper[4790]: I0406 12:22:50.936741 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-dns-swift-storage-0\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:50 crc kubenswrapper[4790]: I0406 12:22:50.936773 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-config\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:50 crc kubenswrapper[4790]: I0406 12:22:50.937123 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-ovsdbserver-nb\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " 
pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:50 crc kubenswrapper[4790]: I0406 12:22:50.937255 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-ovsdbserver-sb\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:50 crc kubenswrapper[4790]: I0406 12:22:50.937345 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxl76\" (UniqueName: \"kubernetes.io/projected/98c6754a-7f55-4fb6-9252-62cd9683c9d6-kube-api-access-mxl76\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:50 crc kubenswrapper[4790]: I0406 12:22:50.937464 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-openstack-edpm-ipam\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.014363 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f76f6d955-vfhkc"] Apr 06 12:22:51 crc kubenswrapper[4790]: E0406 12:22:51.015219 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-mxl76 openstack-edpm-ipam ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" podUID="98c6754a-7f55-4fb6-9252-62cd9683c9d6" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.038933 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-openstack-edpm-ipam\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.039025 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-dns-svc\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.039076 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-dns-swift-storage-0\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.039099 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-config\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.039232 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-ovsdbserver-nb\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.039297 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-ovsdbserver-sb\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.039344 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxl76\" (UniqueName: \"kubernetes.io/projected/98c6754a-7f55-4fb6-9252-62cd9683c9d6-kube-api-access-mxl76\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.040657 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-openstack-edpm-ipam\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.040987 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5db77d59-gpjb9"] Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.041348 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-dns-svc\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.041696 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-config\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.042058 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-ovsdbserver-nb\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.042304 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-dns-swift-storage-0\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.042524 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.042743 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-ovsdbserver-sb\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.076520 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxl76\" (UniqueName: \"kubernetes.io/projected/98c6754a-7f55-4fb6-9252-62cd9683c9d6-kube-api-access-mxl76\") pod \"dnsmasq-dns-f76f6d955-vfhkc\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") " pod="openstack/dnsmasq-dns-f76f6d955-vfhkc" Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.100810 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5db77d59-gpjb9"] Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.141413 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.141521 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-config\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.141539 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-dns-svc\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.141559 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.141588 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hplj9\" (UniqueName: \"kubernetes.io/projected/a01db80b-09ab-4aad-b0c8-6c00e914bd30-kube-api-access-hplj9\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.141623 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-openstack-edpm-ipam\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.141659 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.242942 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hplj9\" (UniqueName: \"kubernetes.io/projected/a01db80b-09ab-4aad-b0c8-6c00e914bd30-kube-api-access-hplj9\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.243014 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-openstack-edpm-ipam\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.243057 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.243105 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.243181 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-config\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.243201 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-dns-svc\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.243222 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.244161 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.244339 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.244341 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.244384 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-config\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.244527 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-dns-svc\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.244619 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-openstack-edpm-ipam\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.265868 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hplj9\" (UniqueName: \"kubernetes.io/projected/a01db80b-09ab-4aad-b0c8-6c00e914bd30-kube-api-access-hplj9\") pod \"dnsmasq-dns-6d5db77d59-gpjb9\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.367020 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.425739 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f76f6d955-vfhkc"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.507291 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f76f6d955-vfhkc"
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.650129 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-dns-swift-storage-0\") pod \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") "
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.650193 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-ovsdbserver-sb\") pod \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") "
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.650240 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-openstack-edpm-ipam\") pod \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") "
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.650371 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-dns-svc\") pod \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") "
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.650447 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-ovsdbserver-nb\") pod \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") "
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.650496 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxl76\" (UniqueName: \"kubernetes.io/projected/98c6754a-7f55-4fb6-9252-62cd9683c9d6-kube-api-access-mxl76\") pod \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") "
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.650529 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-config\") pod \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\" (UID: \"98c6754a-7f55-4fb6-9252-62cd9683c9d6\") "
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.651516 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-config" (OuterVolumeSpecName: "config") pod "98c6754a-7f55-4fb6-9252-62cd9683c9d6" (UID: "98c6754a-7f55-4fb6-9252-62cd9683c9d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.651582 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "98c6754a-7f55-4fb6-9252-62cd9683c9d6" (UID: "98c6754a-7f55-4fb6-9252-62cd9683c9d6"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.651940 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "98c6754a-7f55-4fb6-9252-62cd9683c9d6" (UID: "98c6754a-7f55-4fb6-9252-62cd9683c9d6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.652047 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "98c6754a-7f55-4fb6-9252-62cd9683c9d6" (UID: "98c6754a-7f55-4fb6-9252-62cd9683c9d6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.652485 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "98c6754a-7f55-4fb6-9252-62cd9683c9d6" (UID: "98c6754a-7f55-4fb6-9252-62cd9683c9d6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.652617 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "98c6754a-7f55-4fb6-9252-62cd9683c9d6" (UID: "98c6754a-7f55-4fb6-9252-62cd9683c9d6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.672753 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c6754a-7f55-4fb6-9252-62cd9683c9d6-kube-api-access-mxl76" (OuterVolumeSpecName: "kube-api-access-mxl76") pod "98c6754a-7f55-4fb6-9252-62cd9683c9d6" (UID: "98c6754a-7f55-4fb6-9252-62cd9683c9d6"). InnerVolumeSpecName "kube-api-access-mxl76". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.753994 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.754032 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.754067 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.754082 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-dns-svc\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.754094 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.754106 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxl76\" (UniqueName: \"kubernetes.io/projected/98c6754a-7f55-4fb6-9252-62cd9683c9d6-kube-api-access-mxl76\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.754118 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c6754a-7f55-4fb6-9252-62cd9683c9d6-config\") on node \"crc\" DevicePath \"\""
Apr 06 12:22:51 crc kubenswrapper[4790]: I0406 12:22:51.849117 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5db77d59-gpjb9"]
Apr 06 12:22:52 crc kubenswrapper[4790]: I0406 12:22:52.437463 4790 generic.go:334] "Generic (PLEG): container finished" podID="a01db80b-09ab-4aad-b0c8-6c00e914bd30" containerID="3e9c93db578ede254056a8e75530ce354c15eee08a8e3b3b0c647e37fa301810" exitCode=0
Apr 06 12:22:52 crc kubenswrapper[4790]: I0406 12:22:52.437606 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9" event={"ID":"a01db80b-09ab-4aad-b0c8-6c00e914bd30","Type":"ContainerDied","Data":"3e9c93db578ede254056a8e75530ce354c15eee08a8e3b3b0c647e37fa301810"}
Apr 06 12:22:52 crc kubenswrapper[4790]: I0406 12:22:52.437728 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9" event={"ID":"a01db80b-09ab-4aad-b0c8-6c00e914bd30","Type":"ContainerStarted","Data":"b78028e15031122c0e54b4c232ca580b93d9accf1e9977fdd0a68e52283b7155"}
Apr 06 12:22:52 crc kubenswrapper[4790]: I0406 12:22:52.438224 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f76f6d955-vfhkc"
Apr 06 12:22:52 crc kubenswrapper[4790]: I0406 12:22:52.568363 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f76f6d955-vfhkc"]
Apr 06 12:22:52 crc kubenswrapper[4790]: I0406 12:22:52.593568 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f76f6d955-vfhkc"]
Apr 06 12:22:53 crc kubenswrapper[4790]: I0406 12:22:53.448336 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9" event={"ID":"a01db80b-09ab-4aad-b0c8-6c00e914bd30","Type":"ContainerStarted","Data":"e8382521ef0960d2c3e6e9e90b6da1a317368cb9ac7867c7a0eb63b45120b1a8"}
Apr 06 12:22:53 crc kubenswrapper[4790]: I0406 12:22:53.449026 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:22:53 crc kubenswrapper[4790]: I0406 12:22:53.498501 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9" podStartSLOduration=2.498480204 podStartE2EDuration="2.498480204s" podCreationTimestamp="2026-04-06 12:22:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:22:53.482960349 +0000 UTC m=+1552.470703205" watchObservedRunningTime="2026-04-06 12:22:53.498480204 +0000 UTC m=+1552.486223070"
Apr 06 12:22:53 crc kubenswrapper[4790]: I0406 12:22:53.686704 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c6754a-7f55-4fb6-9252-62cd9683c9d6" path="/var/lib/kubelet/pods/98c6754a-7f55-4fb6-9252-62cd9683c9d6/volumes"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.369367 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.431818 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f587c5df-58dj2"]
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.534972 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77f587c5df-58dj2" podUID="5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" containerName="dnsmasq-dns" containerID="cri-o://749399c9102c9ee5e729b14de23d3ef968b649c59a77e08645c4bd8dca18a1d0" gracePeriod=10
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.629566 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff4b9c47-92mqr"]
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.631437 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.743869 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff4b9c47-92mqr"]
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.793091 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.793163 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.793195 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-config\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.793224 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-openstack-edpm-ipam\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.793251 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-dns-svc\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.793368 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.793527 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p848\" (UniqueName: \"kubernetes.io/projected/13316504-6091-438b-8386-eb10fd6c7ce4-kube-api-access-9p848\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.897627 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.897714 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.897761 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-config\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.898793 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.898807 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-openstack-edpm-ipam\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.898886 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-config\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.898934 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-openstack-edpm-ipam\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.898990 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-dns-svc\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.899293 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.899706 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-dns-svc\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.899903 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.899973 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p848\" (UniqueName: \"kubernetes.io/projected/13316504-6091-438b-8386-eb10fd6c7ce4-kube-api-access-9p848\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.900539 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13316504-6091-438b-8386-eb10fd6c7ce4-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:01 crc kubenswrapper[4790]: I0406 12:23:01.944597 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p848\" (UniqueName: \"kubernetes.io/projected/13316504-6091-438b-8386-eb10fd6c7ce4-kube-api-access-9p848\") pod \"dnsmasq-dns-85ff4b9c47-92mqr\" (UID: \"13316504-6091-438b-8386-eb10fd6c7ce4\") " pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.041164 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.190056 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77f587c5df-58dj2"
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.216409 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-dns-svc\") pod \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") "
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.216689 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-dns-swift-storage-0\") pod \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") "
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.313254 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" (UID: "5046e8d2-1b4d-4243-aa21-7d442bd7a5b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.317806 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" (UID: "5046e8d2-1b4d-4243-aa21-7d442bd7a5b6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.318960 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-ovsdbserver-nb\") pod \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") "
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.319171 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-config\") pod \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") "
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.319233 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr7hm\" (UniqueName: \"kubernetes.io/projected/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-kube-api-access-hr7hm\") pod \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") "
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.319291 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-ovsdbserver-sb\") pod \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\" (UID: \"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6\") "
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.319740 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-dns-svc\") on node \"crc\" DevicePath \"\""
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.319756 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.324006 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-kube-api-access-hr7hm" (OuterVolumeSpecName: "kube-api-access-hr7hm") pod "5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" (UID: "5046e8d2-1b4d-4243-aa21-7d442bd7a5b6"). InnerVolumeSpecName "kube-api-access-hr7hm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.375313 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" (UID: "5046e8d2-1b4d-4243-aa21-7d442bd7a5b6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.389233 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-config" (OuterVolumeSpecName: "config") pod "5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" (UID: "5046e8d2-1b4d-4243-aa21-7d442bd7a5b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.390841 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" (UID: "5046e8d2-1b4d-4243-aa21-7d442bd7a5b6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.421653 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-config\") on node \"crc\" DevicePath \"\""
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.421712 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr7hm\" (UniqueName: \"kubernetes.io/projected/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-kube-api-access-hr7hm\") on node \"crc\" DevicePath \"\""
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.421727 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.421739 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.567549 4790 generic.go:334] "Generic (PLEG): container finished" podID="5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" containerID="749399c9102c9ee5e729b14de23d3ef968b649c59a77e08645c4bd8dca18a1d0" exitCode=0
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.568411 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77f587c5df-58dj2"
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.568448 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f587c5df-58dj2" event={"ID":"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6","Type":"ContainerDied","Data":"749399c9102c9ee5e729b14de23d3ef968b649c59a77e08645c4bd8dca18a1d0"}
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.570483 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f587c5df-58dj2" event={"ID":"5046e8d2-1b4d-4243-aa21-7d442bd7a5b6","Type":"ContainerDied","Data":"c4d9fd32a4dcda9b291725318eee0caed5d53fc158463570f0b2102c2b33d41a"}
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.570512 4790 scope.go:117] "RemoveContainer" containerID="749399c9102c9ee5e729b14de23d3ef968b649c59a77e08645c4bd8dca18a1d0"
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.591863 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff4b9c47-92mqr"]
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.614067 4790 scope.go:117] "RemoveContainer" containerID="0729db78e9eb9eede9d8e89e0d3b3f9008e36c5a9e4e2cb5dc9cce29c8e28c26"
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.615320 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f587c5df-58dj2"]
Apr 06 12:23:02 crc kubenswrapper[4790]: W0406 12:23:02.619513 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13316504_6091_438b_8386_eb10fd6c7ce4.slice/crio-ca14044f5d2c33cdea67376893c3d291bca53d086dd20adbab423c13bfc94f7f WatchSource:0}: Error finding container ca14044f5d2c33cdea67376893c3d291bca53d086dd20adbab423c13bfc94f7f: Status 404 returned error can't find the container with id ca14044f5d2c33cdea67376893c3d291bca53d086dd20adbab423c13bfc94f7f
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.628126 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77f587c5df-58dj2"]
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.636422 4790 scope.go:117] "RemoveContainer" containerID="749399c9102c9ee5e729b14de23d3ef968b649c59a77e08645c4bd8dca18a1d0"
Apr 06 12:23:02 crc kubenswrapper[4790]: E0406 12:23:02.638408 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"749399c9102c9ee5e729b14de23d3ef968b649c59a77e08645c4bd8dca18a1d0\": container with ID starting with 749399c9102c9ee5e729b14de23d3ef968b649c59a77e08645c4bd8dca18a1d0 not found: ID does not exist" containerID="749399c9102c9ee5e729b14de23d3ef968b649c59a77e08645c4bd8dca18a1d0"
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.638445 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749399c9102c9ee5e729b14de23d3ef968b649c59a77e08645c4bd8dca18a1d0"} err="failed to get container status \"749399c9102c9ee5e729b14de23d3ef968b649c59a77e08645c4bd8dca18a1d0\": rpc error: code = NotFound desc = could not find container \"749399c9102c9ee5e729b14de23d3ef968b649c59a77e08645c4bd8dca18a1d0\": container with ID starting with 749399c9102c9ee5e729b14de23d3ef968b649c59a77e08645c4bd8dca18a1d0 not found: ID does not exist"
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.638474 4790 scope.go:117] "RemoveContainer" containerID="0729db78e9eb9eede9d8e89e0d3b3f9008e36c5a9e4e2cb5dc9cce29c8e28c26"
Apr 06 12:23:02 crc kubenswrapper[4790]: E0406 12:23:02.639140 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0729db78e9eb9eede9d8e89e0d3b3f9008e36c5a9e4e2cb5dc9cce29c8e28c26\": container with ID starting with 0729db78e9eb9eede9d8e89e0d3b3f9008e36c5a9e4e2cb5dc9cce29c8e28c26 not found: ID does not exist" containerID="0729db78e9eb9eede9d8e89e0d3b3f9008e36c5a9e4e2cb5dc9cce29c8e28c26"
Apr 06 12:23:02 crc kubenswrapper[4790]: I0406 12:23:02.639162 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0729db78e9eb9eede9d8e89e0d3b3f9008e36c5a9e4e2cb5dc9cce29c8e28c26"} err="failed to get container status \"0729db78e9eb9eede9d8e89e0d3b3f9008e36c5a9e4e2cb5dc9cce29c8e28c26\": rpc error: code = NotFound desc = could not find container \"0729db78e9eb9eede9d8e89e0d3b3f9008e36c5a9e4e2cb5dc9cce29c8e28c26\": container with ID starting with 0729db78e9eb9eede9d8e89e0d3b3f9008e36c5a9e4e2cb5dc9cce29c8e28c26 not found: ID does not exist"
Apr 06 12:23:03 crc kubenswrapper[4790]: I0406 12:23:03.580851 4790 generic.go:334] "Generic (PLEG): container finished" podID="13316504-6091-438b-8386-eb10fd6c7ce4" containerID="5d5c839e37f2db9822b25fba75e3336b985f06351282cedcc1b9a28702fbfaea" exitCode=0
Apr 06 12:23:03 crc kubenswrapper[4790]: I0406 12:23:03.580955 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr" event={"ID":"13316504-6091-438b-8386-eb10fd6c7ce4","Type":"ContainerDied","Data":"5d5c839e37f2db9822b25fba75e3336b985f06351282cedcc1b9a28702fbfaea"}
Apr 06 12:23:03 crc kubenswrapper[4790]: I0406 12:23:03.581148 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr" event={"ID":"13316504-6091-438b-8386-eb10fd6c7ce4","Type":"ContainerStarted","Data":"ca14044f5d2c33cdea67376893c3d291bca53d086dd20adbab423c13bfc94f7f"}
Apr 06 12:23:03 crc kubenswrapper[4790]: I0406 12:23:03.685716 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" path="/var/lib/kubelet/pods/5046e8d2-1b4d-4243-aa21-7d442bd7a5b6/volumes"
Apr 06 12:23:04 crc kubenswrapper[4790]: I0406 12:23:04.593000 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr"
event={"ID":"13316504-6091-438b-8386-eb10fd6c7ce4","Type":"ContainerStarted","Data":"3731d6dae013b3ccfbc973cb78c5ba91bece254257a0b1f4fe1f230e98d3ccc0"} Apr 06 12:23:04 crc kubenswrapper[4790]: I0406 12:23:04.593259 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr" Apr 06 12:23:04 crc kubenswrapper[4790]: I0406 12:23:04.628890 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr" podStartSLOduration=3.628817831 podStartE2EDuration="3.628817831s" podCreationTimestamp="2026-04-06 12:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:23:04.612108539 +0000 UTC m=+1563.599851395" watchObservedRunningTime="2026-04-06 12:23:04.628817831 +0000 UTC m=+1563.616560717" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.043159 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff4b9c47-92mqr" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.176438 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5db77d59-gpjb9"] Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.176712 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9" podUID="a01db80b-09ab-4aad-b0c8-6c00e914bd30" containerName="dnsmasq-dns" containerID="cri-o://e8382521ef0960d2c3e6e9e90b6da1a317368cb9ac7867c7a0eb63b45120b1a8" gracePeriod=10 Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.677859 4790 generic.go:334] "Generic (PLEG): container finished" podID="a01db80b-09ab-4aad-b0c8-6c00e914bd30" containerID="e8382521ef0960d2c3e6e9e90b6da1a317368cb9ac7867c7a0eb63b45120b1a8" exitCode=0 Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.678119 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9" event={"ID":"a01db80b-09ab-4aad-b0c8-6c00e914bd30","Type":"ContainerDied","Data":"e8382521ef0960d2c3e6e9e90b6da1a317368cb9ac7867c7a0eb63b45120b1a8"} Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.678137 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9" event={"ID":"a01db80b-09ab-4aad-b0c8-6c00e914bd30","Type":"ContainerDied","Data":"b78028e15031122c0e54b4c232ca580b93d9accf1e9977fdd0a68e52283b7155"} Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.678146 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b78028e15031122c0e54b4c232ca580b93d9accf1e9977fdd0a68e52283b7155" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.714875 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.825458 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hplj9\" (UniqueName: \"kubernetes.io/projected/a01db80b-09ab-4aad-b0c8-6c00e914bd30-kube-api-access-hplj9\") pod \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.825563 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-dns-svc\") pod \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.825586 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-dns-swift-storage-0\") pod \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " Apr 06 
12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.825607 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-ovsdbserver-sb\") pod \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.825651 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-openstack-edpm-ipam\") pod \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.825715 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-config\") pod \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.825882 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-ovsdbserver-nb\") pod \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\" (UID: \"a01db80b-09ab-4aad-b0c8-6c00e914bd30\") " Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.836131 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01db80b-09ab-4aad-b0c8-6c00e914bd30-kube-api-access-hplj9" (OuterVolumeSpecName: "kube-api-access-hplj9") pod "a01db80b-09ab-4aad-b0c8-6c00e914bd30" (UID: "a01db80b-09ab-4aad-b0c8-6c00e914bd30"). InnerVolumeSpecName "kube-api-access-hplj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.882513 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a01db80b-09ab-4aad-b0c8-6c00e914bd30" (UID: "a01db80b-09ab-4aad-b0c8-6c00e914bd30"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.882567 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a01db80b-09ab-4aad-b0c8-6c00e914bd30" (UID: "a01db80b-09ab-4aad-b0c8-6c00e914bd30"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.884082 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-config" (OuterVolumeSpecName: "config") pod "a01db80b-09ab-4aad-b0c8-6c00e914bd30" (UID: "a01db80b-09ab-4aad-b0c8-6c00e914bd30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.884505 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a01db80b-09ab-4aad-b0c8-6c00e914bd30" (UID: "a01db80b-09ab-4aad-b0c8-6c00e914bd30"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.903383 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a01db80b-09ab-4aad-b0c8-6c00e914bd30" (UID: "a01db80b-09ab-4aad-b0c8-6c00e914bd30"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.916126 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a01db80b-09ab-4aad-b0c8-6c00e914bd30" (UID: "a01db80b-09ab-4aad-b0c8-6c00e914bd30"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.928938 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.929038 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hplj9\" (UniqueName: \"kubernetes.io/projected/a01db80b-09ab-4aad-b0c8-6c00e914bd30-kube-api-access-hplj9\") on node \"crc\" DevicePath \"\"" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.929106 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-dns-svc\") on node \"crc\" DevicePath \"\"" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.929163 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-dns-swift-storage-0\") on node 
\"crc\" DevicePath \"\"" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.929225 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.929285 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:23:12 crc kubenswrapper[4790]: I0406 12:23:12.929343 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01db80b-09ab-4aad-b0c8-6c00e914bd30-config\") on node \"crc\" DevicePath \"\"" Apr 06 12:23:13 crc kubenswrapper[4790]: I0406 12:23:13.687141 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5db77d59-gpjb9" Apr 06 12:23:13 crc kubenswrapper[4790]: I0406 12:23:13.721912 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5db77d59-gpjb9"] Apr 06 12:23:13 crc kubenswrapper[4790]: I0406 12:23:13.730974 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5db77d59-gpjb9"] Apr 06 12:23:15 crc kubenswrapper[4790]: I0406 12:23:15.697100 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a01db80b-09ab-4aad-b0c8-6c00e914bd30" path="/var/lib/kubelet/pods/a01db80b-09ab-4aad-b0c8-6c00e914bd30/volumes" Apr 06 12:23:16 crc kubenswrapper[4790]: I0406 12:23:16.729663 4790 generic.go:334] "Generic (PLEG): container finished" podID="d5c35395-30bf-42d7-89e4-d306b4e4cc37" containerID="379e2b4c11338ea318f45eb865246a0d870887c8d231338b73efb2a9699ad3e8" exitCode=0 Apr 06 12:23:16 crc kubenswrapper[4790]: I0406 12:23:16.729741 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"d5c35395-30bf-42d7-89e4-d306b4e4cc37","Type":"ContainerDied","Data":"379e2b4c11338ea318f45eb865246a0d870887c8d231338b73efb2a9699ad3e8"} Apr 06 12:23:17 crc kubenswrapper[4790]: I0406 12:23:17.741568 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d5c35395-30bf-42d7-89e4-d306b4e4cc37","Type":"ContainerStarted","Data":"cdf2b08b4692c4ba77bbac3825ab41925ab62cf26c7bfcb17661c7d235a2355e"} Apr 06 12:23:17 crc kubenswrapper[4790]: I0406 12:23:17.742285 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Apr 06 12:23:17 crc kubenswrapper[4790]: I0406 12:23:17.743403 4790 generic.go:334] "Generic (PLEG): container finished" podID="ecd623d8-83f3-46b0-b566-b9801c44dfc8" containerID="f1e5323924411460f97acf14111e5dd16628db0a2c97371238050aa855b94351" exitCode=0 Apr 06 12:23:17 crc kubenswrapper[4790]: I0406 12:23:17.743446 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ecd623d8-83f3-46b0-b566-b9801c44dfc8","Type":"ContainerDied","Data":"f1e5323924411460f97acf14111e5dd16628db0a2c97371238050aa855b94351"} Apr 06 12:23:17 crc kubenswrapper[4790]: I0406 12:23:17.781646 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.781620274 podStartE2EDuration="36.781620274s" podCreationTimestamp="2026-04-06 12:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:23:17.767165255 +0000 UTC m=+1576.754908121" watchObservedRunningTime="2026-04-06 12:23:17.781620274 +0000 UTC m=+1576.769363140" Apr 06 12:23:18 crc kubenswrapper[4790]: I0406 12:23:18.754971 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"ecd623d8-83f3-46b0-b566-b9801c44dfc8","Type":"ContainerStarted","Data":"5cabd036123090c97954212e8a758a966bf8f29dc4a3b0bde8fd2351b3dee15c"} Apr 06 12:23:18 crc kubenswrapper[4790]: I0406 12:23:18.755816 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:23:18 crc kubenswrapper[4790]: I0406 12:23:18.788511 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.788491127 podStartE2EDuration="36.788491127s" podCreationTimestamp="2026-04-06 12:22:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:23:18.779447307 +0000 UTC m=+1577.767190213" watchObservedRunningTime="2026-04-06 12:23:18.788491127 +0000 UTC m=+1577.776233993" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.157104 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9"] Apr 06 12:23:30 crc kubenswrapper[4790]: E0406 12:23:30.158090 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01db80b-09ab-4aad-b0c8-6c00e914bd30" containerName="dnsmasq-dns" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.158106 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01db80b-09ab-4aad-b0c8-6c00e914bd30" containerName="dnsmasq-dns" Apr 06 12:23:30 crc kubenswrapper[4790]: E0406 12:23:30.158120 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" containerName="init" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.158127 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" containerName="init" Apr 06 12:23:30 crc kubenswrapper[4790]: E0406 12:23:30.158164 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" containerName="dnsmasq-dns" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.158174 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" containerName="dnsmasq-dns" Apr 06 12:23:30 crc kubenswrapper[4790]: E0406 12:23:30.158199 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01db80b-09ab-4aad-b0c8-6c00e914bd30" containerName="init" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.158205 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01db80b-09ab-4aad-b0c8-6c00e914bd30" containerName="init" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.158379 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5046e8d2-1b4d-4243-aa21-7d442bd7a5b6" containerName="dnsmasq-dns" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.158393 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a01db80b-09ab-4aad-b0c8-6c00e914bd30" containerName="dnsmasq-dns" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.159098 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.164192 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.164290 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.164460 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.164940 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.176994 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9"] Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.267623 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.267720 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7x5h\" (UniqueName: \"kubernetes.io/projected/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-kube-api-access-g7x5h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:30 crc kubenswrapper[4790]: 
I0406 12:23:30.267786 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.268156 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.370033 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.370165 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.370210 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7x5h\" (UniqueName: 
\"kubernetes.io/projected/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-kube-api-access-g7x5h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.370252 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.377798 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.379243 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.387487 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.390882 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7x5h\" (UniqueName: \"kubernetes.io/projected/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-kube-api-access-g7x5h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:30 crc kubenswrapper[4790]: I0406 12:23:30.489572 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:31 crc kubenswrapper[4790]: I0406 12:23:31.070983 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9"] Apr 06 12:23:31 crc kubenswrapper[4790]: I0406 12:23:31.880753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" event={"ID":"e671a5e8-99cd-4a96-a26f-93ff0eb8980c","Type":"ContainerStarted","Data":"b41576ebf89d4d41600c8b349b4578897f8e1e7ba5a1f6323909e083c0e3cf6b"} Apr 06 12:23:32 crc kubenswrapper[4790]: I0406 12:23:32.030902 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Apr 06 12:23:32 crc kubenswrapper[4790]: I0406 12:23:32.763023 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Apr 06 12:23:41 crc kubenswrapper[4790]: I0406 12:23:41.991232 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" event={"ID":"e671a5e8-99cd-4a96-a26f-93ff0eb8980c","Type":"ContainerStarted","Data":"60d40b375c11cae2da7d8d3d0cc2d84c562ea8075236454761a73fe4cf7a75e8"} Apr 06 12:23:42 crc kubenswrapper[4790]: I0406 12:23:42.013365 
4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" podStartSLOduration=1.733023873 podStartE2EDuration="12.013347845s" podCreationTimestamp="2026-04-06 12:23:30 +0000 UTC" firstStartedPulling="2026-04-06 12:23:31.091536151 +0000 UTC m=+1590.079279017" lastFinishedPulling="2026-04-06 12:23:41.371860113 +0000 UTC m=+1600.359602989" observedRunningTime="2026-04-06 12:23:42.008758649 +0000 UTC m=+1600.996501545" watchObservedRunningTime="2026-04-06 12:23:42.013347845 +0000 UTC m=+1601.001090711" Apr 06 12:23:52 crc kubenswrapper[4790]: I0406 12:23:52.100721 4790 generic.go:334] "Generic (PLEG): container finished" podID="e671a5e8-99cd-4a96-a26f-93ff0eb8980c" containerID="60d40b375c11cae2da7d8d3d0cc2d84c562ea8075236454761a73fe4cf7a75e8" exitCode=0 Apr 06 12:23:52 crc kubenswrapper[4790]: I0406 12:23:52.100860 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" event={"ID":"e671a5e8-99cd-4a96-a26f-93ff0eb8980c","Type":"ContainerDied","Data":"60d40b375c11cae2da7d8d3d0cc2d84c562ea8075236454761a73fe4cf7a75e8"} Apr 06 12:23:53 crc kubenswrapper[4790]: I0406 12:23:53.531786 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:53 crc kubenswrapper[4790]: I0406 12:23:53.651539 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-repo-setup-combined-ca-bundle\") pod \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " Apr 06 12:23:53 crc kubenswrapper[4790]: I0406 12:23:53.651592 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-inventory\") pod \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " Apr 06 12:23:53 crc kubenswrapper[4790]: I0406 12:23:53.651612 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7x5h\" (UniqueName: \"kubernetes.io/projected/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-kube-api-access-g7x5h\") pod \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " Apr 06 12:23:53 crc kubenswrapper[4790]: I0406 12:23:53.651650 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-ssh-key-openstack-edpm-ipam\") pod \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\" (UID: \"e671a5e8-99cd-4a96-a26f-93ff0eb8980c\") " Apr 06 12:23:53 crc kubenswrapper[4790]: I0406 12:23:53.656988 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e671a5e8-99cd-4a96-a26f-93ff0eb8980c" (UID: "e671a5e8-99cd-4a96-a26f-93ff0eb8980c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:23:53 crc kubenswrapper[4790]: I0406 12:23:53.657118 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-kube-api-access-g7x5h" (OuterVolumeSpecName: "kube-api-access-g7x5h") pod "e671a5e8-99cd-4a96-a26f-93ff0eb8980c" (UID: "e671a5e8-99cd-4a96-a26f-93ff0eb8980c"). InnerVolumeSpecName "kube-api-access-g7x5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:23:53 crc kubenswrapper[4790]: I0406 12:23:53.680145 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e671a5e8-99cd-4a96-a26f-93ff0eb8980c" (UID: "e671a5e8-99cd-4a96-a26f-93ff0eb8980c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:23:53 crc kubenswrapper[4790]: I0406 12:23:53.704550 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-inventory" (OuterVolumeSpecName: "inventory") pod "e671a5e8-99cd-4a96-a26f-93ff0eb8980c" (UID: "e671a5e8-99cd-4a96-a26f-93ff0eb8980c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:23:53 crc kubenswrapper[4790]: I0406 12:23:53.755147 4790 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:23:53 crc kubenswrapper[4790]: I0406 12:23:53.755200 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:23:53 crc kubenswrapper[4790]: I0406 12:23:53.755344 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7x5h\" (UniqueName: \"kubernetes.io/projected/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-kube-api-access-g7x5h\") on node \"crc\" DevicePath \"\"" Apr 06 12:23:53 crc kubenswrapper[4790]: I0406 12:23:53.755604 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e671a5e8-99cd-4a96-a26f-93ff0eb8980c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.122484 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" event={"ID":"e671a5e8-99cd-4a96-a26f-93ff0eb8980c","Type":"ContainerDied","Data":"b41576ebf89d4d41600c8b349b4578897f8e1e7ba5a1f6323909e083c0e3cf6b"} Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.122525 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b41576ebf89d4d41600c8b349b4578897f8e1e7ba5a1f6323909e083c0e3cf6b" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.122576 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.207334 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth"] Apr 06 12:23:54 crc kubenswrapper[4790]: E0406 12:23:54.207919 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e671a5e8-99cd-4a96-a26f-93ff0eb8980c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.207941 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e671a5e8-99cd-4a96-a26f-93ff0eb8980c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.208184 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e671a5e8-99cd-4a96-a26f-93ff0eb8980c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.208894 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.210881 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.211269 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.211267 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.214174 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.243304 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth"] Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.277727 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0c26165-8e10-4607-9cce-f36ec74bdc85-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwcth\" (UID: \"c0c26165-8e10-4607-9cce-f36ec74bdc85\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.277899 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c26165-8e10-4607-9cce-f36ec74bdc85-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwcth\" (UID: \"c0c26165-8e10-4607-9cce-f36ec74bdc85\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.277928 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72mc9\" (UniqueName: \"kubernetes.io/projected/c0c26165-8e10-4607-9cce-f36ec74bdc85-kube-api-access-72mc9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwcth\" (UID: \"c0c26165-8e10-4607-9cce-f36ec74bdc85\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.379590 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0c26165-8e10-4607-9cce-f36ec74bdc85-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwcth\" (UID: \"c0c26165-8e10-4607-9cce-f36ec74bdc85\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.379748 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c26165-8e10-4607-9cce-f36ec74bdc85-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwcth\" (UID: \"c0c26165-8e10-4607-9cce-f36ec74bdc85\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.379783 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72mc9\" (UniqueName: \"kubernetes.io/projected/c0c26165-8e10-4607-9cce-f36ec74bdc85-kube-api-access-72mc9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwcth\" (UID: \"c0c26165-8e10-4607-9cce-f36ec74bdc85\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.384526 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c26165-8e10-4607-9cce-f36ec74bdc85-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwcth\" (UID: 
\"c0c26165-8e10-4607-9cce-f36ec74bdc85\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.384618 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0c26165-8e10-4607-9cce-f36ec74bdc85-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwcth\" (UID: \"c0c26165-8e10-4607-9cce-f36ec74bdc85\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.407735 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72mc9\" (UniqueName: \"kubernetes.io/projected/c0c26165-8e10-4607-9cce-f36ec74bdc85-kube-api-access-72mc9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-kwcth\" (UID: \"c0c26165-8e10-4607-9cce-f36ec74bdc85\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" Apr 06 12:23:54 crc kubenswrapper[4790]: I0406 12:23:54.550701 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" Apr 06 12:23:55 crc kubenswrapper[4790]: I0406 12:23:55.093159 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth"] Apr 06 12:23:55 crc kubenswrapper[4790]: I0406 12:23:55.134132 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" event={"ID":"c0c26165-8e10-4607-9cce-f36ec74bdc85","Type":"ContainerStarted","Data":"e42aa8fabce8af181b63a5de4a1f462fde2bc82d3b5179b2de3665dcdb4e0f20"} Apr 06 12:23:56 crc kubenswrapper[4790]: I0406 12:23:56.145646 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" event={"ID":"c0c26165-8e10-4607-9cce-f36ec74bdc85","Type":"ContainerStarted","Data":"30b90900848c4b16e7bb246b4099d2c859e18b94b0648d2563a4a668e9c058c5"} Apr 06 12:23:58 crc kubenswrapper[4790]: I0406 12:23:58.167781 4790 generic.go:334] "Generic (PLEG): container finished" podID="c0c26165-8e10-4607-9cce-f36ec74bdc85" containerID="30b90900848c4b16e7bb246b4099d2c859e18b94b0648d2563a4a668e9c058c5" exitCode=0 Apr 06 12:23:58 crc kubenswrapper[4790]: I0406 12:23:58.167854 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" event={"ID":"c0c26165-8e10-4607-9cce-f36ec74bdc85","Type":"ContainerDied","Data":"30b90900848c4b16e7bb246b4099d2c859e18b94b0648d2563a4a668e9c058c5"} Apr 06 12:23:59 crc kubenswrapper[4790]: I0406 12:23:59.599521 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" Apr 06 12:23:59 crc kubenswrapper[4790]: I0406 12:23:59.705761 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0c26165-8e10-4607-9cce-f36ec74bdc85-ssh-key-openstack-edpm-ipam\") pod \"c0c26165-8e10-4607-9cce-f36ec74bdc85\" (UID: \"c0c26165-8e10-4607-9cce-f36ec74bdc85\") " Apr 06 12:23:59 crc kubenswrapper[4790]: I0406 12:23:59.705872 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72mc9\" (UniqueName: \"kubernetes.io/projected/c0c26165-8e10-4607-9cce-f36ec74bdc85-kube-api-access-72mc9\") pod \"c0c26165-8e10-4607-9cce-f36ec74bdc85\" (UID: \"c0c26165-8e10-4607-9cce-f36ec74bdc85\") " Apr 06 12:23:59 crc kubenswrapper[4790]: I0406 12:23:59.706043 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c26165-8e10-4607-9cce-f36ec74bdc85-inventory\") pod \"c0c26165-8e10-4607-9cce-f36ec74bdc85\" (UID: \"c0c26165-8e10-4607-9cce-f36ec74bdc85\") " Apr 06 12:23:59 crc kubenswrapper[4790]: I0406 12:23:59.711455 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c26165-8e10-4607-9cce-f36ec74bdc85-kube-api-access-72mc9" (OuterVolumeSpecName: "kube-api-access-72mc9") pod "c0c26165-8e10-4607-9cce-f36ec74bdc85" (UID: "c0c26165-8e10-4607-9cce-f36ec74bdc85"). InnerVolumeSpecName "kube-api-access-72mc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:23:59 crc kubenswrapper[4790]: I0406 12:23:59.738349 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c26165-8e10-4607-9cce-f36ec74bdc85-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c0c26165-8e10-4607-9cce-f36ec74bdc85" (UID: "c0c26165-8e10-4607-9cce-f36ec74bdc85"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:23:59 crc kubenswrapper[4790]: I0406 12:23:59.742660 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c26165-8e10-4607-9cce-f36ec74bdc85-inventory" (OuterVolumeSpecName: "inventory") pod "c0c26165-8e10-4607-9cce-f36ec74bdc85" (UID: "c0c26165-8e10-4607-9cce-f36ec74bdc85"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:23:59 crc kubenswrapper[4790]: I0406 12:23:59.816079 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c26165-8e10-4607-9cce-f36ec74bdc85-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:23:59 crc kubenswrapper[4790]: I0406 12:23:59.816180 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0c26165-8e10-4607-9cce-f36ec74bdc85-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:23:59 crc kubenswrapper[4790]: I0406 12:23:59.816209 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72mc9\" (UniqueName: \"kubernetes.io/projected/c0c26165-8e10-4607-9cce-f36ec74bdc85-kube-api-access-72mc9\") on node \"crc\" DevicePath \"\"" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.146511 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591304-7wmpw"] Apr 06 12:24:00 crc kubenswrapper[4790]: E0406 
12:24:00.147040 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c26165-8e10-4607-9cce-f36ec74bdc85" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.147065 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c26165-8e10-4607-9cce-f36ec74bdc85" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.147375 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c26165-8e10-4607-9cce-f36ec74bdc85" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.148269 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591304-7wmpw" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.150914 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.150932 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.150975 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.157289 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591304-7wmpw"] Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.198321 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" event={"ID":"c0c26165-8e10-4607-9cce-f36ec74bdc85","Type":"ContainerDied","Data":"e42aa8fabce8af181b63a5de4a1f462fde2bc82d3b5179b2de3665dcdb4e0f20"} Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.198366 4790 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="e42aa8fabce8af181b63a5de4a1f462fde2bc82d3b5179b2de3665dcdb4e0f20" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.198367 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-kwcth" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.225614 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7799\" (UniqueName: \"kubernetes.io/projected/776dcf99-f1e6-4524-9f96-d4a99a6967bb-kube-api-access-v7799\") pod \"auto-csr-approver-29591304-7wmpw\" (UID: \"776dcf99-f1e6-4524-9f96-d4a99a6967bb\") " pod="openshift-infra/auto-csr-approver-29591304-7wmpw" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.270410 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n"] Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.271809 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.273665 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.273928 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.274661 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.274873 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.284393 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n"] Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.327507 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7799\" (UniqueName: \"kubernetes.io/projected/776dcf99-f1e6-4524-9f96-d4a99a6967bb-kube-api-access-v7799\") pod \"auto-csr-approver-29591304-7wmpw\" (UID: \"776dcf99-f1e6-4524-9f96-d4a99a6967bb\") " pod="openshift-infra/auto-csr-approver-29591304-7wmpw" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.344247 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7799\" (UniqueName: \"kubernetes.io/projected/776dcf99-f1e6-4524-9f96-d4a99a6967bb-kube-api-access-v7799\") pod \"auto-csr-approver-29591304-7wmpw\" (UID: \"776dcf99-f1e6-4524-9f96-d4a99a6967bb\") " pod="openshift-infra/auto-csr-approver-29591304-7wmpw" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.429109 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pgdnw\" (UniqueName: \"kubernetes.io/projected/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-kube-api-access-pgdnw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.429435 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.429500 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.429607 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.468302 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591304-7wmpw" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.531493 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.531654 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.531844 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.532016 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgdnw\" (UniqueName: \"kubernetes.io/projected/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-kube-api-access-pgdnw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.544515 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.545175 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.553055 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.564359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgdnw\" (UniqueName: \"kubernetes.io/projected/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-kube-api-access-pgdnw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.597790 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:24:00 crc kubenswrapper[4790]: I0406 12:24:00.959457 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591304-7wmpw"] Apr 06 12:24:01 crc kubenswrapper[4790]: I0406 12:24:01.184598 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n"] Apr 06 12:24:01 crc kubenswrapper[4790]: W0406 12:24:01.187720 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34fd07c3_5b1c_440c_a0d0_3d9423f40cc8.slice/crio-a30c6da6c3b349e894022f8a429afa8ab5bd9f4660c8c953baa6efd36adba363 WatchSource:0}: Error finding container a30c6da6c3b349e894022f8a429afa8ab5bd9f4660c8c953baa6efd36adba363: Status 404 returned error can't find the container with id a30c6da6c3b349e894022f8a429afa8ab5bd9f4660c8c953baa6efd36adba363 Apr 06 12:24:01 crc kubenswrapper[4790]: I0406 12:24:01.212761 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" event={"ID":"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8","Type":"ContainerStarted","Data":"a30c6da6c3b349e894022f8a429afa8ab5bd9f4660c8c953baa6efd36adba363"} Apr 06 12:24:01 crc kubenswrapper[4790]: I0406 12:24:01.214628 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591304-7wmpw" event={"ID":"776dcf99-f1e6-4524-9f96-d4a99a6967bb","Type":"ContainerStarted","Data":"d2274791c2914dff361fabd63d395cb59cbe07770222023d10cc716109d65a4f"} Apr 06 12:24:01 crc kubenswrapper[4790]: I0406 12:24:01.852477 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:24:02 crc kubenswrapper[4790]: I0406 12:24:02.231208 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" event={"ID":"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8","Type":"ContainerStarted","Data":"0a3aafadda97b8c231f530041b9de3687ef54f1a17273ddcf2da40157df7d700"} Apr 06 12:24:02 crc kubenswrapper[4790]: I0406 12:24:02.233360 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591304-7wmpw" event={"ID":"776dcf99-f1e6-4524-9f96-d4a99a6967bb","Type":"ContainerStarted","Data":"4d8b66177ca0666df9c8c5263de19046d9bc7f91b1cb13fa39dde7fbc12caa30"} Apr 06 12:24:02 crc kubenswrapper[4790]: I0406 12:24:02.258283 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" podStartSLOduration=1.599464665 podStartE2EDuration="2.258258712s" podCreationTimestamp="2026-04-06 12:24:00 +0000 UTC" firstStartedPulling="2026-04-06 12:24:01.190476885 +0000 UTC m=+1620.178219751" lastFinishedPulling="2026-04-06 12:24:01.849270932 +0000 UTC m=+1620.837013798" observedRunningTime="2026-04-06 12:24:02.251099317 +0000 UTC m=+1621.238842183" watchObservedRunningTime="2026-04-06 12:24:02.258258712 +0000 UTC m=+1621.246001588" Apr 06 12:24:02 crc kubenswrapper[4790]: I0406 12:24:02.282288 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591304-7wmpw" podStartSLOduration=1.467556106 podStartE2EDuration="2.282265873s" podCreationTimestamp="2026-04-06 12:24:00 +0000 UTC" firstStartedPulling="2026-04-06 12:24:00.963629572 +0000 UTC m=+1619.951372438" lastFinishedPulling="2026-04-06 12:24:01.778339319 +0000 UTC m=+1620.766082205" observedRunningTime="2026-04-06 12:24:02.27131963 +0000 UTC m=+1621.259062496" watchObservedRunningTime="2026-04-06 12:24:02.282265873 +0000 UTC m=+1621.270008739" Apr 06 12:24:03 crc kubenswrapper[4790]: I0406 12:24:03.246415 4790 generic.go:334] "Generic (PLEG): container finished" podID="776dcf99-f1e6-4524-9f96-d4a99a6967bb" 
containerID="4d8b66177ca0666df9c8c5263de19046d9bc7f91b1cb13fa39dde7fbc12caa30" exitCode=0 Apr 06 12:24:03 crc kubenswrapper[4790]: I0406 12:24:03.246471 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591304-7wmpw" event={"ID":"776dcf99-f1e6-4524-9f96-d4a99a6967bb","Type":"ContainerDied","Data":"4d8b66177ca0666df9c8c5263de19046d9bc7f91b1cb13fa39dde7fbc12caa30"} Apr 06 12:24:04 crc kubenswrapper[4790]: I0406 12:24:04.661329 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591304-7wmpw" Apr 06 12:24:04 crc kubenswrapper[4790]: I0406 12:24:04.723737 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7799\" (UniqueName: \"kubernetes.io/projected/776dcf99-f1e6-4524-9f96-d4a99a6967bb-kube-api-access-v7799\") pod \"776dcf99-f1e6-4524-9f96-d4a99a6967bb\" (UID: \"776dcf99-f1e6-4524-9f96-d4a99a6967bb\") " Apr 06 12:24:04 crc kubenswrapper[4790]: I0406 12:24:04.733012 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/776dcf99-f1e6-4524-9f96-d4a99a6967bb-kube-api-access-v7799" (OuterVolumeSpecName: "kube-api-access-v7799") pod "776dcf99-f1e6-4524-9f96-d4a99a6967bb" (UID: "776dcf99-f1e6-4524-9f96-d4a99a6967bb"). InnerVolumeSpecName "kube-api-access-v7799". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:24:04 crc kubenswrapper[4790]: I0406 12:24:04.826975 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7799\" (UniqueName: \"kubernetes.io/projected/776dcf99-f1e6-4524-9f96-d4a99a6967bb-kube-api-access-v7799\") on node \"crc\" DevicePath \"\"" Apr 06 12:24:04 crc kubenswrapper[4790]: I0406 12:24:04.848170 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591298-z2tn8"] Apr 06 12:24:04 crc kubenswrapper[4790]: I0406 12:24:04.857088 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591298-z2tn8"] Apr 06 12:24:05 crc kubenswrapper[4790]: I0406 12:24:05.273566 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591304-7wmpw" event={"ID":"776dcf99-f1e6-4524-9f96-d4a99a6967bb","Type":"ContainerDied","Data":"d2274791c2914dff361fabd63d395cb59cbe07770222023d10cc716109d65a4f"} Apr 06 12:24:05 crc kubenswrapper[4790]: I0406 12:24:05.273900 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2274791c2914dff361fabd63d395cb59cbe07770222023d10cc716109d65a4f" Apr 06 12:24:05 crc kubenswrapper[4790]: I0406 12:24:05.273621 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591304-7wmpw" Apr 06 12:24:05 crc kubenswrapper[4790]: I0406 12:24:05.691734 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4acf1b39-821a-435f-9943-d6f12210cc0a" path="/var/lib/kubelet/pods/4acf1b39-821a-435f-9943-d6f12210cc0a/volumes" Apr 06 12:24:11 crc kubenswrapper[4790]: I0406 12:24:11.925090 4790 scope.go:117] "RemoveContainer" containerID="86c7a01185401a5d1fb88cae44bb2c787903cba75db32d75487e4db5db97a2aa" Apr 06 12:24:11 crc kubenswrapper[4790]: I0406 12:24:11.995049 4790 scope.go:117] "RemoveContainer" containerID="81621dd202759600beb132e9f91a12681c557b57a6805803182b32c7d7ce37a6" Apr 06 12:24:12 crc kubenswrapper[4790]: I0406 12:24:12.034307 4790 scope.go:117] "RemoveContainer" containerID="5169965d83e273e3915dad78fa6fb59167540e1e30650a2047213c2ac2a15aff" Apr 06 12:24:12 crc kubenswrapper[4790]: I0406 12:24:12.072662 4790 scope.go:117] "RemoveContainer" containerID="3b31fe5a4c7ccb0ce7387369edd8c2a5c32c202903afc461cc1a675c556ddf89" Apr 06 12:24:39 crc kubenswrapper[4790]: I0406 12:24:39.753177 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:24:39 crc kubenswrapper[4790]: I0406 12:24:39.753746 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:25:09 crc kubenswrapper[4790]: I0406 12:25:09.753115 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:25:09 crc kubenswrapper[4790]: I0406 12:25:09.753639 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:25:12 crc kubenswrapper[4790]: I0406 12:25:12.231503 4790 scope.go:117] "RemoveContainer" containerID="62d024d1e1a7544111c81c038d7f9c09e5dcaa97d0b467a41e26e87be1ea8fb2" Apr 06 12:25:39 crc kubenswrapper[4790]: I0406 12:25:39.753679 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:25:39 crc kubenswrapper[4790]: I0406 12:25:39.754244 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:25:39 crc kubenswrapper[4790]: I0406 12:25:39.754290 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 12:25:39 crc kubenswrapper[4790]: I0406 12:25:39.755301 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435"} 
pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 12:25:39 crc kubenswrapper[4790]: I0406 12:25:39.755368 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" gracePeriod=600 Apr 06 12:25:39 crc kubenswrapper[4790]: E0406 12:25:39.877271 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:25:40 crc kubenswrapper[4790]: I0406 12:25:40.390251 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" exitCode=0 Apr 06 12:25:40 crc kubenswrapper[4790]: I0406 12:25:40.390286 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435"} Apr 06 12:25:40 crc kubenswrapper[4790]: I0406 12:25:40.390363 4790 scope.go:117] "RemoveContainer" containerID="b2372d9f6a301cd0dfdb4fc8fc1d01c6ccf4e782967660b9a51b1c3386dff103" Apr 06 12:25:40 crc kubenswrapper[4790]: I0406 12:25:40.391390 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 
06 12:25:40 crc kubenswrapper[4790]: E0406 12:25:40.391923 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:25:51 crc kubenswrapper[4790]: I0406 12:25:51.690312 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:25:51 crc kubenswrapper[4790]: E0406 12:25:51.691249 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:26:00 crc kubenswrapper[4790]: I0406 12:26:00.144821 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591306-khl4s"] Apr 06 12:26:00 crc kubenswrapper[4790]: E0406 12:26:00.146098 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="776dcf99-f1e6-4524-9f96-d4a99a6967bb" containerName="oc" Apr 06 12:26:00 crc kubenswrapper[4790]: I0406 12:26:00.146122 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="776dcf99-f1e6-4524-9f96-d4a99a6967bb" containerName="oc" Apr 06 12:26:00 crc kubenswrapper[4790]: I0406 12:26:00.146454 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="776dcf99-f1e6-4524-9f96-d4a99a6967bb" containerName="oc" Apr 06 12:26:00 crc kubenswrapper[4790]: I0406 12:26:00.147360 4790 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591306-khl4s" Apr 06 12:26:00 crc kubenswrapper[4790]: I0406 12:26:00.150388 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:26:00 crc kubenswrapper[4790]: I0406 12:26:00.151288 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:26:00 crc kubenswrapper[4790]: I0406 12:26:00.153215 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:26:00 crc kubenswrapper[4790]: I0406 12:26:00.154009 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591306-khl4s"] Apr 06 12:26:00 crc kubenswrapper[4790]: I0406 12:26:00.343208 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg62n\" (UniqueName: \"kubernetes.io/projected/5245510f-0d96-4f12-95ea-d65141bdd2e0-kube-api-access-qg62n\") pod \"auto-csr-approver-29591306-khl4s\" (UID: \"5245510f-0d96-4f12-95ea-d65141bdd2e0\") " pod="openshift-infra/auto-csr-approver-29591306-khl4s" Apr 06 12:26:00 crc kubenswrapper[4790]: I0406 12:26:00.445256 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg62n\" (UniqueName: \"kubernetes.io/projected/5245510f-0d96-4f12-95ea-d65141bdd2e0-kube-api-access-qg62n\") pod \"auto-csr-approver-29591306-khl4s\" (UID: \"5245510f-0d96-4f12-95ea-d65141bdd2e0\") " pod="openshift-infra/auto-csr-approver-29591306-khl4s" Apr 06 12:26:00 crc kubenswrapper[4790]: I0406 12:26:00.479699 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg62n\" (UniqueName: \"kubernetes.io/projected/5245510f-0d96-4f12-95ea-d65141bdd2e0-kube-api-access-qg62n\") pod \"auto-csr-approver-29591306-khl4s\" (UID: \"5245510f-0d96-4f12-95ea-d65141bdd2e0\") " 
pod="openshift-infra/auto-csr-approver-29591306-khl4s" Apr 06 12:26:00 crc kubenswrapper[4790]: I0406 12:26:00.765160 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591306-khl4s" Apr 06 12:26:01 crc kubenswrapper[4790]: I0406 12:26:01.238403 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591306-khl4s"] Apr 06 12:26:01 crc kubenswrapper[4790]: I0406 12:26:01.642465 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591306-khl4s" event={"ID":"5245510f-0d96-4f12-95ea-d65141bdd2e0","Type":"ContainerStarted","Data":"1347c5d4086078c479b4efba06305a6162be115486eb1c3b78b3a7d7fffb25bd"} Apr 06 12:26:02 crc kubenswrapper[4790]: I0406 12:26:02.659996 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591306-khl4s" event={"ID":"5245510f-0d96-4f12-95ea-d65141bdd2e0","Type":"ContainerStarted","Data":"d72a2a5c5edc7178eab28e4530d84fed627921c7f7e2ec5a88e9fe84c51aaa26"} Apr 06 12:26:02 crc kubenswrapper[4790]: I0406 12:26:02.685666 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591306-khl4s" podStartSLOduration=1.594825174 podStartE2EDuration="2.685645516s" podCreationTimestamp="2026-04-06 12:26:00 +0000 UTC" firstStartedPulling="2026-04-06 12:26:01.246539727 +0000 UTC m=+1740.234282593" lastFinishedPulling="2026-04-06 12:26:02.337360059 +0000 UTC m=+1741.325102935" observedRunningTime="2026-04-06 12:26:02.683347646 +0000 UTC m=+1741.671090512" watchObservedRunningTime="2026-04-06 12:26:02.685645516 +0000 UTC m=+1741.673388382" Apr 06 12:26:03 crc kubenswrapper[4790]: I0406 12:26:03.673252 4790 generic.go:334] "Generic (PLEG): container finished" podID="5245510f-0d96-4f12-95ea-d65141bdd2e0" containerID="d72a2a5c5edc7178eab28e4530d84fed627921c7f7e2ec5a88e9fe84c51aaa26" exitCode=0 Apr 06 12:26:03 crc 
kubenswrapper[4790]: I0406 12:26:03.673409 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591306-khl4s" event={"ID":"5245510f-0d96-4f12-95ea-d65141bdd2e0","Type":"ContainerDied","Data":"d72a2a5c5edc7178eab28e4530d84fed627921c7f7e2ec5a88e9fe84c51aaa26"} Apr 06 12:26:04 crc kubenswrapper[4790]: I0406 12:26:04.675284 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:26:04 crc kubenswrapper[4790]: E0406 12:26:04.675750 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:26:05 crc kubenswrapper[4790]: I0406 12:26:05.038473 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591306-khl4s" Apr 06 12:26:05 crc kubenswrapper[4790]: I0406 12:26:05.238863 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg62n\" (UniqueName: \"kubernetes.io/projected/5245510f-0d96-4f12-95ea-d65141bdd2e0-kube-api-access-qg62n\") pod \"5245510f-0d96-4f12-95ea-d65141bdd2e0\" (UID: \"5245510f-0d96-4f12-95ea-d65141bdd2e0\") " Apr 06 12:26:05 crc kubenswrapper[4790]: I0406 12:26:05.244104 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5245510f-0d96-4f12-95ea-d65141bdd2e0-kube-api-access-qg62n" (OuterVolumeSpecName: "kube-api-access-qg62n") pod "5245510f-0d96-4f12-95ea-d65141bdd2e0" (UID: "5245510f-0d96-4f12-95ea-d65141bdd2e0"). InnerVolumeSpecName "kube-api-access-qg62n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:26:05 crc kubenswrapper[4790]: I0406 12:26:05.341433 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg62n\" (UniqueName: \"kubernetes.io/projected/5245510f-0d96-4f12-95ea-d65141bdd2e0-kube-api-access-qg62n\") on node \"crc\" DevicePath \"\"" Apr 06 12:26:05 crc kubenswrapper[4790]: I0406 12:26:05.713816 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591306-khl4s" event={"ID":"5245510f-0d96-4f12-95ea-d65141bdd2e0","Type":"ContainerDied","Data":"1347c5d4086078c479b4efba06305a6162be115486eb1c3b78b3a7d7fffb25bd"} Apr 06 12:26:05 crc kubenswrapper[4790]: I0406 12:26:05.713877 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1347c5d4086078c479b4efba06305a6162be115486eb1c3b78b3a7d7fffb25bd" Apr 06 12:26:05 crc kubenswrapper[4790]: I0406 12:26:05.717328 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591306-khl4s" Apr 06 12:26:05 crc kubenswrapper[4790]: I0406 12:26:05.755202 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591300-28l55"] Apr 06 12:26:05 crc kubenswrapper[4790]: I0406 12:26:05.765846 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591300-28l55"] Apr 06 12:26:07 crc kubenswrapper[4790]: I0406 12:26:07.685032 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bccb2d6f-cb57-40af-96b7-fd082306b586" path="/var/lib/kubelet/pods/bccb2d6f-cb57-40af-96b7-fd082306b586/volumes" Apr 06 12:26:12 crc kubenswrapper[4790]: I0406 12:26:12.340976 4790 scope.go:117] "RemoveContainer" containerID="c33123007803af9785014085486b050c6938d4cb14f7f3e48b2cbbee482bf980" Apr 06 12:26:12 crc kubenswrapper[4790]: I0406 12:26:12.369647 4790 scope.go:117] "RemoveContainer" 
containerID="0c1d1921a02cfda917d2ebcf1c37daad191edffc8e98fe97eae941e5275bc450" Apr 06 12:26:12 crc kubenswrapper[4790]: I0406 12:26:12.404977 4790 scope.go:117] "RemoveContainer" containerID="a4f850013477db35327f02957caab550b30a29b39cdbcffecb80b476af2c1540" Apr 06 12:26:12 crc kubenswrapper[4790]: I0406 12:26:12.429146 4790 scope.go:117] "RemoveContainer" containerID="f135f668806711d8774c07c8dfa5aa505ac2a38daa00191c7e57cccdc4082867" Apr 06 12:26:12 crc kubenswrapper[4790]: I0406 12:26:12.474215 4790 scope.go:117] "RemoveContainer" containerID="a2df1d9f6e637ee85ef4c1aca1d98b6533d74c7db6017194bb8d2aff5f696cd4" Apr 06 12:26:12 crc kubenswrapper[4790]: I0406 12:26:12.521609 4790 scope.go:117] "RemoveContainer" containerID="c1290455893fd32e8bad27fb8c94a1dbde6e53b2f454831fce3d068048166ded" Apr 06 12:26:12 crc kubenswrapper[4790]: I0406 12:26:12.547108 4790 scope.go:117] "RemoveContainer" containerID="b14a9d2e3ec91ebd1630ec0c769361b2e4018a864b3249c674142b8785d39eeb" Apr 06 12:26:19 crc kubenswrapper[4790]: I0406 12:26:19.675814 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:26:19 crc kubenswrapper[4790]: E0406 12:26:19.676696 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:26:31 crc kubenswrapper[4790]: I0406 12:26:31.684755 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:26:31 crc kubenswrapper[4790]: E0406 12:26:31.685702 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:26:46 crc kubenswrapper[4790]: I0406 12:26:46.675770 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:26:46 crc kubenswrapper[4790]: E0406 12:26:46.677108 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:26:57 crc kubenswrapper[4790]: I0406 12:26:57.298804 4790 generic.go:334] "Generic (PLEG): container finished" podID="34fd07c3-5b1c-440c-a0d0-3d9423f40cc8" containerID="0a3aafadda97b8c231f530041b9de3687ef54f1a17273ddcf2da40157df7d700" exitCode=0 Apr 06 12:26:57 crc kubenswrapper[4790]: I0406 12:26:57.298906 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" event={"ID":"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8","Type":"ContainerDied","Data":"0a3aafadda97b8c231f530041b9de3687ef54f1a17273ddcf2da40157df7d700"} Apr 06 12:26:57 crc kubenswrapper[4790]: I0406 12:26:57.676864 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:26:57 crc kubenswrapper[4790]: E0406 12:26:57.677542 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:26:58 crc kubenswrapper[4790]: I0406 12:26:58.744541 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:26:58 crc kubenswrapper[4790]: I0406 12:26:58.927083 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-ssh-key-openstack-edpm-ipam\") pod \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " Apr 06 12:26:58 crc kubenswrapper[4790]: I0406 12:26:58.927145 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-inventory\") pod \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " Apr 06 12:26:58 crc kubenswrapper[4790]: I0406 12:26:58.927232 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-bootstrap-combined-ca-bundle\") pod \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " Apr 06 12:26:58 crc kubenswrapper[4790]: I0406 12:26:58.927463 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgdnw\" (UniqueName: \"kubernetes.io/projected/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-kube-api-access-pgdnw\") pod \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\" (UID: \"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8\") " Apr 06 12:26:58 crc kubenswrapper[4790]: I0406 
12:26:58.932624 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-kube-api-access-pgdnw" (OuterVolumeSpecName: "kube-api-access-pgdnw") pod "34fd07c3-5b1c-440c-a0d0-3d9423f40cc8" (UID: "34fd07c3-5b1c-440c-a0d0-3d9423f40cc8"). InnerVolumeSpecName "kube-api-access-pgdnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:26:58 crc kubenswrapper[4790]: I0406 12:26:58.933181 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "34fd07c3-5b1c-440c-a0d0-3d9423f40cc8" (UID: "34fd07c3-5b1c-440c-a0d0-3d9423f40cc8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:26:58 crc kubenswrapper[4790]: I0406 12:26:58.961568 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "34fd07c3-5b1c-440c-a0d0-3d9423f40cc8" (UID: "34fd07c3-5b1c-440c-a0d0-3d9423f40cc8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:26:58 crc kubenswrapper[4790]: I0406 12:26:58.967389 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-inventory" (OuterVolumeSpecName: "inventory") pod "34fd07c3-5b1c-440c-a0d0-3d9423f40cc8" (UID: "34fd07c3-5b1c-440c-a0d0-3d9423f40cc8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.029596 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgdnw\" (UniqueName: \"kubernetes.io/projected/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-kube-api-access-pgdnw\") on node \"crc\" DevicePath \"\"" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.029637 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.029647 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.029656 4790 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34fd07c3-5b1c-440c-a0d0-3d9423f40cc8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.319333 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" event={"ID":"34fd07c3-5b1c-440c-a0d0-3d9423f40cc8","Type":"ContainerDied","Data":"a30c6da6c3b349e894022f8a429afa8ab5bd9f4660c8c953baa6efd36adba363"} Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.319400 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a30c6da6c3b349e894022f8a429afa8ab5bd9f4660c8c953baa6efd36adba363" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.319456 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.412749 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt"] Apr 06 12:26:59 crc kubenswrapper[4790]: E0406 12:26:59.413433 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5245510f-0d96-4f12-95ea-d65141bdd2e0" containerName="oc" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.413525 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5245510f-0d96-4f12-95ea-d65141bdd2e0" containerName="oc" Apr 06 12:26:59 crc kubenswrapper[4790]: E0406 12:26:59.413591 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34fd07c3-5b1c-440c-a0d0-3d9423f40cc8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.413650 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="34fd07c3-5b1c-440c-a0d0-3d9423f40cc8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.413898 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="34fd07c3-5b1c-440c-a0d0-3d9423f40cc8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.413988 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5245510f-0d96-4f12-95ea-d65141bdd2e0" containerName="oc" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.414662 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.416674 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.416869 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.416987 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.417058 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.433233 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt"] Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.546151 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f73a2e40-f5e3-4e0e-9244-c076b36e911e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt\" (UID: \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.546234 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f73a2e40-f5e3-4e0e-9244-c076b36e911e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt\" (UID: \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 
12:26:59.546383 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbj5d\" (UniqueName: \"kubernetes.io/projected/f73a2e40-f5e3-4e0e-9244-c076b36e911e-kube-api-access-xbj5d\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt\" (UID: \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.649022 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbj5d\" (UniqueName: \"kubernetes.io/projected/f73a2e40-f5e3-4e0e-9244-c076b36e911e-kube-api-access-xbj5d\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt\" (UID: \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.649738 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f73a2e40-f5e3-4e0e-9244-c076b36e911e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt\" (UID: \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.651123 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f73a2e40-f5e3-4e0e-9244-c076b36e911e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt\" (UID: \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.657083 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/f73a2e40-f5e3-4e0e-9244-c076b36e911e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt\" (UID: \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.658289 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f73a2e40-f5e3-4e0e-9244-c076b36e911e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt\" (UID: \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.666705 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbj5d\" (UniqueName: \"kubernetes.io/projected/f73a2e40-f5e3-4e0e-9244-c076b36e911e-kube-api-access-xbj5d\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt\" (UID: \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" Apr 06 12:26:59 crc kubenswrapper[4790]: I0406 12:26:59.763265 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" Apr 06 12:27:00 crc kubenswrapper[4790]: I0406 12:27:00.319408 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt"] Apr 06 12:27:00 crc kubenswrapper[4790]: W0406 12:27:00.333864 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf73a2e40_f5e3_4e0e_9244_c076b36e911e.slice/crio-17028253a1cd95e69fef6f5f95a97b40e0d4f7199de7d5a9a8fe7c68d07fd4ac WatchSource:0}: Error finding container 17028253a1cd95e69fef6f5f95a97b40e0d4f7199de7d5a9a8fe7c68d07fd4ac: Status 404 returned error can't find the container with id 17028253a1cd95e69fef6f5f95a97b40e0d4f7199de7d5a9a8fe7c68d07fd4ac Apr 06 12:27:00 crc kubenswrapper[4790]: I0406 12:27:00.342515 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 12:27:01 crc kubenswrapper[4790]: I0406 12:27:01.345008 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" event={"ID":"f73a2e40-f5e3-4e0e-9244-c076b36e911e","Type":"ContainerStarted","Data":"b162133338f8871916bbb67ef68576f29e43618d074f6e586f0d6d489a22ea47"} Apr 06 12:27:01 crc kubenswrapper[4790]: I0406 12:27:01.345332 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" event={"ID":"f73a2e40-f5e3-4e0e-9244-c076b36e911e","Type":"ContainerStarted","Data":"17028253a1cd95e69fef6f5f95a97b40e0d4f7199de7d5a9a8fe7c68d07fd4ac"} Apr 06 12:27:01 crc kubenswrapper[4790]: I0406 12:27:01.373982 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" podStartSLOduration=1.9479193320000001 podStartE2EDuration="2.373962991s" podCreationTimestamp="2026-04-06 
12:26:59 +0000 UTC" firstStartedPulling="2026-04-06 12:27:00.342244817 +0000 UTC m=+1799.329987693" lastFinishedPulling="2026-04-06 12:27:00.768288486 +0000 UTC m=+1799.756031352" observedRunningTime="2026-04-06 12:27:01.363656817 +0000 UTC m=+1800.351399783" watchObservedRunningTime="2026-04-06 12:27:01.373962991 +0000 UTC m=+1800.361705857" Apr 06 12:27:10 crc kubenswrapper[4790]: I0406 12:27:10.675867 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:27:10 crc kubenswrapper[4790]: E0406 12:27:10.676648 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:27:12 crc kubenswrapper[4790]: I0406 12:27:12.661097 4790 scope.go:117] "RemoveContainer" containerID="d76422ecb6805bd87d908e40e40cfe7b2744f8572a4d769b14ef99e2e09179c7" Apr 06 12:27:12 crc kubenswrapper[4790]: I0406 12:27:12.703717 4790 scope.go:117] "RemoveContainer" containerID="6fe4c3dd1697ab7ae8735a11fb5921822fb1f95bd1d1b99c8ff677bb779e145a" Apr 06 12:27:12 crc kubenswrapper[4790]: I0406 12:27:12.727107 4790 scope.go:117] "RemoveContainer" containerID="571317242b82d964f97bc0861956ca6e7b71a5f8a05f70dfdc5becb2f16cf83e" Apr 06 12:27:12 crc kubenswrapper[4790]: I0406 12:27:12.754861 4790 scope.go:117] "RemoveContainer" containerID="af26e769ba3a86afe83a7de61d1ef7c66c0dd157c41639008c693bec62c0085b" Apr 06 12:27:12 crc kubenswrapper[4790]: I0406 12:27:12.780048 4790 scope.go:117] "RemoveContainer" containerID="d7ebac945324e1d3970bd05a7428bf42aa33864abe32dc70e1fd5a5d20fb496c" Apr 06 12:27:22 crc kubenswrapper[4790]: I0406 12:27:22.675248 4790 scope.go:117] 
"RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:27:22 crc kubenswrapper[4790]: E0406 12:27:22.676011 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:27:23 crc kubenswrapper[4790]: I0406 12:27:23.059667 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-4rwbs"] Apr 06 12:27:23 crc kubenswrapper[4790]: I0406 12:27:23.069853 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a74a-account-create-update-ldtlg"] Apr 06 12:27:23 crc kubenswrapper[4790]: I0406 12:27:23.078971 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a74a-account-create-update-ldtlg"] Apr 06 12:27:23 crc kubenswrapper[4790]: I0406 12:27:23.087750 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-4rwbs"] Apr 06 12:27:23 crc kubenswrapper[4790]: I0406 12:27:23.694108 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="052ec8fa-ee10-4d35-8dee-a61dc66d2352" path="/var/lib/kubelet/pods/052ec8fa-ee10-4d35-8dee-a61dc66d2352/volumes" Apr 06 12:27:23 crc kubenswrapper[4790]: I0406 12:27:23.696637 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e76ef053-b579-43cb-a526-549fca65c4ba" path="/var/lib/kubelet/pods/e76ef053-b579-43cb-a526-549fca65c4ba/volumes" Apr 06 12:27:26 crc kubenswrapper[4790]: I0406 12:27:26.063298 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-ae49-account-create-update-nwkqx"] Apr 06 12:27:26 crc kubenswrapper[4790]: I0406 12:27:26.080689 
4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-fkm99"] Apr 06 12:27:26 crc kubenswrapper[4790]: I0406 12:27:26.094395 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-aea6-account-create-update-vpgm4"] Apr 06 12:27:26 crc kubenswrapper[4790]: I0406 12:27:26.104801 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-l7zf9"] Apr 06 12:27:26 crc kubenswrapper[4790]: I0406 12:27:26.114141 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-aea6-account-create-update-vpgm4"] Apr 06 12:27:26 crc kubenswrapper[4790]: I0406 12:27:26.175382 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-fkm99"] Apr 06 12:27:26 crc kubenswrapper[4790]: I0406 12:27:26.184418 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-ae49-account-create-update-nwkqx"] Apr 06 12:27:26 crc kubenswrapper[4790]: I0406 12:27:26.193693 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-l7zf9"] Apr 06 12:27:27 crc kubenswrapper[4790]: I0406 12:27:27.037626 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-zlwz2"] Apr 06 12:27:27 crc kubenswrapper[4790]: I0406 12:27:27.047689 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-42cd-account-create-update-9r9j2"] Apr 06 12:27:27 crc kubenswrapper[4790]: I0406 12:27:27.057022 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-zlwz2"] Apr 06 12:27:27 crc kubenswrapper[4790]: I0406 12:27:27.066300 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-42cd-account-create-update-9r9j2"] Apr 06 12:27:27 crc kubenswrapper[4790]: I0406 12:27:27.689116 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="328ef813-cbdd-4e37-901c-2c998a9d7edd" 
path="/var/lib/kubelet/pods/328ef813-cbdd-4e37-901c-2c998a9d7edd/volumes" Apr 06 12:27:27 crc kubenswrapper[4790]: I0406 12:27:27.690347 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3604131a-98c3-46f0-9add-fd3e919f18c1" path="/var/lib/kubelet/pods/3604131a-98c3-46f0-9add-fd3e919f18c1/volumes" Apr 06 12:27:27 crc kubenswrapper[4790]: I0406 12:27:27.691639 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e3e70d-1789-45a0-84f8-1423e049abf1" path="/var/lib/kubelet/pods/99e3e70d-1789-45a0-84f8-1423e049abf1/volumes" Apr 06 12:27:27 crc kubenswrapper[4790]: I0406 12:27:27.692621 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e25bee7c-6b29-4d5d-856a-0174919c831c" path="/var/lib/kubelet/pods/e25bee7c-6b29-4d5d-856a-0174919c831c/volumes" Apr 06 12:27:27 crc kubenswrapper[4790]: I0406 12:27:27.695194 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef90943c-ee17-4ba6-8047-93614a407a86" path="/var/lib/kubelet/pods/ef90943c-ee17-4ba6-8047-93614a407a86/volumes" Apr 06 12:27:27 crc kubenswrapper[4790]: I0406 12:27:27.696080 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4c11a7e-145d-4f36-bb18-5ba87172fb2a" path="/var/lib/kubelet/pods/f4c11a7e-145d-4f36-bb18-5ba87172fb2a/volumes" Apr 06 12:27:37 crc kubenswrapper[4790]: I0406 12:27:37.675852 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:27:37 crc kubenswrapper[4790]: E0406 12:27:37.676550 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 
06 12:27:44 crc kubenswrapper[4790]: I0406 12:27:44.033165 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zddlm"] Apr 06 12:27:44 crc kubenswrapper[4790]: I0406 12:27:44.043595 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zddlm"] Apr 06 12:27:45 crc kubenswrapper[4790]: I0406 12:27:45.688344 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3" path="/var/lib/kubelet/pods/0f93ab4b-192e-49b1-a4fa-8da9a0ec95e3/volumes" Apr 06 12:27:50 crc kubenswrapper[4790]: I0406 12:27:50.046162 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-fk556"] Apr 06 12:27:50 crc kubenswrapper[4790]: I0406 12:27:50.058873 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-fk556"] Apr 06 12:27:50 crc kubenswrapper[4790]: I0406 12:27:50.675248 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:27:50 crc kubenswrapper[4790]: E0406 12:27:50.675461 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:27:51 crc kubenswrapper[4790]: I0406 12:27:51.694142 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="060385bd-e2c7-44f6-b575-3d353e949d85" path="/var/lib/kubelet/pods/060385bd-e2c7-44f6-b575-3d353e949d85/volumes" Apr 06 12:28:00 crc kubenswrapper[4790]: I0406 12:28:00.154298 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591308-sl5rd"] Apr 06 
12:28:00 crc kubenswrapper[4790]: I0406 12:28:00.156014 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591308-sl5rd" Apr 06 12:28:00 crc kubenswrapper[4790]: I0406 12:28:00.158907 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:28:00 crc kubenswrapper[4790]: I0406 12:28:00.158980 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:28:00 crc kubenswrapper[4790]: I0406 12:28:00.160715 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:28:00 crc kubenswrapper[4790]: I0406 12:28:00.170308 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591308-sl5rd"] Apr 06 12:28:00 crc kubenswrapper[4790]: I0406 12:28:00.227265 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfxx4\" (UniqueName: \"kubernetes.io/projected/38044cee-b6ff-4431-9681-cd498da4823c-kube-api-access-zfxx4\") pod \"auto-csr-approver-29591308-sl5rd\" (UID: \"38044cee-b6ff-4431-9681-cd498da4823c\") " pod="openshift-infra/auto-csr-approver-29591308-sl5rd" Apr 06 12:28:00 crc kubenswrapper[4790]: I0406 12:28:00.329339 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfxx4\" (UniqueName: \"kubernetes.io/projected/38044cee-b6ff-4431-9681-cd498da4823c-kube-api-access-zfxx4\") pod \"auto-csr-approver-29591308-sl5rd\" (UID: \"38044cee-b6ff-4431-9681-cd498da4823c\") " pod="openshift-infra/auto-csr-approver-29591308-sl5rd" Apr 06 12:28:00 crc kubenswrapper[4790]: I0406 12:28:00.346934 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfxx4\" (UniqueName: 
\"kubernetes.io/projected/38044cee-b6ff-4431-9681-cd498da4823c-kube-api-access-zfxx4\") pod \"auto-csr-approver-29591308-sl5rd\" (UID: \"38044cee-b6ff-4431-9681-cd498da4823c\") " pod="openshift-infra/auto-csr-approver-29591308-sl5rd" Apr 06 12:28:00 crc kubenswrapper[4790]: I0406 12:28:00.477541 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591308-sl5rd" Apr 06 12:28:00 crc kubenswrapper[4790]: I0406 12:28:00.950891 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591308-sl5rd"] Apr 06 12:28:01 crc kubenswrapper[4790]: I0406 12:28:01.028538 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591308-sl5rd" event={"ID":"38044cee-b6ff-4431-9681-cd498da4823c","Type":"ContainerStarted","Data":"f4dc3ce47cdc29a8d959c50d1ac1ae480a01174ffb4767c9c7ced4c75aef0710"} Apr 06 12:28:03 crc kubenswrapper[4790]: I0406 12:28:03.051227 4790 generic.go:334] "Generic (PLEG): container finished" podID="38044cee-b6ff-4431-9681-cd498da4823c" containerID="7150faec12f09400984e5d5cd9c429cd2d07623f925eb434687a8b511868e3dc" exitCode=0 Apr 06 12:28:03 crc kubenswrapper[4790]: I0406 12:28:03.051276 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591308-sl5rd" event={"ID":"38044cee-b6ff-4431-9681-cd498da4823c","Type":"ContainerDied","Data":"7150faec12f09400984e5d5cd9c429cd2d07623f925eb434687a8b511868e3dc"} Apr 06 12:28:03 crc kubenswrapper[4790]: I0406 12:28:03.676255 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:28:03 crc kubenswrapper[4790]: E0406 12:28:03.676865 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:28:04 crc kubenswrapper[4790]: I0406 12:28:04.558074 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591308-sl5rd" Apr 06 12:28:04 crc kubenswrapper[4790]: I0406 12:28:04.614937 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfxx4\" (UniqueName: \"kubernetes.io/projected/38044cee-b6ff-4431-9681-cd498da4823c-kube-api-access-zfxx4\") pod \"38044cee-b6ff-4431-9681-cd498da4823c\" (UID: \"38044cee-b6ff-4431-9681-cd498da4823c\") " Apr 06 12:28:04 crc kubenswrapper[4790]: I0406 12:28:04.620144 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38044cee-b6ff-4431-9681-cd498da4823c-kube-api-access-zfxx4" (OuterVolumeSpecName: "kube-api-access-zfxx4") pod "38044cee-b6ff-4431-9681-cd498da4823c" (UID: "38044cee-b6ff-4431-9681-cd498da4823c"). InnerVolumeSpecName "kube-api-access-zfxx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:28:04 crc kubenswrapper[4790]: I0406 12:28:04.717909 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfxx4\" (UniqueName: \"kubernetes.io/projected/38044cee-b6ff-4431-9681-cd498da4823c-kube-api-access-zfxx4\") on node \"crc\" DevicePath \"\"" Apr 06 12:28:05 crc kubenswrapper[4790]: I0406 12:28:05.074570 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591308-sl5rd" event={"ID":"38044cee-b6ff-4431-9681-cd498da4823c","Type":"ContainerDied","Data":"f4dc3ce47cdc29a8d959c50d1ac1ae480a01174ffb4767c9c7ced4c75aef0710"} Apr 06 12:28:05 crc kubenswrapper[4790]: I0406 12:28:05.074930 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4dc3ce47cdc29a8d959c50d1ac1ae480a01174ffb4767c9c7ced4c75aef0710" Apr 06 12:28:05 crc kubenswrapper[4790]: I0406 12:28:05.074655 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591308-sl5rd" Apr 06 12:28:05 crc kubenswrapper[4790]: I0406 12:28:05.619636 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591302-pkz6k"] Apr 06 12:28:05 crc kubenswrapper[4790]: I0406 12:28:05.629329 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591302-pkz6k"] Apr 06 12:28:05 crc kubenswrapper[4790]: I0406 12:28:05.687475 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36fcc986-c077-4dee-a97a-7c3a92bd31d5" path="/var/lib/kubelet/pods/36fcc986-c077-4dee-a97a-7c3a92bd31d5/volumes" Apr 06 12:28:12 crc kubenswrapper[4790]: I0406 12:28:12.859628 4790 scope.go:117] "RemoveContainer" containerID="bfb799b45231431ff9a81c2a29edf9f651b6745af01f5934392bb91a557ec943" Apr 06 12:28:12 crc kubenswrapper[4790]: I0406 12:28:12.885628 4790 scope.go:117] "RemoveContainer" 
containerID="a414b99455b1f8d7772f38f8647dd46f0eda75c50716a8e6288769a3b661ef8a" Apr 06 12:28:12 crc kubenswrapper[4790]: I0406 12:28:12.945038 4790 scope.go:117] "RemoveContainer" containerID="978fbdeddd74a97d8aa9eea5a0461790857816b474e9043e247635dcefda85fc" Apr 06 12:28:13 crc kubenswrapper[4790]: I0406 12:28:13.005598 4790 scope.go:117] "RemoveContainer" containerID="2e1e340d49ce6e2f30c4ef4429e90a42952595608a29426837d51f7ea7512c06" Apr 06 12:28:13 crc kubenswrapper[4790]: I0406 12:28:13.051849 4790 scope.go:117] "RemoveContainer" containerID="1b0ee3303c81cc98414b8eada8756466caa1eca161596257d2b7f4690bc602ee" Apr 06 12:28:13 crc kubenswrapper[4790]: I0406 12:28:13.097715 4790 scope.go:117] "RemoveContainer" containerID="9efbae1034966c4a857f5f4f8ed6c4e69e76fd91badd0dedaa8e60d4f1915b5e" Apr 06 12:28:13 crc kubenswrapper[4790]: I0406 12:28:13.158661 4790 scope.go:117] "RemoveContainer" containerID="fe0b0ca31b874d5a16ba65269452bc82afb541e36cfc3f9f898d1a8ce6161413" Apr 06 12:28:13 crc kubenswrapper[4790]: I0406 12:28:13.175815 4790 scope.go:117] "RemoveContainer" containerID="e971f694ca3b2feff568fc417528c6ab6663185c98517301e6b3f35116226764" Apr 06 12:28:13 crc kubenswrapper[4790]: I0406 12:28:13.213318 4790 scope.go:117] "RemoveContainer" containerID="0b97254baba42cea65cd23e9e21110d1eb6b67d0e186093af5fbe57e0de17c2b" Apr 06 12:28:13 crc kubenswrapper[4790]: I0406 12:28:13.251945 4790 scope.go:117] "RemoveContainer" containerID="3e116411d90b29e20115218f9adcb2ca8c89d186dd31869fba2958425700d995" Apr 06 12:28:13 crc kubenswrapper[4790]: I0406 12:28:13.275081 4790 scope.go:117] "RemoveContainer" containerID="22553c5d47b4b299883bcd9557efa9e0b15016e7dcac44635dc86036de097a54" Apr 06 12:28:13 crc kubenswrapper[4790]: I0406 12:28:13.314770 4790 scope.go:117] "RemoveContainer" containerID="ef9b6ef49dafe6fcec6a49f5ad2034f91b59c5620dce412a00768b3463a443fd" Apr 06 12:28:13 crc kubenswrapper[4790]: I0406 12:28:13.335479 4790 scope.go:117] "RemoveContainer" 
containerID="0ac3e2f64247693905cc500679c7bb654011e539138df8eea905e2487a90383e" Apr 06 12:28:14 crc kubenswrapper[4790]: I0406 12:28:14.029588 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-f124-account-create-update-rflf8"] Apr 06 12:28:14 crc kubenswrapper[4790]: I0406 12:28:14.042646 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-f124-account-create-update-rflf8"] Apr 06 12:28:14 crc kubenswrapper[4790]: I0406 12:28:14.052811 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ffac-account-create-update-gbz2f"] Apr 06 12:28:14 crc kubenswrapper[4790]: I0406 12:28:14.062394 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-bpckw"] Apr 06 12:28:14 crc kubenswrapper[4790]: I0406 12:28:14.070804 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-5xzzk"] Apr 06 12:28:14 crc kubenswrapper[4790]: I0406 12:28:14.080052 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-bpckw"] Apr 06 12:28:14 crc kubenswrapper[4790]: I0406 12:28:14.091091 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ffac-account-create-update-gbz2f"] Apr 06 12:28:14 crc kubenswrapper[4790]: I0406 12:28:14.100891 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-5xzzk"] Apr 06 12:28:14 crc kubenswrapper[4790]: I0406 12:28:14.676258 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:28:14 crc kubenswrapper[4790]: E0406 12:28:14.676765 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:28:15 crc kubenswrapper[4790]: I0406 12:28:15.694438 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b0e7386-a123-4a3c-a175-f4a89ab43a27" path="/var/lib/kubelet/pods/1b0e7386-a123-4a3c-a175-f4a89ab43a27/volumes" Apr 06 12:28:15 crc kubenswrapper[4790]: I0406 12:28:15.695206 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef1835f-4003-4fe1-9157-aac7e54d5f94" path="/var/lib/kubelet/pods/1ef1835f-4003-4fe1-9157-aac7e54d5f94/volumes" Apr 06 12:28:15 crc kubenswrapper[4790]: I0406 12:28:15.705225 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d59a84b5-d0bb-4d73-ada1-e7982fd86f50" path="/var/lib/kubelet/pods/d59a84b5-d0bb-4d73-ada1-e7982fd86f50/volumes" Apr 06 12:28:15 crc kubenswrapper[4790]: I0406 12:28:15.705853 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e61172bd-20f5-49e9-886b-01c8cf8ce7cc" path="/var/lib/kubelet/pods/e61172bd-20f5-49e9-886b-01c8cf8ce7cc/volumes" Apr 06 12:28:19 crc kubenswrapper[4790]: I0406 12:28:19.040924 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-jwff6"] Apr 06 12:28:19 crc kubenswrapper[4790]: I0406 12:28:19.052070 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-j4vzr"] Apr 06 12:28:19 crc kubenswrapper[4790]: I0406 12:28:19.066015 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-j4vzr"] Apr 06 12:28:19 crc kubenswrapper[4790]: I0406 12:28:19.075438 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-jwff6"] Apr 06 12:28:19 crc kubenswrapper[4790]: I0406 12:28:19.685783 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68c696c2-cc67-4dce-9b9a-5c5ec2c14f01" path="/var/lib/kubelet/pods/68c696c2-cc67-4dce-9b9a-5c5ec2c14f01/volumes" 
Apr 06 12:28:19 crc kubenswrapper[4790]: I0406 12:28:19.686379 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85d55899-25c2-467d-ab1b-202029b76a86" path="/var/lib/kubelet/pods/85d55899-25c2-467d-ab1b-202029b76a86/volumes" Apr 06 12:28:20 crc kubenswrapper[4790]: I0406 12:28:20.040941 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8bc9-account-create-update-dm9bc"] Apr 06 12:28:20 crc kubenswrapper[4790]: I0406 12:28:20.056777 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8bc9-account-create-update-dm9bc"] Apr 06 12:28:21 crc kubenswrapper[4790]: I0406 12:28:21.691460 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d41bab77-7a21-4962-bd9b-82fa33b47aaa" path="/var/lib/kubelet/pods/d41bab77-7a21-4962-bd9b-82fa33b47aaa/volumes" Apr 06 12:28:29 crc kubenswrapper[4790]: I0406 12:28:29.676077 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:28:29 crc kubenswrapper[4790]: E0406 12:28:29.676898 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:28:31 crc kubenswrapper[4790]: I0406 12:28:31.370912 4790 generic.go:334] "Generic (PLEG): container finished" podID="f73a2e40-f5e3-4e0e-9244-c076b36e911e" containerID="b162133338f8871916bbb67ef68576f29e43618d074f6e586f0d6d489a22ea47" exitCode=0 Apr 06 12:28:31 crc kubenswrapper[4790]: I0406 12:28:31.371066 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" 
event={"ID":"f73a2e40-f5e3-4e0e-9244-c076b36e911e","Type":"ContainerDied","Data":"b162133338f8871916bbb67ef68576f29e43618d074f6e586f0d6d489a22ea47"} Apr 06 12:28:32 crc kubenswrapper[4790]: I0406 12:28:32.803337 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" Apr 06 12:28:32 crc kubenswrapper[4790]: I0406 12:28:32.909891 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbj5d\" (UniqueName: \"kubernetes.io/projected/f73a2e40-f5e3-4e0e-9244-c076b36e911e-kube-api-access-xbj5d\") pod \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\" (UID: \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\") " Apr 06 12:28:32 crc kubenswrapper[4790]: I0406 12:28:32.910207 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f73a2e40-f5e3-4e0e-9244-c076b36e911e-ssh-key-openstack-edpm-ipam\") pod \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\" (UID: \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\") " Apr 06 12:28:32 crc kubenswrapper[4790]: I0406 12:28:32.910253 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f73a2e40-f5e3-4e0e-9244-c076b36e911e-inventory\") pod \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\" (UID: \"f73a2e40-f5e3-4e0e-9244-c076b36e911e\") " Apr 06 12:28:32 crc kubenswrapper[4790]: I0406 12:28:32.915621 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f73a2e40-f5e3-4e0e-9244-c076b36e911e-kube-api-access-xbj5d" (OuterVolumeSpecName: "kube-api-access-xbj5d") pod "f73a2e40-f5e3-4e0e-9244-c076b36e911e" (UID: "f73a2e40-f5e3-4e0e-9244-c076b36e911e"). InnerVolumeSpecName "kube-api-access-xbj5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:28:32 crc kubenswrapper[4790]: I0406 12:28:32.939012 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f73a2e40-f5e3-4e0e-9244-c076b36e911e-inventory" (OuterVolumeSpecName: "inventory") pod "f73a2e40-f5e3-4e0e-9244-c076b36e911e" (UID: "f73a2e40-f5e3-4e0e-9244-c076b36e911e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:28:32 crc kubenswrapper[4790]: I0406 12:28:32.939317 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f73a2e40-f5e3-4e0e-9244-c076b36e911e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f73a2e40-f5e3-4e0e-9244-c076b36e911e" (UID: "f73a2e40-f5e3-4e0e-9244-c076b36e911e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.012132 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f73a2e40-f5e3-4e0e-9244-c076b36e911e-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.012176 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbj5d\" (UniqueName: \"kubernetes.io/projected/f73a2e40-f5e3-4e0e-9244-c076b36e911e-kube-api-access-xbj5d\") on node \"crc\" DevicePath \"\"" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.012191 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f73a2e40-f5e3-4e0e-9244-c076b36e911e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.428197 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" 
event={"ID":"f73a2e40-f5e3-4e0e-9244-c076b36e911e","Type":"ContainerDied","Data":"17028253a1cd95e69fef6f5f95a97b40e0d4f7199de7d5a9a8fe7c68d07fd4ac"} Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.428546 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17028253a1cd95e69fef6f5f95a97b40e0d4f7199de7d5a9a8fe7c68d07fd4ac" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.428299 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.513521 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp"] Apr 06 12:28:33 crc kubenswrapper[4790]: E0406 12:28:33.514009 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f73a2e40-f5e3-4e0e-9244-c076b36e911e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.514030 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f73a2e40-f5e3-4e0e-9244-c076b36e911e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Apr 06 12:28:33 crc kubenswrapper[4790]: E0406 12:28:33.514044 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38044cee-b6ff-4431-9681-cd498da4823c" containerName="oc" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.514050 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="38044cee-b6ff-4431-9681-cd498da4823c" containerName="oc" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.514229 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f73a2e40-f5e3-4e0e-9244-c076b36e911e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.514246 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="38044cee-b6ff-4431-9681-cd498da4823c" containerName="oc" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.514934 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.517453 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.518091 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.518207 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.518785 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.532523 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp"] Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.623324 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e61f8b4-0263-4224-a5e0-b34740fbca06-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp\" (UID: \"4e61f8b4-0263-4224-a5e0-b34740fbca06\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.623409 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4e61f8b4-0263-4224-a5e0-b34740fbca06-ssh-key-openstack-edpm-ipam\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp\" (UID: \"4e61f8b4-0263-4224-a5e0-b34740fbca06\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.623505 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxl8p\" (UniqueName: \"kubernetes.io/projected/4e61f8b4-0263-4224-a5e0-b34740fbca06-kube-api-access-dxl8p\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp\" (UID: \"4e61f8b4-0263-4224-a5e0-b34740fbca06\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.725695 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4e61f8b4-0263-4224-a5e0-b34740fbca06-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp\" (UID: \"4e61f8b4-0263-4224-a5e0-b34740fbca06\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.726142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxl8p\" (UniqueName: \"kubernetes.io/projected/4e61f8b4-0263-4224-a5e0-b34740fbca06-kube-api-access-dxl8p\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp\" (UID: \"4e61f8b4-0263-4224-a5e0-b34740fbca06\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.727058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e61f8b4-0263-4224-a5e0-b34740fbca06-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp\" (UID: \"4e61f8b4-0263-4224-a5e0-b34740fbca06\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.731712 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4e61f8b4-0263-4224-a5e0-b34740fbca06-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp\" (UID: \"4e61f8b4-0263-4224-a5e0-b34740fbca06\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.744500 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e61f8b4-0263-4224-a5e0-b34740fbca06-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp\" (UID: \"4e61f8b4-0263-4224-a5e0-b34740fbca06\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.749423 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxl8p\" (UniqueName: \"kubernetes.io/projected/4e61f8b4-0263-4224-a5e0-b34740fbca06-kube-api-access-dxl8p\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp\" (UID: \"4e61f8b4-0263-4224-a5e0-b34740fbca06\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" Apr 06 12:28:33 crc kubenswrapper[4790]: I0406 12:28:33.833448 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" Apr 06 12:28:34 crc kubenswrapper[4790]: I0406 12:28:34.401057 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp"] Apr 06 12:28:34 crc kubenswrapper[4790]: I0406 12:28:34.438637 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" event={"ID":"4e61f8b4-0263-4224-a5e0-b34740fbca06","Type":"ContainerStarted","Data":"a35c3f2d5bd148152760c86a8ee64f3619bc6a3fcd13f74ec8a00759ca624700"} Apr 06 12:28:35 crc kubenswrapper[4790]: I0406 12:28:35.453018 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" event={"ID":"4e61f8b4-0263-4224-a5e0-b34740fbca06","Type":"ContainerStarted","Data":"76df1a244dd7538cc888522bad5a05470e8a0796ac9a77d033fbc7b7f541bea7"} Apr 06 12:28:35 crc kubenswrapper[4790]: I0406 12:28:35.472250 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" podStartSLOduration=1.97932895 podStartE2EDuration="2.47223049s" podCreationTimestamp="2026-04-06 12:28:33 +0000 UTC" firstStartedPulling="2026-04-06 12:28:34.409458944 +0000 UTC m=+1893.397201810" lastFinishedPulling="2026-04-06 12:28:34.902360484 +0000 UTC m=+1893.890103350" observedRunningTime="2026-04-06 12:28:35.470313213 +0000 UTC m=+1894.458056079" watchObservedRunningTime="2026-04-06 12:28:35.47223049 +0000 UTC m=+1894.459973356" Apr 06 12:28:43 crc kubenswrapper[4790]: I0406 12:28:43.027353 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-8r6lc"] Apr 06 12:28:43 crc kubenswrapper[4790]: I0406 12:28:43.037516 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-8r6lc"] Apr 06 12:28:43 crc kubenswrapper[4790]: 
I0406 12:28:43.688408 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccc30661-4ea6-4218-b75d-b59f473d41bc" path="/var/lib/kubelet/pods/ccc30661-4ea6-4218-b75d-b59f473d41bc/volumes" Apr 06 12:28:44 crc kubenswrapper[4790]: I0406 12:28:44.675986 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:28:44 crc kubenswrapper[4790]: E0406 12:28:44.676328 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:28:52 crc kubenswrapper[4790]: I0406 12:28:52.057792 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-h5v54"] Apr 06 12:28:52 crc kubenswrapper[4790]: I0406 12:28:52.072790 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-h5v54"] Apr 06 12:28:53 crc kubenswrapper[4790]: I0406 12:28:53.050034 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-wl5rj"] Apr 06 12:28:53 crc kubenswrapper[4790]: I0406 12:28:53.067811 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-r77xt"] Apr 06 12:28:53 crc kubenswrapper[4790]: I0406 12:28:53.104524 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-42m6m"] Apr 06 12:28:53 crc kubenswrapper[4790]: I0406 12:28:53.119267 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-wl5rj"] Apr 06 12:28:53 crc kubenswrapper[4790]: I0406 12:28:53.159272 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-r77xt"] Apr 06 
12:28:53 crc kubenswrapper[4790]: I0406 12:28:53.178918 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-42m6m"] Apr 06 12:28:53 crc kubenswrapper[4790]: I0406 12:28:53.696995 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="570ffea2-75f1-4831-98d3-50b826b1e37d" path="/var/lib/kubelet/pods/570ffea2-75f1-4831-98d3-50b826b1e37d/volumes" Apr 06 12:28:53 crc kubenswrapper[4790]: I0406 12:28:53.698254 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60672a42-148c-45e2-99a8-da1ebec40dbb" path="/var/lib/kubelet/pods/60672a42-148c-45e2-99a8-da1ebec40dbb/volumes" Apr 06 12:28:53 crc kubenswrapper[4790]: I0406 12:28:53.699375 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4c585de-2005-495a-a987-8cfe70dc8793" path="/var/lib/kubelet/pods/e4c585de-2005-495a-a987-8cfe70dc8793/volumes" Apr 06 12:28:53 crc kubenswrapper[4790]: I0406 12:28:53.701699 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e752bf76-0eff-4559-b599-6ab462cea81e" path="/var/lib/kubelet/pods/e752bf76-0eff-4559-b599-6ab462cea81e/volumes" Apr 06 12:28:58 crc kubenswrapper[4790]: I0406 12:28:58.675618 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:28:58 crc kubenswrapper[4790]: E0406 12:28:58.676374 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:29:11 crc kubenswrapper[4790]: I0406 12:29:11.050849 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6crt4"] Apr 06 12:29:11 
crc kubenswrapper[4790]: I0406 12:29:11.065989 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-6crt4"] Apr 06 12:29:11 crc kubenswrapper[4790]: I0406 12:29:11.683479 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:29:11 crc kubenswrapper[4790]: E0406 12:29:11.684457 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:29:11 crc kubenswrapper[4790]: I0406 12:29:11.689058 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25adb66a-8db5-48e1-b05c-526008a22e4f" path="/var/lib/kubelet/pods/25adb66a-8db5-48e1-b05c-526008a22e4f/volumes" Apr 06 12:29:13 crc kubenswrapper[4790]: I0406 12:29:13.591538 4790 scope.go:117] "RemoveContainer" containerID="e8382521ef0960d2c3e6e9e90b6da1a317368cb9ac7867c7a0eb63b45120b1a8" Apr 06 12:29:13 crc kubenswrapper[4790]: I0406 12:29:13.626122 4790 scope.go:117] "RemoveContainer" containerID="903969f63c571ecb0cce7af8395dd4793943db9b210217883734bb9e5176f6d9" Apr 06 12:29:13 crc kubenswrapper[4790]: I0406 12:29:13.661522 4790 scope.go:117] "RemoveContainer" containerID="331a4e9a44a7941c08218957f7d03252d32130bb800e78c1f49d02d5deaae5dd" Apr 06 12:29:13 crc kubenswrapper[4790]: I0406 12:29:13.695518 4790 scope.go:117] "RemoveContainer" containerID="34a822eed379db6c614f2b17c23ca0e4ef2502736e08159981bce01aef8fcceb" Apr 06 12:29:13 crc kubenswrapper[4790]: I0406 12:29:13.756712 4790 scope.go:117] "RemoveContainer" containerID="3d10f0528a4c4d40d78c00caf97f968f7339063019b97f60497a9d2aae1a4e5f" Apr 06 12:29:13 crc kubenswrapper[4790]: 
I0406 12:29:13.799571 4790 scope.go:117] "RemoveContainer" containerID="9640f14755782c1f3b2eea665acc3ab90b0cc9a5365e019256d29ad7b00fae78" Apr 06 12:29:13 crc kubenswrapper[4790]: I0406 12:29:13.867671 4790 scope.go:117] "RemoveContainer" containerID="8b7f10527fc797bce82e9bc8de68b45de4b7c4dce8cee9f414f59c8d0e5b3f6d" Apr 06 12:29:13 crc kubenswrapper[4790]: I0406 12:29:13.903902 4790 scope.go:117] "RemoveContainer" containerID="3e9c93db578ede254056a8e75530ce354c15eee08a8e3b3b0c647e37fa301810" Apr 06 12:29:13 crc kubenswrapper[4790]: I0406 12:29:13.929095 4790 scope.go:117] "RemoveContainer" containerID="c473e7b7241a513ca98ed37ddc30240de019d042f4f717ca35bc4e4c51833b0a" Apr 06 12:29:13 crc kubenswrapper[4790]: I0406 12:29:13.960046 4790 scope.go:117] "RemoveContainer" containerID="4816d53015881df073e4dc66cced460501d1ad4d92c83193e5131e1ec4d6aab1" Apr 06 12:29:14 crc kubenswrapper[4790]: I0406 12:29:14.005456 4790 scope.go:117] "RemoveContainer" containerID="d13452abb359ca2338c1acedc317f38a7cf7097878ef34b2eaa3d8722957c828" Apr 06 12:29:14 crc kubenswrapper[4790]: I0406 12:29:14.067187 4790 scope.go:117] "RemoveContainer" containerID="9e2b79d59e01f7d6364ee63f9b1187bfcc162e73e96dcd12c19b27b2a6bdc0cd" Apr 06 12:29:14 crc kubenswrapper[4790]: I0406 12:29:14.094853 4790 scope.go:117] "RemoveContainer" containerID="5806040a576ff1fbeb726114ee548b4e7f93b9bb2d086feba510bba5c41bbea5" Apr 06 12:29:14 crc kubenswrapper[4790]: I0406 12:29:14.144314 4790 scope.go:117] "RemoveContainer" containerID="6ba15c503dadda2929be103c5c54a33f67c17e6c345a48acb0408358340d4058" Apr 06 12:29:14 crc kubenswrapper[4790]: I0406 12:29:14.169323 4790 scope.go:117] "RemoveContainer" containerID="0587a7d5856dbe85e53580dd53de6f5db18363f7057a531df9afa2d63d5134a8" Apr 06 12:29:25 crc kubenswrapper[4790]: I0406 12:29:25.675634 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:29:25 crc kubenswrapper[4790]: E0406 12:29:25.676571 4790 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:29:40 crc kubenswrapper[4790]: I0406 12:29:40.675507 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:29:40 crc kubenswrapper[4790]: E0406 12:29:40.676053 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.047061 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-v24cr"] Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.064771 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-tvmxz"] Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.075635 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-rnrfj"] Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.089368 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-v24cr"] Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.098874 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-eb17-account-create-update-64z8s"] Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.106357 4790 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9d65-account-create-update-dkjm7"] Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.114094 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4df9-account-create-update-wlsw6"] Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.123356 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-rnrfj"] Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.132550 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-tvmxz"] Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.140723 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-eb17-account-create-update-64z8s"] Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.149407 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4df9-account-create-update-wlsw6"] Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.159726 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9d65-account-create-update-dkjm7"] Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.692788 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24cee348-21ee-454e-942e-c689c059effa" path="/var/lib/kubelet/pods/24cee348-21ee-454e-942e-c689c059effa/volumes" Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.693789 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a419b6-12dc-427a-9481-c48f7e602d54" path="/var/lib/kubelet/pods/32a419b6-12dc-427a-9481-c48f7e602d54/volumes" Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.695177 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45658e80-85a4-4557-bc5d-85d86bb92f7f" path="/var/lib/kubelet/pods/45658e80-85a4-4557-bc5d-85d86bb92f7f/volumes" Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.695924 4790 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="982d3ca0-0055-4a1a-85ae-533ca695f992" path="/var/lib/kubelet/pods/982d3ca0-0055-4a1a-85ae-533ca695f992/volumes" Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.697706 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8" path="/var/lib/kubelet/pods/df1c4265-b7eb-49d1-a71f-c4e5f0bcbfe8/volumes" Apr 06 12:29:43 crc kubenswrapper[4790]: I0406 12:29:43.698852 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7f84efa-2483-4a30-9297-f60a74e88c75" path="/var/lib/kubelet/pods/f7f84efa-2483-4a30-9297-f60a74e88c75/volumes" Apr 06 12:29:44 crc kubenswrapper[4790]: I0406 12:29:44.155436 4790 generic.go:334] "Generic (PLEG): container finished" podID="4e61f8b4-0263-4224-a5e0-b34740fbca06" containerID="76df1a244dd7538cc888522bad5a05470e8a0796ac9a77d033fbc7b7f541bea7" exitCode=0 Apr 06 12:29:44 crc kubenswrapper[4790]: I0406 12:29:44.155495 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" event={"ID":"4e61f8b4-0263-4224-a5e0-b34740fbca06","Type":"ContainerDied","Data":"76df1a244dd7538cc888522bad5a05470e8a0796ac9a77d033fbc7b7f541bea7"} Apr 06 12:29:45 crc kubenswrapper[4790]: I0406 12:29:45.692741 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" Apr 06 12:29:45 crc kubenswrapper[4790]: I0406 12:29:45.853540 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e61f8b4-0263-4224-a5e0-b34740fbca06-inventory\") pod \"4e61f8b4-0263-4224-a5e0-b34740fbca06\" (UID: \"4e61f8b4-0263-4224-a5e0-b34740fbca06\") " Apr 06 12:29:45 crc kubenswrapper[4790]: I0406 12:29:45.853948 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxl8p\" (UniqueName: \"kubernetes.io/projected/4e61f8b4-0263-4224-a5e0-b34740fbca06-kube-api-access-dxl8p\") pod \"4e61f8b4-0263-4224-a5e0-b34740fbca06\" (UID: \"4e61f8b4-0263-4224-a5e0-b34740fbca06\") " Apr 06 12:29:45 crc kubenswrapper[4790]: I0406 12:29:45.854032 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4e61f8b4-0263-4224-a5e0-b34740fbca06-ssh-key-openstack-edpm-ipam\") pod \"4e61f8b4-0263-4224-a5e0-b34740fbca06\" (UID: \"4e61f8b4-0263-4224-a5e0-b34740fbca06\") " Apr 06 12:29:45 crc kubenswrapper[4790]: I0406 12:29:45.861285 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e61f8b4-0263-4224-a5e0-b34740fbca06-kube-api-access-dxl8p" (OuterVolumeSpecName: "kube-api-access-dxl8p") pod "4e61f8b4-0263-4224-a5e0-b34740fbca06" (UID: "4e61f8b4-0263-4224-a5e0-b34740fbca06"). InnerVolumeSpecName "kube-api-access-dxl8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:29:45 crc kubenswrapper[4790]: I0406 12:29:45.884354 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e61f8b4-0263-4224-a5e0-b34740fbca06-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4e61f8b4-0263-4224-a5e0-b34740fbca06" (UID: "4e61f8b4-0263-4224-a5e0-b34740fbca06"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:29:45 crc kubenswrapper[4790]: I0406 12:29:45.892704 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e61f8b4-0263-4224-a5e0-b34740fbca06-inventory" (OuterVolumeSpecName: "inventory") pod "4e61f8b4-0263-4224-a5e0-b34740fbca06" (UID: "4e61f8b4-0263-4224-a5e0-b34740fbca06"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:29:45 crc kubenswrapper[4790]: I0406 12:29:45.958176 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4e61f8b4-0263-4224-a5e0-b34740fbca06-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:29:45 crc kubenswrapper[4790]: I0406 12:29:45.958225 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e61f8b4-0263-4224-a5e0-b34740fbca06-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:29:45 crc kubenswrapper[4790]: I0406 12:29:45.958241 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxl8p\" (UniqueName: \"kubernetes.io/projected/4e61f8b4-0263-4224-a5e0-b34740fbca06-kube-api-access-dxl8p\") on node \"crc\" DevicePath \"\"" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.180177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" 
event={"ID":"4e61f8b4-0263-4224-a5e0-b34740fbca06","Type":"ContainerDied","Data":"a35c3f2d5bd148152760c86a8ee64f3619bc6a3fcd13f74ec8a00759ca624700"} Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.180221 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a35c3f2d5bd148152760c86a8ee64f3619bc6a3fcd13f74ec8a00759ca624700" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.180222 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.287316 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm"] Apr 06 12:29:46 crc kubenswrapper[4790]: E0406 12:29:46.288318 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e61f8b4-0263-4224-a5e0-b34740fbca06" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.288372 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e61f8b4-0263-4224-a5e0-b34740fbca06" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.288984 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e61f8b4-0263-4224-a5e0-b34740fbca06" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.290617 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.295673 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.297240 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.297579 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.297667 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.302421 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm"] Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.476262 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkc6s\" (UniqueName: \"kubernetes.io/projected/8e75f387-926a-41f4-8367-8c68d2637c04-kube-api-access-gkc6s\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pxptm\" (UID: \"8e75f387-926a-41f4-8367-8c68d2637c04\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.476545 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e75f387-926a-41f4-8367-8c68d2637c04-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pxptm\" (UID: \"8e75f387-926a-41f4-8367-8c68d2637c04\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" Apr 06 
12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.477132 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e75f387-926a-41f4-8367-8c68d2637c04-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pxptm\" (UID: \"8e75f387-926a-41f4-8367-8c68d2637c04\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.579817 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkc6s\" (UniqueName: \"kubernetes.io/projected/8e75f387-926a-41f4-8367-8c68d2637c04-kube-api-access-gkc6s\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pxptm\" (UID: \"8e75f387-926a-41f4-8367-8c68d2637c04\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.579920 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e75f387-926a-41f4-8367-8c68d2637c04-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pxptm\" (UID: \"8e75f387-926a-41f4-8367-8c68d2637c04\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.580010 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e75f387-926a-41f4-8367-8c68d2637c04-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pxptm\" (UID: \"8e75f387-926a-41f4-8367-8c68d2637c04\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.584746 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8e75f387-926a-41f4-8367-8c68d2637c04-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pxptm\" (UID: \"8e75f387-926a-41f4-8367-8c68d2637c04\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.585219 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e75f387-926a-41f4-8367-8c68d2637c04-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pxptm\" (UID: \"8e75f387-926a-41f4-8367-8c68d2637c04\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.596080 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkc6s\" (UniqueName: \"kubernetes.io/projected/8e75f387-926a-41f4-8367-8c68d2637c04-kube-api-access-gkc6s\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pxptm\" (UID: \"8e75f387-926a-41f4-8367-8c68d2637c04\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" Apr 06 12:29:46 crc kubenswrapper[4790]: I0406 12:29:46.628214 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" Apr 06 12:29:47 crc kubenswrapper[4790]: I0406 12:29:47.209632 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm"] Apr 06 12:29:48 crc kubenswrapper[4790]: I0406 12:29:48.213729 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" event={"ID":"8e75f387-926a-41f4-8367-8c68d2637c04","Type":"ContainerStarted","Data":"23470d386a937d672ddfe2928d88240b18821debc7e6fc560891f0c04dd9b038"} Apr 06 12:29:48 crc kubenswrapper[4790]: I0406 12:29:48.214123 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" event={"ID":"8e75f387-926a-41f4-8367-8c68d2637c04","Type":"ContainerStarted","Data":"c6534d29b8109952375ca3f6e20a600db4a7c2d522d4831a8c3afeee6a0a1845"} Apr 06 12:29:48 crc kubenswrapper[4790]: I0406 12:29:48.246390 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" podStartSLOduration=1.658550919 podStartE2EDuration="2.246372339s" podCreationTimestamp="2026-04-06 12:29:46 +0000 UTC" firstStartedPulling="2026-04-06 12:29:47.21606118 +0000 UTC m=+1966.203804056" lastFinishedPulling="2026-04-06 12:29:47.8038826 +0000 UTC m=+1966.791625476" observedRunningTime="2026-04-06 12:29:48.23606387 +0000 UTC m=+1967.223806736" watchObservedRunningTime="2026-04-06 12:29:48.246372339 +0000 UTC m=+1967.234115205" Apr 06 12:29:53 crc kubenswrapper[4790]: I0406 12:29:53.267983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" event={"ID":"8e75f387-926a-41f4-8367-8c68d2637c04","Type":"ContainerDied","Data":"23470d386a937d672ddfe2928d88240b18821debc7e6fc560891f0c04dd9b038"} Apr 06 12:29:53 crc 
kubenswrapper[4790]: I0406 12:29:53.267805 4790 generic.go:334] "Generic (PLEG): container finished" podID="8e75f387-926a-41f4-8367-8c68d2637c04" containerID="23470d386a937d672ddfe2928d88240b18821debc7e6fc560891f0c04dd9b038" exitCode=0 Apr 06 12:29:53 crc kubenswrapper[4790]: I0406 12:29:53.676199 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:29:53 crc kubenswrapper[4790]: E0406 12:29:53.676779 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:29:54 crc kubenswrapper[4790]: I0406 12:29:54.832699 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" Apr 06 12:29:54 crc kubenswrapper[4790]: I0406 12:29:54.849704 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkc6s\" (UniqueName: \"kubernetes.io/projected/8e75f387-926a-41f4-8367-8c68d2637c04-kube-api-access-gkc6s\") pod \"8e75f387-926a-41f4-8367-8c68d2637c04\" (UID: \"8e75f387-926a-41f4-8367-8c68d2637c04\") " Apr 06 12:29:54 crc kubenswrapper[4790]: I0406 12:29:54.849960 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e75f387-926a-41f4-8367-8c68d2637c04-inventory\") pod \"8e75f387-926a-41f4-8367-8c68d2637c04\" (UID: \"8e75f387-926a-41f4-8367-8c68d2637c04\") " Apr 06 12:29:54 crc kubenswrapper[4790]: I0406 12:29:54.850061 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e75f387-926a-41f4-8367-8c68d2637c04-ssh-key-openstack-edpm-ipam\") pod \"8e75f387-926a-41f4-8367-8c68d2637c04\" (UID: \"8e75f387-926a-41f4-8367-8c68d2637c04\") " Apr 06 12:29:54 crc kubenswrapper[4790]: I0406 12:29:54.856658 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e75f387-926a-41f4-8367-8c68d2637c04-kube-api-access-gkc6s" (OuterVolumeSpecName: "kube-api-access-gkc6s") pod "8e75f387-926a-41f4-8367-8c68d2637c04" (UID: "8e75f387-926a-41f4-8367-8c68d2637c04"). InnerVolumeSpecName "kube-api-access-gkc6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:29:54 crc kubenswrapper[4790]: I0406 12:29:54.889464 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e75f387-926a-41f4-8367-8c68d2637c04-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8e75f387-926a-41f4-8367-8c68d2637c04" (UID: "8e75f387-926a-41f4-8367-8c68d2637c04"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:29:54 crc kubenswrapper[4790]: I0406 12:29:54.906445 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e75f387-926a-41f4-8367-8c68d2637c04-inventory" (OuterVolumeSpecName: "inventory") pod "8e75f387-926a-41f4-8367-8c68d2637c04" (UID: "8e75f387-926a-41f4-8367-8c68d2637c04"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:29:54 crc kubenswrapper[4790]: I0406 12:29:54.951964 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e75f387-926a-41f4-8367-8c68d2637c04-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:29:54 crc kubenswrapper[4790]: I0406 12:29:54.952000 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e75f387-926a-41f4-8367-8c68d2637c04-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:29:54 crc kubenswrapper[4790]: I0406 12:29:54.952012 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkc6s\" (UniqueName: \"kubernetes.io/projected/8e75f387-926a-41f4-8367-8c68d2637c04-kube-api-access-gkc6s\") on node \"crc\" DevicePath \"\"" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.292440 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" 
event={"ID":"8e75f387-926a-41f4-8367-8c68d2637c04","Type":"ContainerDied","Data":"c6534d29b8109952375ca3f6e20a600db4a7c2d522d4831a8c3afeee6a0a1845"} Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.292755 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6534d29b8109952375ca3f6e20a600db4a7c2d522d4831a8c3afeee6a0a1845" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.292811 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pxptm" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.366635 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d"] Apr 06 12:29:55 crc kubenswrapper[4790]: E0406 12:29:55.367097 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e75f387-926a-41f4-8367-8c68d2637c04" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.367115 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e75f387-926a-41f4-8367-8c68d2637c04" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.367305 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e75f387-926a-41f4-8367-8c68d2637c04" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.367973 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.370685 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.370773 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.370782 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.371513 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.375223 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d"] Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.460476 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz7f6\" (UniqueName: \"kubernetes.io/projected/500a529e-70f2-4749-8364-d6a6230b0030-kube-api-access-hz7f6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fph2d\" (UID: \"500a529e-70f2-4749-8364-d6a6230b0030\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.460624 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/500a529e-70f2-4749-8364-d6a6230b0030-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fph2d\" (UID: \"500a529e-70f2-4749-8364-d6a6230b0030\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 
12:29:55.460688 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500a529e-70f2-4749-8364-d6a6230b0030-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fph2d\" (UID: \"500a529e-70f2-4749-8364-d6a6230b0030\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.561693 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/500a529e-70f2-4749-8364-d6a6230b0030-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fph2d\" (UID: \"500a529e-70f2-4749-8364-d6a6230b0030\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.561776 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500a529e-70f2-4749-8364-d6a6230b0030-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fph2d\" (UID: \"500a529e-70f2-4749-8364-d6a6230b0030\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.561846 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz7f6\" (UniqueName: \"kubernetes.io/projected/500a529e-70f2-4749-8364-d6a6230b0030-kube-api-access-hz7f6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fph2d\" (UID: \"500a529e-70f2-4749-8364-d6a6230b0030\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.568252 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/500a529e-70f2-4749-8364-d6a6230b0030-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fph2d\" (UID: \"500a529e-70f2-4749-8364-d6a6230b0030\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.580094 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500a529e-70f2-4749-8364-d6a6230b0030-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fph2d\" (UID: \"500a529e-70f2-4749-8364-d6a6230b0030\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.588043 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz7f6\" (UniqueName: \"kubernetes.io/projected/500a529e-70f2-4749-8364-d6a6230b0030-kube-api-access-hz7f6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fph2d\" (UID: \"500a529e-70f2-4749-8364-d6a6230b0030\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" Apr 06 12:29:55 crc kubenswrapper[4790]: I0406 12:29:55.686652 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" Apr 06 12:29:56 crc kubenswrapper[4790]: I0406 12:29:56.037744 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d"] Apr 06 12:29:56 crc kubenswrapper[4790]: I0406 12:29:56.301696 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" event={"ID":"500a529e-70f2-4749-8364-d6a6230b0030","Type":"ContainerStarted","Data":"578d506126793f525c9a995de7d220b8780f7f997aaafa97cb6b8679fe099f2e"} Apr 06 12:29:57 crc kubenswrapper[4790]: I0406 12:29:57.313808 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" event={"ID":"500a529e-70f2-4749-8364-d6a6230b0030","Type":"ContainerStarted","Data":"6a68a8a7257955cbbc345d1bb9b5411318ca29f699e8ed6f381bfd2ca897d8f4"} Apr 06 12:29:57 crc kubenswrapper[4790]: I0406 12:29:57.335208 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" podStartSLOduration=1.850485594 podStartE2EDuration="2.335188978s" podCreationTimestamp="2026-04-06 12:29:55 +0000 UTC" firstStartedPulling="2026-04-06 12:29:56.045635141 +0000 UTC m=+1975.033378007" lastFinishedPulling="2026-04-06 12:29:56.530338525 +0000 UTC m=+1975.518081391" observedRunningTime="2026-04-06 12:29:57.329657604 +0000 UTC m=+1976.317400480" watchObservedRunningTime="2026-04-06 12:29:57.335188978 +0000 UTC m=+1976.322931854" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.143884 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4"] Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.146257 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.148949 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.149040 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.157849 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68adc8df-106b-4049-82b1-e3ad4c500432-secret-volume\") pod \"collect-profiles-29591310-d74b4\" (UID: \"68adc8df-106b-4049-82b1-e3ad4c500432\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.157940 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68adc8df-106b-4049-82b1-e3ad4c500432-config-volume\") pod \"collect-profiles-29591310-d74b4\" (UID: \"68adc8df-106b-4049-82b1-e3ad4c500432\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.158024 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6qtd\" (UniqueName: \"kubernetes.io/projected/68adc8df-106b-4049-82b1-e3ad4c500432-kube-api-access-z6qtd\") pod \"collect-profiles-29591310-d74b4\" (UID: \"68adc8df-106b-4049-82b1-e3ad4c500432\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.159409 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29591310-ws5pl"] Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.161008 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591310-ws5pl" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.168760 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.168876 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591310-ws5pl"] Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.168945 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.169054 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.177966 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4"] Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.263356 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6qtd\" (UniqueName: \"kubernetes.io/projected/68adc8df-106b-4049-82b1-e3ad4c500432-kube-api-access-z6qtd\") pod \"collect-profiles-29591310-d74b4\" (UID: \"68adc8df-106b-4049-82b1-e3ad4c500432\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.263685 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68adc8df-106b-4049-82b1-e3ad4c500432-secret-volume\") pod \"collect-profiles-29591310-d74b4\" (UID: \"68adc8df-106b-4049-82b1-e3ad4c500432\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.263769 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm5xs\" (UniqueName: \"kubernetes.io/projected/4e2f37ed-b06f-41fd-a7c8-75dc97797570-kube-api-access-qm5xs\") pod \"auto-csr-approver-29591310-ws5pl\" (UID: \"4e2f37ed-b06f-41fd-a7c8-75dc97797570\") " pod="openshift-infra/auto-csr-approver-29591310-ws5pl" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.263843 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68adc8df-106b-4049-82b1-e3ad4c500432-config-volume\") pod \"collect-profiles-29591310-d74b4\" (UID: \"68adc8df-106b-4049-82b1-e3ad4c500432\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.264915 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68adc8df-106b-4049-82b1-e3ad4c500432-config-volume\") pod \"collect-profiles-29591310-d74b4\" (UID: \"68adc8df-106b-4049-82b1-e3ad4c500432\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.273756 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68adc8df-106b-4049-82b1-e3ad4c500432-secret-volume\") pod \"collect-profiles-29591310-d74b4\" (UID: \"68adc8df-106b-4049-82b1-e3ad4c500432\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.281655 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6qtd\" (UniqueName: 
\"kubernetes.io/projected/68adc8df-106b-4049-82b1-e3ad4c500432-kube-api-access-z6qtd\") pod \"collect-profiles-29591310-d74b4\" (UID: \"68adc8df-106b-4049-82b1-e3ad4c500432\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.366158 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm5xs\" (UniqueName: \"kubernetes.io/projected/4e2f37ed-b06f-41fd-a7c8-75dc97797570-kube-api-access-qm5xs\") pod \"auto-csr-approver-29591310-ws5pl\" (UID: \"4e2f37ed-b06f-41fd-a7c8-75dc97797570\") " pod="openshift-infra/auto-csr-approver-29591310-ws5pl" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.383054 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm5xs\" (UniqueName: \"kubernetes.io/projected/4e2f37ed-b06f-41fd-a7c8-75dc97797570-kube-api-access-qm5xs\") pod \"auto-csr-approver-29591310-ws5pl\" (UID: \"4e2f37ed-b06f-41fd-a7c8-75dc97797570\") " pod="openshift-infra/auto-csr-approver-29591310-ws5pl" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.473102 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.485392 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591310-ws5pl" Apr 06 12:30:00 crc kubenswrapper[4790]: W0406 12:30:00.959579 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e2f37ed_b06f_41fd_a7c8_75dc97797570.slice/crio-cce7d6b25ae66fdd07cab4370b7145c2f6b0020e19262a79ebc0a3a7139076f3 WatchSource:0}: Error finding container cce7d6b25ae66fdd07cab4370b7145c2f6b0020e19262a79ebc0a3a7139076f3: Status 404 returned error can't find the container with id cce7d6b25ae66fdd07cab4370b7145c2f6b0020e19262a79ebc0a3a7139076f3 Apr 06 12:30:00 crc kubenswrapper[4790]: I0406 12:30:00.964072 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591310-ws5pl"] Apr 06 12:30:01 crc kubenswrapper[4790]: I0406 12:30:01.058569 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4"] Apr 06 12:30:01 crc kubenswrapper[4790]: I0406 12:30:01.345938 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591310-ws5pl" event={"ID":"4e2f37ed-b06f-41fd-a7c8-75dc97797570","Type":"ContainerStarted","Data":"cce7d6b25ae66fdd07cab4370b7145c2f6b0020e19262a79ebc0a3a7139076f3"} Apr 06 12:30:01 crc kubenswrapper[4790]: I0406 12:30:01.348078 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" event={"ID":"68adc8df-106b-4049-82b1-e3ad4c500432","Type":"ContainerStarted","Data":"286661b6448b30627ccf03fb6fc4e3c8e9eda66d2a1b2e4d20379188cddee549"} Apr 06 12:30:01 crc kubenswrapper[4790]: I0406 12:30:01.348126 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" 
event={"ID":"68adc8df-106b-4049-82b1-e3ad4c500432","Type":"ContainerStarted","Data":"141a34eba8a25bddf0b4c7f328884f3af1a68411a91b6389ee8e62112b0e8dd6"} Apr 06 12:30:01 crc kubenswrapper[4790]: I0406 12:30:01.372123 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" podStartSLOduration=1.372101719 podStartE2EDuration="1.372101719s" podCreationTimestamp="2026-04-06 12:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:30:01.365990279 +0000 UTC m=+1980.353733135" watchObservedRunningTime="2026-04-06 12:30:01.372101719 +0000 UTC m=+1980.359844585" Apr 06 12:30:02 crc kubenswrapper[4790]: I0406 12:30:02.360801 4790 generic.go:334] "Generic (PLEG): container finished" podID="68adc8df-106b-4049-82b1-e3ad4c500432" containerID="286661b6448b30627ccf03fb6fc4e3c8e9eda66d2a1b2e4d20379188cddee549" exitCode=0 Apr 06 12:30:02 crc kubenswrapper[4790]: I0406 12:30:02.360880 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" event={"ID":"68adc8df-106b-4049-82b1-e3ad4c500432","Type":"ContainerDied","Data":"286661b6448b30627ccf03fb6fc4e3c8e9eda66d2a1b2e4d20379188cddee549"} Apr 06 12:30:02 crc kubenswrapper[4790]: I0406 12:30:02.364507 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591310-ws5pl" event={"ID":"4e2f37ed-b06f-41fd-a7c8-75dc97797570","Type":"ContainerStarted","Data":"84e0b799ed805c54b484b868eb01db86ef3ee6fad0c0bf2a7dc9f559fc3ae4b6"} Apr 06 12:30:02 crc kubenswrapper[4790]: I0406 12:30:02.395076 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591310-ws5pl" podStartSLOduration=1.332870143 podStartE2EDuration="2.395016475s" podCreationTimestamp="2026-04-06 12:30:00 +0000 UTC" 
firstStartedPulling="2026-04-06 12:30:00.962304094 +0000 UTC m=+1979.950046950" lastFinishedPulling="2026-04-06 12:30:02.024450416 +0000 UTC m=+1981.012193282" observedRunningTime="2026-04-06 12:30:02.387299813 +0000 UTC m=+1981.375042679" watchObservedRunningTime="2026-04-06 12:30:02.395016475 +0000 UTC m=+1981.382759351" Apr 06 12:30:03 crc kubenswrapper[4790]: I0406 12:30:03.381699 4790 generic.go:334] "Generic (PLEG): container finished" podID="4e2f37ed-b06f-41fd-a7c8-75dc97797570" containerID="84e0b799ed805c54b484b868eb01db86ef3ee6fad0c0bf2a7dc9f559fc3ae4b6" exitCode=0 Apr 06 12:30:03 crc kubenswrapper[4790]: I0406 12:30:03.381808 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591310-ws5pl" event={"ID":"4e2f37ed-b06f-41fd-a7c8-75dc97797570","Type":"ContainerDied","Data":"84e0b799ed805c54b484b868eb01db86ef3ee6fad0c0bf2a7dc9f559fc3ae4b6"} Apr 06 12:30:03 crc kubenswrapper[4790]: I0406 12:30:03.771934 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" Apr 06 12:30:03 crc kubenswrapper[4790]: I0406 12:30:03.837866 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68adc8df-106b-4049-82b1-e3ad4c500432-secret-volume\") pod \"68adc8df-106b-4049-82b1-e3ad4c500432\" (UID: \"68adc8df-106b-4049-82b1-e3ad4c500432\") " Apr 06 12:30:03 crc kubenswrapper[4790]: I0406 12:30:03.838071 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68adc8df-106b-4049-82b1-e3ad4c500432-config-volume\") pod \"68adc8df-106b-4049-82b1-e3ad4c500432\" (UID: \"68adc8df-106b-4049-82b1-e3ad4c500432\") " Apr 06 12:30:03 crc kubenswrapper[4790]: I0406 12:30:03.838116 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6qtd\" (UniqueName: \"kubernetes.io/projected/68adc8df-106b-4049-82b1-e3ad4c500432-kube-api-access-z6qtd\") pod \"68adc8df-106b-4049-82b1-e3ad4c500432\" (UID: \"68adc8df-106b-4049-82b1-e3ad4c500432\") " Apr 06 12:30:03 crc kubenswrapper[4790]: I0406 12:30:03.839707 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68adc8df-106b-4049-82b1-e3ad4c500432-config-volume" (OuterVolumeSpecName: "config-volume") pod "68adc8df-106b-4049-82b1-e3ad4c500432" (UID: "68adc8df-106b-4049-82b1-e3ad4c500432"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:30:03 crc kubenswrapper[4790]: I0406 12:30:03.844263 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68adc8df-106b-4049-82b1-e3ad4c500432-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "68adc8df-106b-4049-82b1-e3ad4c500432" (UID: "68adc8df-106b-4049-82b1-e3ad4c500432"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:30:03 crc kubenswrapper[4790]: I0406 12:30:03.844371 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68adc8df-106b-4049-82b1-e3ad4c500432-kube-api-access-z6qtd" (OuterVolumeSpecName: "kube-api-access-z6qtd") pod "68adc8df-106b-4049-82b1-e3ad4c500432" (UID: "68adc8df-106b-4049-82b1-e3ad4c500432"). InnerVolumeSpecName "kube-api-access-z6qtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:30:03 crc kubenswrapper[4790]: I0406 12:30:03.940495 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68adc8df-106b-4049-82b1-e3ad4c500432-config-volume\") on node \"crc\" DevicePath \"\"" Apr 06 12:30:03 crc kubenswrapper[4790]: I0406 12:30:03.940535 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6qtd\" (UniqueName: \"kubernetes.io/projected/68adc8df-106b-4049-82b1-e3ad4c500432-kube-api-access-z6qtd\") on node \"crc\" DevicePath \"\"" Apr 06 12:30:03 crc kubenswrapper[4790]: I0406 12:30:03.940549 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68adc8df-106b-4049-82b1-e3ad4c500432-secret-volume\") on node \"crc\" DevicePath \"\"" Apr 06 12:30:04 crc kubenswrapper[4790]: I0406 12:30:04.400145 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" Apr 06 12:30:04 crc kubenswrapper[4790]: I0406 12:30:04.404242 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4" event={"ID":"68adc8df-106b-4049-82b1-e3ad4c500432","Type":"ContainerDied","Data":"141a34eba8a25bddf0b4c7f328884f3af1a68411a91b6389ee8e62112b0e8dd6"} Apr 06 12:30:04 crc kubenswrapper[4790]: I0406 12:30:04.404309 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="141a34eba8a25bddf0b4c7f328884f3af1a68411a91b6389ee8e62112b0e8dd6" Apr 06 12:30:04 crc kubenswrapper[4790]: I0406 12:30:04.462774 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k"] Apr 06 12:30:04 crc kubenswrapper[4790]: I0406 12:30:04.475335 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591265-cs56k"] Apr 06 12:30:04 crc kubenswrapper[4790]: I0406 12:30:04.797430 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591310-ws5pl" Apr 06 12:30:04 crc kubenswrapper[4790]: I0406 12:30:04.858232 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm5xs\" (UniqueName: \"kubernetes.io/projected/4e2f37ed-b06f-41fd-a7c8-75dc97797570-kube-api-access-qm5xs\") pod \"4e2f37ed-b06f-41fd-a7c8-75dc97797570\" (UID: \"4e2f37ed-b06f-41fd-a7c8-75dc97797570\") " Apr 06 12:30:04 crc kubenswrapper[4790]: I0406 12:30:04.867443 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2f37ed-b06f-41fd-a7c8-75dc97797570-kube-api-access-qm5xs" (OuterVolumeSpecName: "kube-api-access-qm5xs") pod "4e2f37ed-b06f-41fd-a7c8-75dc97797570" (UID: "4e2f37ed-b06f-41fd-a7c8-75dc97797570"). 
InnerVolumeSpecName "kube-api-access-qm5xs". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:30:04 crc kubenswrapper[4790]: I0406 12:30:04.961040 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm5xs\" (UniqueName: \"kubernetes.io/projected/4e2f37ed-b06f-41fd-a7c8-75dc97797570-kube-api-access-qm5xs\") on node \"crc\" DevicePath \"\"" Apr 06 12:30:05 crc kubenswrapper[4790]: I0406 12:30:05.413377 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591310-ws5pl" event={"ID":"4e2f37ed-b06f-41fd-a7c8-75dc97797570","Type":"ContainerDied","Data":"cce7d6b25ae66fdd07cab4370b7145c2f6b0020e19262a79ebc0a3a7139076f3"} Apr 06 12:30:05 crc kubenswrapper[4790]: I0406 12:30:05.414691 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cce7d6b25ae66fdd07cab4370b7145c2f6b0020e19262a79ebc0a3a7139076f3" Apr 06 12:30:05 crc kubenswrapper[4790]: I0406 12:30:05.413457 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591310-ws5pl" Apr 06 12:30:05 crc kubenswrapper[4790]: I0406 12:30:05.688556 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e42a375c-23c0-471a-8a17-20a03aabc3d4" path="/var/lib/kubelet/pods/e42a375c-23c0-471a-8a17-20a03aabc3d4/volumes" Apr 06 12:30:05 crc kubenswrapper[4790]: I0406 12:30:05.863752 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591304-7wmpw"] Apr 06 12:30:05 crc kubenswrapper[4790]: I0406 12:30:05.873512 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591304-7wmpw"] Apr 06 12:30:07 crc kubenswrapper[4790]: I0406 12:30:07.675400 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:30:07 crc kubenswrapper[4790]: E0406 12:30:07.676000 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:30:07 crc kubenswrapper[4790]: I0406 12:30:07.686345 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="776dcf99-f1e6-4524-9f96-d4a99a6967bb" path="/var/lib/kubelet/pods/776dcf99-f1e6-4524-9f96-d4a99a6967bb/volumes" Apr 06 12:30:14 crc kubenswrapper[4790]: I0406 12:30:14.440638 4790 scope.go:117] "RemoveContainer" containerID="e71495423c8cb23700b72d79b089de3f471d5881c90fb89a79a65531941fe0af" Apr 06 12:30:14 crc kubenswrapper[4790]: I0406 12:30:14.470651 4790 scope.go:117] "RemoveContainer" containerID="4d8b66177ca0666df9c8c5263de19046d9bc7f91b1cb13fa39dde7fbc12caa30" Apr 06 12:30:14 crc kubenswrapper[4790]: 
I0406 12:30:14.550783 4790 scope.go:117] "RemoveContainer" containerID="f5b3c2b9fa81166916164d1a6fea9db181b0d127ff422b62d263430737aef504" Apr 06 12:30:14 crc kubenswrapper[4790]: I0406 12:30:14.576295 4790 scope.go:117] "RemoveContainer" containerID="f41db75fa7be42824ce00db32c51b3b2bdf602d880b510f2a59f074ff7a391ec" Apr 06 12:30:14 crc kubenswrapper[4790]: I0406 12:30:14.619763 4790 scope.go:117] "RemoveContainer" containerID="88eea0ac20de5a38e6001f6fb2b520abedf4aa0cd8cbce5aff3c09912ede7814" Apr 06 12:30:14 crc kubenswrapper[4790]: I0406 12:30:14.678449 4790 scope.go:117] "RemoveContainer" containerID="4fa0f11e13db269fea3ae81a854b65c857be89df17a32203cfac39908bcd84a5" Apr 06 12:30:14 crc kubenswrapper[4790]: I0406 12:30:14.707903 4790 scope.go:117] "RemoveContainer" containerID="a6abd7c3e971f649178b72d283380648567082547a463a3d70656f319c38aa1e" Apr 06 12:30:14 crc kubenswrapper[4790]: I0406 12:30:14.727133 4790 scope.go:117] "RemoveContainer" containerID="0ee97cc20094c81566863c1573165cf9e5c2c8f28993300c1653c2761c87dd9a" Apr 06 12:30:21 crc kubenswrapper[4790]: I0406 12:30:21.049361 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kbdtw"] Apr 06 12:30:21 crc kubenswrapper[4790]: I0406 12:30:21.058433 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kbdtw"] Apr 06 12:30:21 crc kubenswrapper[4790]: I0406 12:30:21.685521 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca65c701-253a-49d5-8ca5-82533b5c0995" path="/var/lib/kubelet/pods/ca65c701-253a-49d5-8ca5-82533b5c0995/volumes" Apr 06 12:30:22 crc kubenswrapper[4790]: I0406 12:30:22.676423 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:30:22 crc kubenswrapper[4790]: E0406 12:30:22.677305 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:30:33 crc kubenswrapper[4790]: I0406 12:30:33.710884 4790 generic.go:334] "Generic (PLEG): container finished" podID="500a529e-70f2-4749-8364-d6a6230b0030" containerID="6a68a8a7257955cbbc345d1bb9b5411318ca29f699e8ed6f381bfd2ca897d8f4" exitCode=0 Apr 06 12:30:33 crc kubenswrapper[4790]: I0406 12:30:33.710972 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" event={"ID":"500a529e-70f2-4749-8364-d6a6230b0030","Type":"ContainerDied","Data":"6a68a8a7257955cbbc345d1bb9b5411318ca29f699e8ed6f381bfd2ca897d8f4"} Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.168806 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.186822 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500a529e-70f2-4749-8364-d6a6230b0030-inventory\") pod \"500a529e-70f2-4749-8364-d6a6230b0030\" (UID: \"500a529e-70f2-4749-8364-d6a6230b0030\") " Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.186915 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/500a529e-70f2-4749-8364-d6a6230b0030-ssh-key-openstack-edpm-ipam\") pod \"500a529e-70f2-4749-8364-d6a6230b0030\" (UID: \"500a529e-70f2-4749-8364-d6a6230b0030\") " Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.186948 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz7f6\" (UniqueName: \"kubernetes.io/projected/500a529e-70f2-4749-8364-d6a6230b0030-kube-api-access-hz7f6\") pod \"500a529e-70f2-4749-8364-d6a6230b0030\" (UID: \"500a529e-70f2-4749-8364-d6a6230b0030\") " Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.194032 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500a529e-70f2-4749-8364-d6a6230b0030-kube-api-access-hz7f6" (OuterVolumeSpecName: "kube-api-access-hz7f6") pod "500a529e-70f2-4749-8364-d6a6230b0030" (UID: "500a529e-70f2-4749-8364-d6a6230b0030"). InnerVolumeSpecName "kube-api-access-hz7f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.224584 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500a529e-70f2-4749-8364-d6a6230b0030-inventory" (OuterVolumeSpecName: "inventory") pod "500a529e-70f2-4749-8364-d6a6230b0030" (UID: "500a529e-70f2-4749-8364-d6a6230b0030"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.245421 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500a529e-70f2-4749-8364-d6a6230b0030-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "500a529e-70f2-4749-8364-d6a6230b0030" (UID: "500a529e-70f2-4749-8364-d6a6230b0030"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.289807 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/500a529e-70f2-4749-8364-d6a6230b0030-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.289873 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/500a529e-70f2-4749-8364-d6a6230b0030-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.289887 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz7f6\" (UniqueName: \"kubernetes.io/projected/500a529e-70f2-4749-8364-d6a6230b0030-kube-api-access-hz7f6\") on node \"crc\" DevicePath \"\"" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.676946 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:30:35 crc kubenswrapper[4790]: E0406 12:30:35.677160 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.733854 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" event={"ID":"500a529e-70f2-4749-8364-d6a6230b0030","Type":"ContainerDied","Data":"578d506126793f525c9a995de7d220b8780f7f997aaafa97cb6b8679fe099f2e"} Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.733916 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="578d506126793f525c9a995de7d220b8780f7f997aaafa97cb6b8679fe099f2e" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.734223 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fph2d" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.887707 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr"] Apr 06 12:30:35 crc kubenswrapper[4790]: E0406 12:30:35.888151 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68adc8df-106b-4049-82b1-e3ad4c500432" containerName="collect-profiles" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.888168 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="68adc8df-106b-4049-82b1-e3ad4c500432" containerName="collect-profiles" Apr 06 12:30:35 crc kubenswrapper[4790]: E0406 12:30:35.888185 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500a529e-70f2-4749-8364-d6a6230b0030" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.888193 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="500a529e-70f2-4749-8364-d6a6230b0030" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Apr 06 12:30:35 crc kubenswrapper[4790]: E0406 12:30:35.888204 4790 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2f37ed-b06f-41fd-a7c8-75dc97797570" containerName="oc" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.888210 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2f37ed-b06f-41fd-a7c8-75dc97797570" containerName="oc" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.888446 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="500a529e-70f2-4749-8364-d6a6230b0030" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.888470 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e2f37ed-b06f-41fd-a7c8-75dc97797570" containerName="oc" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.888494 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="68adc8df-106b-4049-82b1-e3ad4c500432" containerName="collect-profiles" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.890124 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.896267 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.896920 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.897633 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.897872 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.900360 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr"] Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.902987 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f694367f-e4c0-49b1-99f2-f22624011595-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fglxr\" (UID: \"f694367f-e4c0-49b1-99f2-f22624011595\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" Apr 06 12:30:35 crc kubenswrapper[4790]: I0406 12:30:35.903052 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stxlx\" (UniqueName: \"kubernetes.io/projected/f694367f-e4c0-49b1-99f2-f22624011595-kube-api-access-stxlx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fglxr\" (UID: \"f694367f-e4c0-49b1-99f2-f22624011595\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" Apr 06 12:30:35 crc 
kubenswrapper[4790]: I0406 12:30:35.903089 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f694367f-e4c0-49b1-99f2-f22624011595-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fglxr\" (UID: \"f694367f-e4c0-49b1-99f2-f22624011595\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" Apr 06 12:30:36 crc kubenswrapper[4790]: I0406 12:30:36.004127 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f694367f-e4c0-49b1-99f2-f22624011595-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fglxr\" (UID: \"f694367f-e4c0-49b1-99f2-f22624011595\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" Apr 06 12:30:36 crc kubenswrapper[4790]: I0406 12:30:36.004197 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stxlx\" (UniqueName: \"kubernetes.io/projected/f694367f-e4c0-49b1-99f2-f22624011595-kube-api-access-stxlx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fglxr\" (UID: \"f694367f-e4c0-49b1-99f2-f22624011595\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" Apr 06 12:30:36 crc kubenswrapper[4790]: I0406 12:30:36.004233 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f694367f-e4c0-49b1-99f2-f22624011595-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fglxr\" (UID: \"f694367f-e4c0-49b1-99f2-f22624011595\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" Apr 06 12:30:36 crc kubenswrapper[4790]: I0406 12:30:36.007646 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/f694367f-e4c0-49b1-99f2-f22624011595-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fglxr\" (UID: \"f694367f-e4c0-49b1-99f2-f22624011595\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" Apr 06 12:30:36 crc kubenswrapper[4790]: I0406 12:30:36.008136 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f694367f-e4c0-49b1-99f2-f22624011595-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fglxr\" (UID: \"f694367f-e4c0-49b1-99f2-f22624011595\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" Apr 06 12:30:36 crc kubenswrapper[4790]: I0406 12:30:36.025052 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stxlx\" (UniqueName: \"kubernetes.io/projected/f694367f-e4c0-49b1-99f2-f22624011595-kube-api-access-stxlx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fglxr\" (UID: \"f694367f-e4c0-49b1-99f2-f22624011595\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" Apr 06 12:30:36 crc kubenswrapper[4790]: I0406 12:30:36.254032 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" Apr 06 12:30:36 crc kubenswrapper[4790]: I0406 12:30:36.755095 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr"] Apr 06 12:30:37 crc kubenswrapper[4790]: I0406 12:30:37.756541 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" event={"ID":"f694367f-e4c0-49b1-99f2-f22624011595","Type":"ContainerStarted","Data":"5e4220238edadbdc291cc453fe805dc79a3738133e97efafcd6359168dd78640"} Apr 06 12:30:37 crc kubenswrapper[4790]: I0406 12:30:37.757079 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" event={"ID":"f694367f-e4c0-49b1-99f2-f22624011595","Type":"ContainerStarted","Data":"866bc8dc2aeb755deb6187f0d976b162bd0e6b5e7e047a726dccf902d720dd1d"} Apr 06 12:30:37 crc kubenswrapper[4790]: I0406 12:30:37.775074 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" podStartSLOduration=2.279113037 podStartE2EDuration="2.775054464s" podCreationTimestamp="2026-04-06 12:30:35 +0000 UTC" firstStartedPulling="2026-04-06 12:30:36.762012087 +0000 UTC m=+2015.749754963" lastFinishedPulling="2026-04-06 12:30:37.257953524 +0000 UTC m=+2016.245696390" observedRunningTime="2026-04-06 12:30:37.774804938 +0000 UTC m=+2016.762547804" watchObservedRunningTime="2026-04-06 12:30:37.775054464 +0000 UTC m=+2016.762797330" Apr 06 12:30:50 crc kubenswrapper[4790]: I0406 12:30:50.675552 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435" Apr 06 12:30:50 crc kubenswrapper[4790]: I0406 12:30:50.895062 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" 
event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"68fbfcd66871d7ee71b97cab5c0acf8682c2b58d064fbd4573573516aaae2e79"} Apr 06 12:31:14 crc kubenswrapper[4790]: I0406 12:31:14.923948 4790 scope.go:117] "RemoveContainer" containerID="d21c59dc518f6b1f44551f6b64b48b7a7d8c6393ba5a8eff31278f1aa2cb411a" Apr 06 12:31:19 crc kubenswrapper[4790]: I0406 12:31:19.045465 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-s89q4"] Apr 06 12:31:19 crc kubenswrapper[4790]: I0406 12:31:19.057463 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-s89q4"] Apr 06 12:31:19 crc kubenswrapper[4790]: I0406 12:31:19.685869 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ff20ebb-f795-4d0c-8836-ede867897d49" path="/var/lib/kubelet/pods/6ff20ebb-f795-4d0c-8836-ede867897d49/volumes" Apr 06 12:31:20 crc kubenswrapper[4790]: I0406 12:31:20.031306 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dtb78"] Apr 06 12:31:20 crc kubenswrapper[4790]: I0406 12:31:20.040764 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dtb78"] Apr 06 12:31:21 crc kubenswrapper[4790]: I0406 12:31:21.687090 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f21193-0d83-4045-9359-ec1228ed6d34" path="/var/lib/kubelet/pods/b1f21193-0d83-4045-9359-ec1228ed6d34/volumes" Apr 06 12:31:27 crc kubenswrapper[4790]: I0406 12:31:27.252911 4790 generic.go:334] "Generic (PLEG): container finished" podID="f694367f-e4c0-49b1-99f2-f22624011595" containerID="5e4220238edadbdc291cc453fe805dc79a3738133e97efafcd6359168dd78640" exitCode=0 Apr 06 12:31:27 crc kubenswrapper[4790]: I0406 12:31:27.253001 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" 
event={"ID":"f694367f-e4c0-49b1-99f2-f22624011595","Type":"ContainerDied","Data":"5e4220238edadbdc291cc453fe805dc79a3738133e97efafcd6359168dd78640"} Apr 06 12:31:28 crc kubenswrapper[4790]: I0406 12:31:28.807655 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" Apr 06 12:31:28 crc kubenswrapper[4790]: I0406 12:31:28.962157 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f694367f-e4c0-49b1-99f2-f22624011595-inventory\") pod \"f694367f-e4c0-49b1-99f2-f22624011595\" (UID: \"f694367f-e4c0-49b1-99f2-f22624011595\") " Apr 06 12:31:28 crc kubenswrapper[4790]: I0406 12:31:28.962457 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f694367f-e4c0-49b1-99f2-f22624011595-ssh-key-openstack-edpm-ipam\") pod \"f694367f-e4c0-49b1-99f2-f22624011595\" (UID: \"f694367f-e4c0-49b1-99f2-f22624011595\") " Apr 06 12:31:28 crc kubenswrapper[4790]: I0406 12:31:28.962563 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stxlx\" (UniqueName: \"kubernetes.io/projected/f694367f-e4c0-49b1-99f2-f22624011595-kube-api-access-stxlx\") pod \"f694367f-e4c0-49b1-99f2-f22624011595\" (UID: \"f694367f-e4c0-49b1-99f2-f22624011595\") " Apr 06 12:31:28 crc kubenswrapper[4790]: I0406 12:31:28.976248 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f694367f-e4c0-49b1-99f2-f22624011595-kube-api-access-stxlx" (OuterVolumeSpecName: "kube-api-access-stxlx") pod "f694367f-e4c0-49b1-99f2-f22624011595" (UID: "f694367f-e4c0-49b1-99f2-f22624011595"). InnerVolumeSpecName "kube-api-access-stxlx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.002198 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f694367f-e4c0-49b1-99f2-f22624011595-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f694367f-e4c0-49b1-99f2-f22624011595" (UID: "f694367f-e4c0-49b1-99f2-f22624011595"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.005556 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f694367f-e4c0-49b1-99f2-f22624011595-inventory" (OuterVolumeSpecName: "inventory") pod "f694367f-e4c0-49b1-99f2-f22624011595" (UID: "f694367f-e4c0-49b1-99f2-f22624011595"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.065306 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f694367f-e4c0-49b1-99f2-f22624011595-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.065544 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stxlx\" (UniqueName: \"kubernetes.io/projected/f694367f-e4c0-49b1-99f2-f22624011595-kube-api-access-stxlx\") on node \"crc\" DevicePath \"\"" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.065704 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f694367f-e4c0-49b1-99f2-f22624011595-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.274362 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" 
event={"ID":"f694367f-e4c0-49b1-99f2-f22624011595","Type":"ContainerDied","Data":"866bc8dc2aeb755deb6187f0d976b162bd0e6b5e7e047a726dccf902d720dd1d"} Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.274632 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="866bc8dc2aeb755deb6187f0d976b162bd0e6b5e7e047a726dccf902d720dd1d" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.274703 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fglxr" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.377433 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2w9cm"] Apr 06 12:31:29 crc kubenswrapper[4790]: E0406 12:31:29.378020 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f694367f-e4c0-49b1-99f2-f22624011595" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.378091 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f694367f-e4c0-49b1-99f2-f22624011595" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.378344 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f694367f-e4c0-49b1-99f2-f22624011595" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.379246 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.381600 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.381975 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.382170 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.382404 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.390355 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2w9cm"] Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.474190 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bfxj\" (UniqueName: \"kubernetes.io/projected/28b0126b-8513-425d-8079-b68b9cb73bdc-kube-api-access-4bfxj\") pod \"ssh-known-hosts-edpm-deployment-2w9cm\" (UID: \"28b0126b-8513-425d-8079-b68b9cb73bdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.474249 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b0126b-8513-425d-8079-b68b9cb73bdc-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2w9cm\" (UID: \"28b0126b-8513-425d-8079-b68b9cb73bdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.474318 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28b0126b-8513-425d-8079-b68b9cb73bdc-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2w9cm\" (UID: \"28b0126b-8513-425d-8079-b68b9cb73bdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.576247 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bfxj\" (UniqueName: \"kubernetes.io/projected/28b0126b-8513-425d-8079-b68b9cb73bdc-kube-api-access-4bfxj\") pod \"ssh-known-hosts-edpm-deployment-2w9cm\" (UID: \"28b0126b-8513-425d-8079-b68b9cb73bdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.576534 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b0126b-8513-425d-8079-b68b9cb73bdc-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2w9cm\" (UID: \"28b0126b-8513-425d-8079-b68b9cb73bdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.576642 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28b0126b-8513-425d-8079-b68b9cb73bdc-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2w9cm\" (UID: \"28b0126b-8513-425d-8079-b68b9cb73bdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.583689 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28b0126b-8513-425d-8079-b68b9cb73bdc-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2w9cm\" (UID: \"28b0126b-8513-425d-8079-b68b9cb73bdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" Apr 06 12:31:29 crc 
kubenswrapper[4790]: I0406 12:31:29.584118 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b0126b-8513-425d-8079-b68b9cb73bdc-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2w9cm\" (UID: \"28b0126b-8513-425d-8079-b68b9cb73bdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.604461 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bfxj\" (UniqueName: \"kubernetes.io/projected/28b0126b-8513-425d-8079-b68b9cb73bdc-kube-api-access-4bfxj\") pod \"ssh-known-hosts-edpm-deployment-2w9cm\" (UID: \"28b0126b-8513-425d-8079-b68b9cb73bdc\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" Apr 06 12:31:29 crc kubenswrapper[4790]: I0406 12:31:29.700081 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" Apr 06 12:31:30 crc kubenswrapper[4790]: I0406 12:31:30.230533 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2w9cm"] Apr 06 12:31:30 crc kubenswrapper[4790]: W0406 12:31:30.241032 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28b0126b_8513_425d_8079_b68b9cb73bdc.slice/crio-fa3bddc51faaebba77f7622cae8bc331cf59e1860bdbd7bb6d2c94407b2d48fa WatchSource:0}: Error finding container fa3bddc51faaebba77f7622cae8bc331cf59e1860bdbd7bb6d2c94407b2d48fa: Status 404 returned error can't find the container with id fa3bddc51faaebba77f7622cae8bc331cf59e1860bdbd7bb6d2c94407b2d48fa Apr 06 12:31:30 crc kubenswrapper[4790]: I0406 12:31:30.283325 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" 
event={"ID":"28b0126b-8513-425d-8079-b68b9cb73bdc","Type":"ContainerStarted","Data":"fa3bddc51faaebba77f7622cae8bc331cf59e1860bdbd7bb6d2c94407b2d48fa"} Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.302618 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" event={"ID":"28b0126b-8513-425d-8079-b68b9cb73bdc","Type":"ContainerStarted","Data":"a4a2448487e74ec7718664ee644e9872d7c1d5e598cc031088bcb1fa6f076c9c"} Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.348199 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" podStartSLOduration=1.9443491910000001 podStartE2EDuration="2.348167926s" podCreationTimestamp="2026-04-06 12:31:29 +0000 UTC" firstStartedPulling="2026-04-06 12:31:30.245663445 +0000 UTC m=+2069.233406311" lastFinishedPulling="2026-04-06 12:31:30.64948218 +0000 UTC m=+2069.637225046" observedRunningTime="2026-04-06 12:31:31.332870474 +0000 UTC m=+2070.320613350" watchObservedRunningTime="2026-04-06 12:31:31.348167926 +0000 UTC m=+2070.335910812" Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.613641 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gnbsk"] Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.615704 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.639994 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79548e6-839d-41e5-a3df-1ad88f594be2-utilities\") pod \"redhat-marketplace-gnbsk\" (UID: \"c79548e6-839d-41e5-a3df-1ad88f594be2\") " pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.640088 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9f59\" (UniqueName: \"kubernetes.io/projected/c79548e6-839d-41e5-a3df-1ad88f594be2-kube-api-access-l9f59\") pod \"redhat-marketplace-gnbsk\" (UID: \"c79548e6-839d-41e5-a3df-1ad88f594be2\") " pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.640204 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79548e6-839d-41e5-a3df-1ad88f594be2-catalog-content\") pod \"redhat-marketplace-gnbsk\" (UID: \"c79548e6-839d-41e5-a3df-1ad88f594be2\") " pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.645595 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gnbsk"] Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.742522 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79548e6-839d-41e5-a3df-1ad88f594be2-catalog-content\") pod \"redhat-marketplace-gnbsk\" (UID: \"c79548e6-839d-41e5-a3df-1ad88f594be2\") " pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.743316 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79548e6-839d-41e5-a3df-1ad88f594be2-catalog-content\") pod \"redhat-marketplace-gnbsk\" (UID: \"c79548e6-839d-41e5-a3df-1ad88f594be2\") " pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.743560 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79548e6-839d-41e5-a3df-1ad88f594be2-utilities\") pod \"redhat-marketplace-gnbsk\" (UID: \"c79548e6-839d-41e5-a3df-1ad88f594be2\") " pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.742924 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79548e6-839d-41e5-a3df-1ad88f594be2-utilities\") pod \"redhat-marketplace-gnbsk\" (UID: \"c79548e6-839d-41e5-a3df-1ad88f594be2\") " pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.745084 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9f59\" (UniqueName: \"kubernetes.io/projected/c79548e6-839d-41e5-a3df-1ad88f594be2-kube-api-access-l9f59\") pod \"redhat-marketplace-gnbsk\" (UID: \"c79548e6-839d-41e5-a3df-1ad88f594be2\") " pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.770565 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9f59\" (UniqueName: \"kubernetes.io/projected/c79548e6-839d-41e5-a3df-1ad88f594be2-kube-api-access-l9f59\") pod \"redhat-marketplace-gnbsk\" (UID: \"c79548e6-839d-41e5-a3df-1ad88f594be2\") " pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:31 crc kubenswrapper[4790]: I0406 12:31:31.934912 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:32 crc kubenswrapper[4790]: I0406 12:31:32.461685 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gnbsk"] Apr 06 12:31:32 crc kubenswrapper[4790]: W0406 12:31:32.471954 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc79548e6_839d_41e5_a3df_1ad88f594be2.slice/crio-29e0715b0593c609a99905c4bc917a42ef31a96eea1e1fb8e50f8f281c986da3 WatchSource:0}: Error finding container 29e0715b0593c609a99905c4bc917a42ef31a96eea1e1fb8e50f8f281c986da3: Status 404 returned error can't find the container with id 29e0715b0593c609a99905c4bc917a42ef31a96eea1e1fb8e50f8f281c986da3 Apr 06 12:31:33 crc kubenswrapper[4790]: I0406 12:31:33.327925 4790 generic.go:334] "Generic (PLEG): container finished" podID="c79548e6-839d-41e5-a3df-1ad88f594be2" containerID="d184fc3a554ce5c95132bda9816c9e2b7d9fe508eb6c3a2f81cba7ef1120bfe1" exitCode=0 Apr 06 12:31:33 crc kubenswrapper[4790]: I0406 12:31:33.327993 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gnbsk" event={"ID":"c79548e6-839d-41e5-a3df-1ad88f594be2","Type":"ContainerDied","Data":"d184fc3a554ce5c95132bda9816c9e2b7d9fe508eb6c3a2f81cba7ef1120bfe1"} Apr 06 12:31:33 crc kubenswrapper[4790]: I0406 12:31:33.328278 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gnbsk" event={"ID":"c79548e6-839d-41e5-a3df-1ad88f594be2","Type":"ContainerStarted","Data":"29e0715b0593c609a99905c4bc917a42ef31a96eea1e1fb8e50f8f281c986da3"} Apr 06 12:31:34 crc kubenswrapper[4790]: I0406 12:31:34.342096 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gnbsk" 
event={"ID":"c79548e6-839d-41e5-a3df-1ad88f594be2","Type":"ContainerStarted","Data":"0fb73527cf4eec75c98c44bfe22c9a1ac20e044f71d4632e96f85b9856741903"} Apr 06 12:31:35 crc kubenswrapper[4790]: I0406 12:31:35.354057 4790 generic.go:334] "Generic (PLEG): container finished" podID="c79548e6-839d-41e5-a3df-1ad88f594be2" containerID="0fb73527cf4eec75c98c44bfe22c9a1ac20e044f71d4632e96f85b9856741903" exitCode=0 Apr 06 12:31:35 crc kubenswrapper[4790]: I0406 12:31:35.354145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gnbsk" event={"ID":"c79548e6-839d-41e5-a3df-1ad88f594be2","Type":"ContainerDied","Data":"0fb73527cf4eec75c98c44bfe22c9a1ac20e044f71d4632e96f85b9856741903"} Apr 06 12:31:35 crc kubenswrapper[4790]: I0406 12:31:35.355519 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gnbsk" event={"ID":"c79548e6-839d-41e5-a3df-1ad88f594be2","Type":"ContainerStarted","Data":"794ba02e8a8b2c150e14286744cddc45c37911bec0e7487dcfb309307dc0d382"} Apr 06 12:31:35 crc kubenswrapper[4790]: I0406 12:31:35.372548 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gnbsk" podStartSLOduration=2.941403722 podStartE2EDuration="4.372529161s" podCreationTimestamp="2026-04-06 12:31:31 +0000 UTC" firstStartedPulling="2026-04-06 12:31:33.331494122 +0000 UTC m=+2072.319236988" lastFinishedPulling="2026-04-06 12:31:34.762619561 +0000 UTC m=+2073.750362427" observedRunningTime="2026-04-06 12:31:35.369300768 +0000 UTC m=+2074.357043634" watchObservedRunningTime="2026-04-06 12:31:35.372529161 +0000 UTC m=+2074.360272027" Apr 06 12:31:37 crc kubenswrapper[4790]: I0406 12:31:37.373779 4790 generic.go:334] "Generic (PLEG): container finished" podID="28b0126b-8513-425d-8079-b68b9cb73bdc" containerID="a4a2448487e74ec7718664ee644e9872d7c1d5e598cc031088bcb1fa6f076c9c" exitCode=0 Apr 06 12:31:37 crc kubenswrapper[4790]: I0406 
12:31:37.373882 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" event={"ID":"28b0126b-8513-425d-8079-b68b9cb73bdc","Type":"ContainerDied","Data":"a4a2448487e74ec7718664ee644e9872d7c1d5e598cc031088bcb1fa6f076c9c"} Apr 06 12:31:38 crc kubenswrapper[4790]: I0406 12:31:38.799173 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" Apr 06 12:31:38 crc kubenswrapper[4790]: I0406 12:31:38.888655 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b0126b-8513-425d-8079-b68b9cb73bdc-ssh-key-openstack-edpm-ipam\") pod \"28b0126b-8513-425d-8079-b68b9cb73bdc\" (UID: \"28b0126b-8513-425d-8079-b68b9cb73bdc\") " Apr 06 12:31:38 crc kubenswrapper[4790]: I0406 12:31:38.888790 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28b0126b-8513-425d-8079-b68b9cb73bdc-inventory-0\") pod \"28b0126b-8513-425d-8079-b68b9cb73bdc\" (UID: \"28b0126b-8513-425d-8079-b68b9cb73bdc\") " Apr 06 12:31:38 crc kubenswrapper[4790]: I0406 12:31:38.888889 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bfxj\" (UniqueName: \"kubernetes.io/projected/28b0126b-8513-425d-8079-b68b9cb73bdc-kube-api-access-4bfxj\") pod \"28b0126b-8513-425d-8079-b68b9cb73bdc\" (UID: \"28b0126b-8513-425d-8079-b68b9cb73bdc\") " Apr 06 12:31:38 crc kubenswrapper[4790]: I0406 12:31:38.896302 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28b0126b-8513-425d-8079-b68b9cb73bdc-kube-api-access-4bfxj" (OuterVolumeSpecName: "kube-api-access-4bfxj") pod "28b0126b-8513-425d-8079-b68b9cb73bdc" (UID: "28b0126b-8513-425d-8079-b68b9cb73bdc"). InnerVolumeSpecName "kube-api-access-4bfxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:31:38 crc kubenswrapper[4790]: I0406 12:31:38.923146 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b0126b-8513-425d-8079-b68b9cb73bdc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "28b0126b-8513-425d-8079-b68b9cb73bdc" (UID: "28b0126b-8513-425d-8079-b68b9cb73bdc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:31:38 crc kubenswrapper[4790]: I0406 12:31:38.927512 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b0126b-8513-425d-8079-b68b9cb73bdc-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "28b0126b-8513-425d-8079-b68b9cb73bdc" (UID: "28b0126b-8513-425d-8079-b68b9cb73bdc"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:31:38 crc kubenswrapper[4790]: I0406 12:31:38.990574 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bfxj\" (UniqueName: \"kubernetes.io/projected/28b0126b-8513-425d-8079-b68b9cb73bdc-kube-api-access-4bfxj\") on node \"crc\" DevicePath \"\"" Apr 06 12:31:38 crc kubenswrapper[4790]: I0406 12:31:38.990604 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28b0126b-8513-425d-8079-b68b9cb73bdc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:31:38 crc kubenswrapper[4790]: I0406 12:31:38.990614 4790 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28b0126b-8513-425d-8079-b68b9cb73bdc-inventory-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.394900 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.394888 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2w9cm" event={"ID":"28b0126b-8513-425d-8079-b68b9cb73bdc","Type":"ContainerDied","Data":"fa3bddc51faaebba77f7622cae8bc331cf59e1860bdbd7bb6d2c94407b2d48fa"} Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.395304 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa3bddc51faaebba77f7622cae8bc331cf59e1860bdbd7bb6d2c94407b2d48fa" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.471128 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp"] Apr 06 12:31:39 crc kubenswrapper[4790]: E0406 12:31:39.471562 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b0126b-8513-425d-8079-b68b9cb73bdc" containerName="ssh-known-hosts-edpm-deployment" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.471577 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b0126b-8513-425d-8079-b68b9cb73bdc" containerName="ssh-known-hosts-edpm-deployment" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.471793 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="28b0126b-8513-425d-8079-b68b9cb73bdc" containerName="ssh-known-hosts-edpm-deployment" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.472584 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.477086 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.477333 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.477632 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.479697 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.483122 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp"] Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.497083 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cea3a85-493c-4732-9954-6a690708c4d1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmnvp\" (UID: \"2cea3a85-493c-4732-9954-6a690708c4d1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.497137 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cea3a85-493c-4732-9954-6a690708c4d1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmnvp\" (UID: \"2cea3a85-493c-4732-9954-6a690708c4d1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.497257 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mr88\" (UniqueName: \"kubernetes.io/projected/2cea3a85-493c-4732-9954-6a690708c4d1-kube-api-access-6mr88\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmnvp\" (UID: \"2cea3a85-493c-4732-9954-6a690708c4d1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.598708 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cea3a85-493c-4732-9954-6a690708c4d1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmnvp\" (UID: \"2cea3a85-493c-4732-9954-6a690708c4d1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.599008 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cea3a85-493c-4732-9954-6a690708c4d1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmnvp\" (UID: \"2cea3a85-493c-4732-9954-6a690708c4d1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.599369 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mr88\" (UniqueName: \"kubernetes.io/projected/2cea3a85-493c-4732-9954-6a690708c4d1-kube-api-access-6mr88\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmnvp\" (UID: \"2cea3a85-493c-4732-9954-6a690708c4d1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.609744 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cea3a85-493c-4732-9954-6a690708c4d1-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-pmnvp\" (UID: \"2cea3a85-493c-4732-9954-6a690708c4d1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.609762 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cea3a85-493c-4732-9954-6a690708c4d1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmnvp\" (UID: \"2cea3a85-493c-4732-9954-6a690708c4d1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.616237 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mr88\" (UniqueName: \"kubernetes.io/projected/2cea3a85-493c-4732-9954-6a690708c4d1-kube-api-access-6mr88\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pmnvp\" (UID: \"2cea3a85-493c-4732-9954-6a690708c4d1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" Apr 06 12:31:39 crc kubenswrapper[4790]: I0406 12:31:39.805026 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" Apr 06 12:31:40 crc kubenswrapper[4790]: I0406 12:31:40.369232 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp"] Apr 06 12:31:40 crc kubenswrapper[4790]: I0406 12:31:40.407878 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" event={"ID":"2cea3a85-493c-4732-9954-6a690708c4d1","Type":"ContainerStarted","Data":"5a14730fe6df067359c1def4a487de0cd2153cdc0d146b35c562e92cdeee5027"} Apr 06 12:31:41 crc kubenswrapper[4790]: I0406 12:31:41.441083 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" event={"ID":"2cea3a85-493c-4732-9954-6a690708c4d1","Type":"ContainerStarted","Data":"173863cc953822bc39a4ddc3430dcd151574b69d9c0942c431669eb00a0cbc17"} Apr 06 12:31:41 crc kubenswrapper[4790]: I0406 12:31:41.464821 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" podStartSLOduration=1.993969989 podStartE2EDuration="2.464803243s" podCreationTimestamp="2026-04-06 12:31:39 +0000 UTC" firstStartedPulling="2026-04-06 12:31:40.360784944 +0000 UTC m=+2079.348527810" lastFinishedPulling="2026-04-06 12:31:40.831618168 +0000 UTC m=+2079.819361064" observedRunningTime="2026-04-06 12:31:41.456654482 +0000 UTC m=+2080.444397368" watchObservedRunningTime="2026-04-06 12:31:41.464803243 +0000 UTC m=+2080.452546099" Apr 06 12:31:41 crc kubenswrapper[4790]: I0406 12:31:41.935682 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:41 crc kubenswrapper[4790]: I0406 12:31:41.937370 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:41 crc 
kubenswrapper[4790]: I0406 12:31:41.989673 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:42 crc kubenswrapper[4790]: I0406 12:31:42.525203 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:42 crc kubenswrapper[4790]: I0406 12:31:42.599220 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gnbsk"] Apr 06 12:31:44 crc kubenswrapper[4790]: I0406 12:31:44.474596 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gnbsk" podUID="c79548e6-839d-41e5-a3df-1ad88f594be2" containerName="registry-server" containerID="cri-o://794ba02e8a8b2c150e14286744cddc45c37911bec0e7487dcfb309307dc0d382" gracePeriod=2 Apr 06 12:31:44 crc kubenswrapper[4790]: I0406 12:31:44.919724 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.029854 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79548e6-839d-41e5-a3df-1ad88f594be2-utilities\") pod \"c79548e6-839d-41e5-a3df-1ad88f594be2\" (UID: \"c79548e6-839d-41e5-a3df-1ad88f594be2\") " Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.030239 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9f59\" (UniqueName: \"kubernetes.io/projected/c79548e6-839d-41e5-a3df-1ad88f594be2-kube-api-access-l9f59\") pod \"c79548e6-839d-41e5-a3df-1ad88f594be2\" (UID: \"c79548e6-839d-41e5-a3df-1ad88f594be2\") " Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.030335 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79548e6-839d-41e5-a3df-1ad88f594be2-catalog-content\") pod \"c79548e6-839d-41e5-a3df-1ad88f594be2\" (UID: \"c79548e6-839d-41e5-a3df-1ad88f594be2\") " Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.030418 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c79548e6-839d-41e5-a3df-1ad88f594be2-utilities" (OuterVolumeSpecName: "utilities") pod "c79548e6-839d-41e5-a3df-1ad88f594be2" (UID: "c79548e6-839d-41e5-a3df-1ad88f594be2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.030766 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79548e6-839d-41e5-a3df-1ad88f594be2-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.036042 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c79548e6-839d-41e5-a3df-1ad88f594be2-kube-api-access-l9f59" (OuterVolumeSpecName: "kube-api-access-l9f59") pod "c79548e6-839d-41e5-a3df-1ad88f594be2" (UID: "c79548e6-839d-41e5-a3df-1ad88f594be2"). InnerVolumeSpecName "kube-api-access-l9f59". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.054698 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c79548e6-839d-41e5-a3df-1ad88f594be2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c79548e6-839d-41e5-a3df-1ad88f594be2" (UID: "c79548e6-839d-41e5-a3df-1ad88f594be2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.133013 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79548e6-839d-41e5-a3df-1ad88f594be2-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.133052 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9f59\" (UniqueName: \"kubernetes.io/projected/c79548e6-839d-41e5-a3df-1ad88f594be2-kube-api-access-l9f59\") on node \"crc\" DevicePath \"\"" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.486757 4790 generic.go:334] "Generic (PLEG): container finished" podID="c79548e6-839d-41e5-a3df-1ad88f594be2" containerID="794ba02e8a8b2c150e14286744cddc45c37911bec0e7487dcfb309307dc0d382" exitCode=0 Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.486797 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gnbsk" event={"ID":"c79548e6-839d-41e5-a3df-1ad88f594be2","Type":"ContainerDied","Data":"794ba02e8a8b2c150e14286744cddc45c37911bec0e7487dcfb309307dc0d382"} Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.486821 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gnbsk" event={"ID":"c79548e6-839d-41e5-a3df-1ad88f594be2","Type":"ContainerDied","Data":"29e0715b0593c609a99905c4bc917a42ef31a96eea1e1fb8e50f8f281c986da3"} Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.486856 4790 scope.go:117] "RemoveContainer" containerID="794ba02e8a8b2c150e14286744cddc45c37911bec0e7487dcfb309307dc0d382" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.486964 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gnbsk" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.528640 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gnbsk"] Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.540761 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gnbsk"] Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.548136 4790 scope.go:117] "RemoveContainer" containerID="0fb73527cf4eec75c98c44bfe22c9a1ac20e044f71d4632e96f85b9856741903" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.585528 4790 scope.go:117] "RemoveContainer" containerID="d184fc3a554ce5c95132bda9816c9e2b7d9fe508eb6c3a2f81cba7ef1120bfe1" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.626652 4790 scope.go:117] "RemoveContainer" containerID="794ba02e8a8b2c150e14286744cddc45c37911bec0e7487dcfb309307dc0d382" Apr 06 12:31:45 crc kubenswrapper[4790]: E0406 12:31:45.627178 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"794ba02e8a8b2c150e14286744cddc45c37911bec0e7487dcfb309307dc0d382\": container with ID starting with 794ba02e8a8b2c150e14286744cddc45c37911bec0e7487dcfb309307dc0d382 not found: ID does not exist" containerID="794ba02e8a8b2c150e14286744cddc45c37911bec0e7487dcfb309307dc0d382" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.627209 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"794ba02e8a8b2c150e14286744cddc45c37911bec0e7487dcfb309307dc0d382"} err="failed to get container status \"794ba02e8a8b2c150e14286744cddc45c37911bec0e7487dcfb309307dc0d382\": rpc error: code = NotFound desc = could not find container \"794ba02e8a8b2c150e14286744cddc45c37911bec0e7487dcfb309307dc0d382\": container with ID starting with 794ba02e8a8b2c150e14286744cddc45c37911bec0e7487dcfb309307dc0d382 not found: 
ID does not exist" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.627232 4790 scope.go:117] "RemoveContainer" containerID="0fb73527cf4eec75c98c44bfe22c9a1ac20e044f71d4632e96f85b9856741903" Apr 06 12:31:45 crc kubenswrapper[4790]: E0406 12:31:45.627739 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fb73527cf4eec75c98c44bfe22c9a1ac20e044f71d4632e96f85b9856741903\": container with ID starting with 0fb73527cf4eec75c98c44bfe22c9a1ac20e044f71d4632e96f85b9856741903 not found: ID does not exist" containerID="0fb73527cf4eec75c98c44bfe22c9a1ac20e044f71d4632e96f85b9856741903" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.627772 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb73527cf4eec75c98c44bfe22c9a1ac20e044f71d4632e96f85b9856741903"} err="failed to get container status \"0fb73527cf4eec75c98c44bfe22c9a1ac20e044f71d4632e96f85b9856741903\": rpc error: code = NotFound desc = could not find container \"0fb73527cf4eec75c98c44bfe22c9a1ac20e044f71d4632e96f85b9856741903\": container with ID starting with 0fb73527cf4eec75c98c44bfe22c9a1ac20e044f71d4632e96f85b9856741903 not found: ID does not exist" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.627787 4790 scope.go:117] "RemoveContainer" containerID="d184fc3a554ce5c95132bda9816c9e2b7d9fe508eb6c3a2f81cba7ef1120bfe1" Apr 06 12:31:45 crc kubenswrapper[4790]: E0406 12:31:45.628249 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d184fc3a554ce5c95132bda9816c9e2b7d9fe508eb6c3a2f81cba7ef1120bfe1\": container with ID starting with d184fc3a554ce5c95132bda9816c9e2b7d9fe508eb6c3a2f81cba7ef1120bfe1 not found: ID does not exist" containerID="d184fc3a554ce5c95132bda9816c9e2b7d9fe508eb6c3a2f81cba7ef1120bfe1" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.628370 4790 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d184fc3a554ce5c95132bda9816c9e2b7d9fe508eb6c3a2f81cba7ef1120bfe1"} err="failed to get container status \"d184fc3a554ce5c95132bda9816c9e2b7d9fe508eb6c3a2f81cba7ef1120bfe1\": rpc error: code = NotFound desc = could not find container \"d184fc3a554ce5c95132bda9816c9e2b7d9fe508eb6c3a2f81cba7ef1120bfe1\": container with ID starting with d184fc3a554ce5c95132bda9816c9e2b7d9fe508eb6c3a2f81cba7ef1120bfe1 not found: ID does not exist" Apr 06 12:31:45 crc kubenswrapper[4790]: I0406 12:31:45.688159 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c79548e6-839d-41e5-a3df-1ad88f594be2" path="/var/lib/kubelet/pods/c79548e6-839d-41e5-a3df-1ad88f594be2/volumes" Apr 06 12:31:49 crc kubenswrapper[4790]: I0406 12:31:49.524310 4790 generic.go:334] "Generic (PLEG): container finished" podID="2cea3a85-493c-4732-9954-6a690708c4d1" containerID="173863cc953822bc39a4ddc3430dcd151574b69d9c0942c431669eb00a0cbc17" exitCode=0 Apr 06 12:31:49 crc kubenswrapper[4790]: I0406 12:31:49.524377 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" event={"ID":"2cea3a85-493c-4732-9954-6a690708c4d1","Type":"ContainerDied","Data":"173863cc953822bc39a4ddc3430dcd151574b69d9c0942c431669eb00a0cbc17"} Apr 06 12:31:50 crc kubenswrapper[4790]: I0406 12:31:50.914332 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" Apr 06 12:31:50 crc kubenswrapper[4790]: I0406 12:31:50.976252 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mr88\" (UniqueName: \"kubernetes.io/projected/2cea3a85-493c-4732-9954-6a690708c4d1-kube-api-access-6mr88\") pod \"2cea3a85-493c-4732-9954-6a690708c4d1\" (UID: \"2cea3a85-493c-4732-9954-6a690708c4d1\") " Apr 06 12:31:50 crc kubenswrapper[4790]: I0406 12:31:50.976386 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cea3a85-493c-4732-9954-6a690708c4d1-ssh-key-openstack-edpm-ipam\") pod \"2cea3a85-493c-4732-9954-6a690708c4d1\" (UID: \"2cea3a85-493c-4732-9954-6a690708c4d1\") " Apr 06 12:31:50 crc kubenswrapper[4790]: I0406 12:31:50.976451 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cea3a85-493c-4732-9954-6a690708c4d1-inventory\") pod \"2cea3a85-493c-4732-9954-6a690708c4d1\" (UID: \"2cea3a85-493c-4732-9954-6a690708c4d1\") " Apr 06 12:31:50 crc kubenswrapper[4790]: I0406 12:31:50.982930 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cea3a85-493c-4732-9954-6a690708c4d1-kube-api-access-6mr88" (OuterVolumeSpecName: "kube-api-access-6mr88") pod "2cea3a85-493c-4732-9954-6a690708c4d1" (UID: "2cea3a85-493c-4732-9954-6a690708c4d1"). InnerVolumeSpecName "kube-api-access-6mr88". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.014850 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cea3a85-493c-4732-9954-6a690708c4d1-inventory" (OuterVolumeSpecName: "inventory") pod "2cea3a85-493c-4732-9954-6a690708c4d1" (UID: "2cea3a85-493c-4732-9954-6a690708c4d1"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.020378 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cea3a85-493c-4732-9954-6a690708c4d1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2cea3a85-493c-4732-9954-6a690708c4d1" (UID: "2cea3a85-493c-4732-9954-6a690708c4d1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.078947 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mr88\" (UniqueName: \"kubernetes.io/projected/2cea3a85-493c-4732-9954-6a690708c4d1-kube-api-access-6mr88\") on node \"crc\" DevicePath \"\"" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.078982 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cea3a85-493c-4732-9954-6a690708c4d1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.078992 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cea3a85-493c-4732-9954-6a690708c4d1-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.548536 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" event={"ID":"2cea3a85-493c-4732-9954-6a690708c4d1","Type":"ContainerDied","Data":"5a14730fe6df067359c1def4a487de0cd2153cdc0d146b35c562e92cdeee5027"} Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.548888 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a14730fe6df067359c1def4a487de0cd2153cdc0d146b35c562e92cdeee5027" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 
12:31:51.548918 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pmnvp" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.628779 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s"] Apr 06 12:31:51 crc kubenswrapper[4790]: E0406 12:31:51.629292 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79548e6-839d-41e5-a3df-1ad88f594be2" containerName="extract-content" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.629313 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79548e6-839d-41e5-a3df-1ad88f594be2" containerName="extract-content" Apr 06 12:31:51 crc kubenswrapper[4790]: E0406 12:31:51.629340 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79548e6-839d-41e5-a3df-1ad88f594be2" containerName="extract-utilities" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.629349 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79548e6-839d-41e5-a3df-1ad88f594be2" containerName="extract-utilities" Apr 06 12:31:51 crc kubenswrapper[4790]: E0406 12:31:51.629369 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cea3a85-493c-4732-9954-6a690708c4d1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.629379 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cea3a85-493c-4732-9954-6a690708c4d1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Apr 06 12:31:51 crc kubenswrapper[4790]: E0406 12:31:51.629398 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79548e6-839d-41e5-a3df-1ad88f594be2" containerName="registry-server" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.629408 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79548e6-839d-41e5-a3df-1ad88f594be2" containerName="registry-server" Apr 06 
12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.629657 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cea3a85-493c-4732-9954-6a690708c4d1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.629673 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c79548e6-839d-41e5-a3df-1ad88f594be2" containerName="registry-server" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.630466 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.639870 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s"] Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.659608 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.659650 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.659691 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.659811 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.691469 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5ssq\" (UniqueName: \"kubernetes.io/projected/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-kube-api-access-k5ssq\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s\" (UID: \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.691520 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s\" (UID: \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.691736 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s\" (UID: \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.794042 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5ssq\" (UniqueName: \"kubernetes.io/projected/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-kube-api-access-k5ssq\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s\" (UID: \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.794086 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s\" (UID: \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.794265 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s\" (UID: \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.798434 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s\" (UID: \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.798662 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s\" (UID: \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.809959 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5ssq\" (UniqueName: \"kubernetes.io/projected/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-kube-api-access-k5ssq\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s\" (UID: \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" Apr 06 12:31:51 crc kubenswrapper[4790]: I0406 12:31:51.981908 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" Apr 06 12:31:52 crc kubenswrapper[4790]: I0406 12:31:52.474921 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s"] Apr 06 12:31:52 crc kubenswrapper[4790]: I0406 12:31:52.558867 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" event={"ID":"822b5a1d-5bf4-4e66-87fa-20a47f8cd280","Type":"ContainerStarted","Data":"ba4d7ec4c6783045f883f6963cfc0d787ed097171c7722bbcd78f5118a8bbeeb"} Apr 06 12:31:53 crc kubenswrapper[4790]: I0406 12:31:53.577926 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" event={"ID":"822b5a1d-5bf4-4e66-87fa-20a47f8cd280","Type":"ContainerStarted","Data":"97f15ee6ed254380557aa3f061bcf15206ec7cc0e861c0f72c3d261432960ded"} Apr 06 12:31:53 crc kubenswrapper[4790]: I0406 12:31:53.601078 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" podStartSLOduration=2.011452088 podStartE2EDuration="2.601054443s" podCreationTimestamp="2026-04-06 12:31:51 +0000 UTC" firstStartedPulling="2026-04-06 12:31:52.476660535 +0000 UTC m=+2091.464403411" lastFinishedPulling="2026-04-06 12:31:53.06626291 +0000 UTC m=+2092.054005766" observedRunningTime="2026-04-06 12:31:53.595640363 +0000 UTC m=+2092.583383229" watchObservedRunningTime="2026-04-06 12:31:53.601054443 +0000 UTC m=+2092.588797309" Apr 06 12:32:00 crc kubenswrapper[4790]: I0406 12:32:00.152373 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591312-ppb7c"] Apr 06 12:32:00 crc kubenswrapper[4790]: I0406 12:32:00.154697 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591312-ppb7c" Apr 06 12:32:00 crc kubenswrapper[4790]: I0406 12:32:00.157633 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:32:00 crc kubenswrapper[4790]: I0406 12:32:00.157733 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:32:00 crc kubenswrapper[4790]: I0406 12:32:00.157811 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:32:00 crc kubenswrapper[4790]: I0406 12:32:00.169567 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591312-ppb7c"] Apr 06 12:32:00 crc kubenswrapper[4790]: I0406 12:32:00.261589 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8fj6\" (UniqueName: \"kubernetes.io/projected/6911cb4d-2dc4-40d9-8459-2b335752b5cb-kube-api-access-s8fj6\") pod \"auto-csr-approver-29591312-ppb7c\" (UID: \"6911cb4d-2dc4-40d9-8459-2b335752b5cb\") " pod="openshift-infra/auto-csr-approver-29591312-ppb7c" Apr 06 12:32:00 crc kubenswrapper[4790]: I0406 12:32:00.367466 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8fj6\" (UniqueName: \"kubernetes.io/projected/6911cb4d-2dc4-40d9-8459-2b335752b5cb-kube-api-access-s8fj6\") pod \"auto-csr-approver-29591312-ppb7c\" (UID: \"6911cb4d-2dc4-40d9-8459-2b335752b5cb\") " pod="openshift-infra/auto-csr-approver-29591312-ppb7c" Apr 06 12:32:00 crc kubenswrapper[4790]: I0406 12:32:00.394598 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8fj6\" (UniqueName: \"kubernetes.io/projected/6911cb4d-2dc4-40d9-8459-2b335752b5cb-kube-api-access-s8fj6\") pod \"auto-csr-approver-29591312-ppb7c\" (UID: \"6911cb4d-2dc4-40d9-8459-2b335752b5cb\") " 
pod="openshift-infra/auto-csr-approver-29591312-ppb7c" Apr 06 12:32:00 crc kubenswrapper[4790]: I0406 12:32:00.475232 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591312-ppb7c" Apr 06 12:32:00 crc kubenswrapper[4790]: I0406 12:32:00.960409 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591312-ppb7c"] Apr 06 12:32:00 crc kubenswrapper[4790]: I0406 12:32:00.964182 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 12:32:01 crc kubenswrapper[4790]: I0406 12:32:01.665726 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591312-ppb7c" event={"ID":"6911cb4d-2dc4-40d9-8459-2b335752b5cb","Type":"ContainerStarted","Data":"445bf28deeb74dcecb5f6934eefe58beb8138165973649c7e67f23bd0ff8bb78"} Apr 06 12:32:02 crc kubenswrapper[4790]: E0406 12:32:02.260055 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6911cb4d_2dc4_40d9_8459_2b335752b5cb.slice/crio-0c80affad409c38d4bcaff0e4a853f4847312868952ecdc69d05e2a7f3e261f5.scope\": RecentStats: unable to find data in memory cache]" Apr 06 12:32:02 crc kubenswrapper[4790]: I0406 12:32:02.681172 4790 generic.go:334] "Generic (PLEG): container finished" podID="822b5a1d-5bf4-4e66-87fa-20a47f8cd280" containerID="97f15ee6ed254380557aa3f061bcf15206ec7cc0e861c0f72c3d261432960ded" exitCode=0 Apr 06 12:32:02 crc kubenswrapper[4790]: I0406 12:32:02.681209 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" event={"ID":"822b5a1d-5bf4-4e66-87fa-20a47f8cd280","Type":"ContainerDied","Data":"97f15ee6ed254380557aa3f061bcf15206ec7cc0e861c0f72c3d261432960ded"} Apr 06 12:32:02 crc kubenswrapper[4790]: I0406 12:32:02.683286 4790 
generic.go:334] "Generic (PLEG): container finished" podID="6911cb4d-2dc4-40d9-8459-2b335752b5cb" containerID="0c80affad409c38d4bcaff0e4a853f4847312868952ecdc69d05e2a7f3e261f5" exitCode=0 Apr 06 12:32:02 crc kubenswrapper[4790]: I0406 12:32:02.683311 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591312-ppb7c" event={"ID":"6911cb4d-2dc4-40d9-8459-2b335752b5cb","Type":"ContainerDied","Data":"0c80affad409c38d4bcaff0e4a853f4847312868952ecdc69d05e2a7f3e261f5"} Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.054004 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-lsjv7"] Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.068484 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-lsjv7"] Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.181205 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591312-ppb7c" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.188628 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.247579 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8fj6\" (UniqueName: \"kubernetes.io/projected/6911cb4d-2dc4-40d9-8459-2b335752b5cb-kube-api-access-s8fj6\") pod \"6911cb4d-2dc4-40d9-8459-2b335752b5cb\" (UID: \"6911cb4d-2dc4-40d9-8459-2b335752b5cb\") " Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.247776 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-inventory\") pod \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\" (UID: \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\") " Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.250578 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-ssh-key-openstack-edpm-ipam\") pod \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\" (UID: \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\") " Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.250708 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5ssq\" (UniqueName: \"kubernetes.io/projected/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-kube-api-access-k5ssq\") pod \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\" (UID: \"822b5a1d-5bf4-4e66-87fa-20a47f8cd280\") " Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.272096 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hpx85"] Apr 06 12:32:04 crc kubenswrapper[4790]: E0406 12:32:04.272555 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6911cb4d-2dc4-40d9-8459-2b335752b5cb" containerName="oc" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.272568 4790 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="6911cb4d-2dc4-40d9-8459-2b335752b5cb" containerName="oc" Apr 06 12:32:04 crc kubenswrapper[4790]: E0406 12:32:04.272599 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822b5a1d-5bf4-4e66-87fa-20a47f8cd280" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.272606 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="822b5a1d-5bf4-4e66-87fa-20a47f8cd280" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.272795 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="822b5a1d-5bf4-4e66-87fa-20a47f8cd280" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.272860 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6911cb4d-2dc4-40d9-8459-2b335752b5cb" containerName="oc" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.274610 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hpx85" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.276208 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6911cb4d-2dc4-40d9-8459-2b335752b5cb-kube-api-access-s8fj6" (OuterVolumeSpecName: "kube-api-access-s8fj6") pod "6911cb4d-2dc4-40d9-8459-2b335752b5cb" (UID: "6911cb4d-2dc4-40d9-8459-2b335752b5cb"). InnerVolumeSpecName "kube-api-access-s8fj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.289292 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-kube-api-access-k5ssq" (OuterVolumeSpecName: "kube-api-access-k5ssq") pod "822b5a1d-5bf4-4e66-87fa-20a47f8cd280" (UID: "822b5a1d-5bf4-4e66-87fa-20a47f8cd280"). 
InnerVolumeSpecName "kube-api-access-k5ssq". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.292927 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hpx85"] Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.308811 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-inventory" (OuterVolumeSpecName: "inventory") pod "822b5a1d-5bf4-4e66-87fa-20a47f8cd280" (UID: "822b5a1d-5bf4-4e66-87fa-20a47f8cd280"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.325087 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "822b5a1d-5bf4-4e66-87fa-20a47f8cd280" (UID: "822b5a1d-5bf4-4e66-87fa-20a47f8cd280"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.355083 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqkgm\" (UniqueName: \"kubernetes.io/projected/006aab53-adef-470a-8dc8-06e0c456f70d-kube-api-access-nqkgm\") pod \"redhat-operators-hpx85\" (UID: \"006aab53-adef-470a-8dc8-06e0c456f70d\") " pod="openshift-marketplace/redhat-operators-hpx85" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.355172 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/006aab53-adef-470a-8dc8-06e0c456f70d-utilities\") pod \"redhat-operators-hpx85\" (UID: \"006aab53-adef-470a-8dc8-06e0c456f70d\") " pod="openshift-marketplace/redhat-operators-hpx85" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.355212 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/006aab53-adef-470a-8dc8-06e0c456f70d-catalog-content\") pod \"redhat-operators-hpx85\" (UID: \"006aab53-adef-470a-8dc8-06e0c456f70d\") " pod="openshift-marketplace/redhat-operators-hpx85" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.355318 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5ssq\" (UniqueName: \"kubernetes.io/projected/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-kube-api-access-k5ssq\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.355331 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8fj6\" (UniqueName: \"kubernetes.io/projected/6911cb4d-2dc4-40d9-8459-2b335752b5cb-kube-api-access-s8fj6\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.355341 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.355351 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/822b5a1d-5bf4-4e66-87fa-20a47f8cd280-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.456764 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/006aab53-adef-470a-8dc8-06e0c456f70d-utilities\") pod \"redhat-operators-hpx85\" (UID: \"006aab53-adef-470a-8dc8-06e0c456f70d\") " pod="openshift-marketplace/redhat-operators-hpx85" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.456844 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/006aab53-adef-470a-8dc8-06e0c456f70d-catalog-content\") pod \"redhat-operators-hpx85\" (UID: \"006aab53-adef-470a-8dc8-06e0c456f70d\") " pod="openshift-marketplace/redhat-operators-hpx85" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.456932 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqkgm\" (UniqueName: \"kubernetes.io/projected/006aab53-adef-470a-8dc8-06e0c456f70d-kube-api-access-nqkgm\") pod \"redhat-operators-hpx85\" (UID: \"006aab53-adef-470a-8dc8-06e0c456f70d\") " pod="openshift-marketplace/redhat-operators-hpx85" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.457461 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/006aab53-adef-470a-8dc8-06e0c456f70d-utilities\") pod \"redhat-operators-hpx85\" (UID: \"006aab53-adef-470a-8dc8-06e0c456f70d\") " pod="openshift-marketplace/redhat-operators-hpx85" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 
12:32:04.457525 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/006aab53-adef-470a-8dc8-06e0c456f70d-catalog-content\") pod \"redhat-operators-hpx85\" (UID: \"006aab53-adef-470a-8dc8-06e0c456f70d\") " pod="openshift-marketplace/redhat-operators-hpx85" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.474060 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqkgm\" (UniqueName: \"kubernetes.io/projected/006aab53-adef-470a-8dc8-06e0c456f70d-kube-api-access-nqkgm\") pod \"redhat-operators-hpx85\" (UID: \"006aab53-adef-470a-8dc8-06e0c456f70d\") " pod="openshift-marketplace/redhat-operators-hpx85" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.695399 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hpx85" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.707773 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591312-ppb7c" event={"ID":"6911cb4d-2dc4-40d9-8459-2b335752b5cb","Type":"ContainerDied","Data":"445bf28deeb74dcecb5f6934eefe58beb8138165973649c7e67f23bd0ff8bb78"} Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.707808 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="445bf28deeb74dcecb5f6934eefe58beb8138165973649c7e67f23bd0ff8bb78" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.707879 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591312-ppb7c" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.709804 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" event={"ID":"822b5a1d-5bf4-4e66-87fa-20a47f8cd280","Type":"ContainerDied","Data":"ba4d7ec4c6783045f883f6963cfc0d787ed097171c7722bbcd78f5118a8bbeeb"} Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.709869 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba4d7ec4c6783045f883f6963cfc0d787ed097171c7722bbcd78f5118a8bbeeb" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.709939 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.798810 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"] Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.800148 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.803752 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.804012 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.804186 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.804241 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.804429 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.804617 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.804778 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.804801 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.809648 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"] Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.865292 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.865488 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.865519 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.865544 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.865595 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.865618 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.865657 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.865684 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.865715 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.865734 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.865774 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.865791 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.865843 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9btpf\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-kube-api-access-9btpf\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.865862 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.967654 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.967712 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.967747 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.967769 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.967809 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.967861 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.967904 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9btpf\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-kube-api-access-9btpf\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.967924 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.967963 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.967985 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.968012 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.968034 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.968074 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.968095 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.974974 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.975551 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.975813 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.975937 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.976248 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.976511 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.976549 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.977302 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.977596 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.977719 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.978674 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.979657 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.979732 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:04 crc kubenswrapper[4790]: I0406 12:32:04.986011 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9btpf\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-kube-api-access-9btpf\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:05 crc kubenswrapper[4790]: I0406 12:32:05.134121 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:05 crc kubenswrapper[4790]: I0406 12:32:05.221155 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hpx85"]
Apr 06 12:32:05 crc kubenswrapper[4790]: I0406 12:32:05.268882 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591306-khl4s"]
Apr 06 12:32:05 crc kubenswrapper[4790]: I0406 12:32:05.278372 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591306-khl4s"]
Apr 06 12:32:05 crc kubenswrapper[4790]: I0406 12:32:05.688369 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5245510f-0d96-4f12-95ea-d65141bdd2e0" path="/var/lib/kubelet/pods/5245510f-0d96-4f12-95ea-d65141bdd2e0/volumes"
Apr 06 12:32:05 crc kubenswrapper[4790]: I0406 12:32:05.689595 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f08c5050-9091-47f9-8135-daee3777de99" path="/var/lib/kubelet/pods/f08c5050-9091-47f9-8135-daee3777de99/volumes"
Apr 06 12:32:05 crc kubenswrapper[4790]: I0406 12:32:05.729730 4790 generic.go:334] "Generic (PLEG): container finished" podID="006aab53-adef-470a-8dc8-06e0c456f70d" containerID="9696022e10eb19b7e7c752213da43a825ce02c68c9e03dcc2546bdb8501f0af0" exitCode=0
Apr 06 12:32:05 crc kubenswrapper[4790]: I0406 12:32:05.729772 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpx85" event={"ID":"006aab53-adef-470a-8dc8-06e0c456f70d","Type":"ContainerDied","Data":"9696022e10eb19b7e7c752213da43a825ce02c68c9e03dcc2546bdb8501f0af0"}
Apr 06 12:32:05 crc kubenswrapper[4790]: I0406 12:32:05.729796 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpx85" event={"ID":"006aab53-adef-470a-8dc8-06e0c456f70d","Type":"ContainerStarted","Data":"8ef8989a9ae71a095e938f7d362cab37033bd8904df4970b4618530e65fbb0d2"}
Apr 06 12:32:05 crc kubenswrapper[4790]: W0406 12:32:05.851991 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb888063_8b8f_43f5_ba22_d7fc374a1bbf.slice/crio-f8369969ba8b3a9d4aa8bd2994f8837dfc92a138a53891fb044d0eb252730b94 WatchSource:0}: Error finding container f8369969ba8b3a9d4aa8bd2994f8837dfc92a138a53891fb044d0eb252730b94: Status 404 returned error can't find the container with id f8369969ba8b3a9d4aa8bd2994f8837dfc92a138a53891fb044d0eb252730b94
Apr 06 12:32:05 crc kubenswrapper[4790]: I0406 12:32:05.852940 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"]
Apr 06 12:32:06 crc kubenswrapper[4790]: I0406 12:32:06.745555 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpx85" event={"ID":"006aab53-adef-470a-8dc8-06e0c456f70d","Type":"ContainerStarted","Data":"0f4b1be3d4cc6e14e4d2b960544289d7bdb6d325056783bb370ee92b73d9984c"}
Apr 06 12:32:06 crc kubenswrapper[4790]: I0406 12:32:06.749899 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9" event={"ID":"eb888063-8b8f-43f5-ba22-d7fc374a1bbf","Type":"ContainerStarted","Data":"30532ae6f3ec152263a467eb66ab1384a6f578c30bb13a673e51d6520e9490bd"}
Apr 06 12:32:06 crc kubenswrapper[4790]: I0406 12:32:06.749980 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9" event={"ID":"eb888063-8b8f-43f5-ba22-d7fc374a1bbf","Type":"ContainerStarted","Data":"f8369969ba8b3a9d4aa8bd2994f8837dfc92a138a53891fb044d0eb252730b94"}
Apr 06 12:32:06 crc kubenswrapper[4790]: I0406 12:32:06.793299 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9" podStartSLOduration=2.358403605 podStartE2EDuration="2.793275228s" podCreationTimestamp="2026-04-06 12:32:04 +0000 UTC" firstStartedPulling="2026-04-06 12:32:05.854506232 +0000 UTC m=+2104.842249098" lastFinishedPulling="2026-04-06 12:32:06.289377815 +0000 UTC m=+2105.277120721" observedRunningTime="2026-04-06 12:32:06.786139473 +0000 UTC m=+2105.773882339" watchObservedRunningTime="2026-04-06 12:32:06.793275228 +0000 UTC m=+2105.781018094"
Apr 06 12:32:08 crc kubenswrapper[4790]: I0406 12:32:08.773192 4790 generic.go:334] "Generic (PLEG): container finished" podID="006aab53-adef-470a-8dc8-06e0c456f70d" containerID="0f4b1be3d4cc6e14e4d2b960544289d7bdb6d325056783bb370ee92b73d9984c" exitCode=0
Apr 06 12:32:08 crc kubenswrapper[4790]: I0406 12:32:08.773282 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpx85" event={"ID":"006aab53-adef-470a-8dc8-06e0c456f70d","Type":"ContainerDied","Data":"0f4b1be3d4cc6e14e4d2b960544289d7bdb6d325056783bb370ee92b73d9984c"}
Apr 06 12:32:09 crc kubenswrapper[4790]: I0406 12:32:09.784250 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpx85" event={"ID":"006aab53-adef-470a-8dc8-06e0c456f70d","Type":"ContainerStarted","Data":"76b11ddbb3d722831f2b7e01c7e92379295af7d2ad8828daa6db9c205f2d1ff7"}
Apr 06 12:32:09 crc kubenswrapper[4790]: I0406 12:32:09.808220 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hpx85" podStartSLOduration=2.347458159 podStartE2EDuration="5.808195244s" podCreationTimestamp="2026-04-06 12:32:04 +0000 UTC" firstStartedPulling="2026-04-06 12:32:05.731619476 +0000 UTC m=+2104.719362342" lastFinishedPulling="2026-04-06 12:32:09.192356561 +0000 UTC m=+2108.180099427" observedRunningTime="2026-04-06 12:32:09.800433953 +0000 UTC m=+2108.788176819" watchObservedRunningTime="2026-04-06 12:32:09.808195244 +0000 UTC m=+2108.795938110"
Apr 06 12:32:14 crc kubenswrapper[4790]: I0406 12:32:14.696398 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hpx85"
Apr 06 12:32:14 crc kubenswrapper[4790]: I0406 12:32:14.696944 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hpx85"
Apr 06 12:32:15 crc kubenswrapper[4790]: I0406 12:32:15.041902 4790 scope.go:117] "RemoveContainer" containerID="37d97592f7406c9d4fc98603e8b17536cd78e6d94bea0809d43be773c95b9706"
Apr 06 12:32:15 crc kubenswrapper[4790]: I0406 12:32:15.090274 4790 scope.go:117] "RemoveContainer" containerID="8bb6d004bb3e2a1d714b20f740824914099e2ff479a2ae3efb76a449c125e02a"
Apr 06 12:32:15 crc kubenswrapper[4790]: I0406 12:32:15.166658 4790 scope.go:117] "RemoveContainer" containerID="f73fd47ecb6bb249b4c9063e0e5e2661ae112feb32e11555a04df2d7062365a4"
Apr 06 12:32:15 crc kubenswrapper[4790]: I0406 12:32:15.216878 4790 scope.go:117] "RemoveContainer" containerID="d72a2a5c5edc7178eab28e4530d84fed627921c7f7e2ec5a88e9fe84c51aaa26"
Apr 06 12:32:15 crc kubenswrapper[4790]: I0406 12:32:15.755270 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hpx85" podUID="006aab53-adef-470a-8dc8-06e0c456f70d" containerName="registry-server" probeResult="failure" output=<
Apr 06 12:32:15 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s
Apr 06 12:32:15 crc kubenswrapper[4790]: >
Apr 06 12:32:24 crc kubenswrapper[4790]: I0406 12:32:24.753174 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hpx85"
Apr 06 12:32:24 crc kubenswrapper[4790]: I0406 12:32:24.807285 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hpx85"
Apr 06 12:32:24 crc kubenswrapper[4790]: I0406 12:32:24.990775 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hpx85"]
Apr 06 12:32:25 crc kubenswrapper[4790]: I0406 12:32:25.939942 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hpx85" podUID="006aab53-adef-470a-8dc8-06e0c456f70d" containerName="registry-server" containerID="cri-o://76b11ddbb3d722831f2b7e01c7e92379295af7d2ad8828daa6db9c205f2d1ff7" gracePeriod=2
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.400033 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hpx85"
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.605968 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqkgm\" (UniqueName: \"kubernetes.io/projected/006aab53-adef-470a-8dc8-06e0c456f70d-kube-api-access-nqkgm\") pod \"006aab53-adef-470a-8dc8-06e0c456f70d\" (UID: \"006aab53-adef-470a-8dc8-06e0c456f70d\") "
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.606311 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/006aab53-adef-470a-8dc8-06e0c456f70d-utilities\") pod \"006aab53-adef-470a-8dc8-06e0c456f70d\" (UID: \"006aab53-adef-470a-8dc8-06e0c456f70d\") "
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.606435 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/006aab53-adef-470a-8dc8-06e0c456f70d-catalog-content\") pod \"006aab53-adef-470a-8dc8-06e0c456f70d\" (UID: \"006aab53-adef-470a-8dc8-06e0c456f70d\") "
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.608749 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/006aab53-adef-470a-8dc8-06e0c456f70d-utilities" (OuterVolumeSpecName: "utilities") pod "006aab53-adef-470a-8dc8-06e0c456f70d" (UID: "006aab53-adef-470a-8dc8-06e0c456f70d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.625207 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/006aab53-adef-470a-8dc8-06e0c456f70d-kube-api-access-nqkgm" (OuterVolumeSpecName: "kube-api-access-nqkgm") pod "006aab53-adef-470a-8dc8-06e0c456f70d" (UID: "006aab53-adef-470a-8dc8-06e0c456f70d"). InnerVolumeSpecName "kube-api-access-nqkgm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.709331 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqkgm\" (UniqueName: \"kubernetes.io/projected/006aab53-adef-470a-8dc8-06e0c456f70d-kube-api-access-nqkgm\") on node \"crc\" DevicePath \"\""
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.709364 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/006aab53-adef-470a-8dc8-06e0c456f70d-utilities\") on node \"crc\" DevicePath \"\""
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.758568 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/006aab53-adef-470a-8dc8-06e0c456f70d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "006aab53-adef-470a-8dc8-06e0c456f70d" (UID: "006aab53-adef-470a-8dc8-06e0c456f70d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.811144 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/006aab53-adef-470a-8dc8-06e0c456f70d-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.951078 4790 generic.go:334] "Generic (PLEG): container finished" podID="006aab53-adef-470a-8dc8-06e0c456f70d" containerID="76b11ddbb3d722831f2b7e01c7e92379295af7d2ad8828daa6db9c205f2d1ff7" exitCode=0
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.951128 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpx85" event={"ID":"006aab53-adef-470a-8dc8-06e0c456f70d","Type":"ContainerDied","Data":"76b11ddbb3d722831f2b7e01c7e92379295af7d2ad8828daa6db9c205f2d1ff7"}
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.951167 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpx85" event={"ID":"006aab53-adef-470a-8dc8-06e0c456f70d","Type":"ContainerDied","Data":"8ef8989a9ae71a095e938f7d362cab37033bd8904df4970b4618530e65fbb0d2"}
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.951202 4790 scope.go:117] "RemoveContainer" containerID="76b11ddbb3d722831f2b7e01c7e92379295af7d2ad8828daa6db9c205f2d1ff7"
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.951166 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hpx85"
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.972119 4790 scope.go:117] "RemoveContainer" containerID="0f4b1be3d4cc6e14e4d2b960544289d7bdb6d325056783bb370ee92b73d9984c"
Apr 06 12:32:26 crc kubenswrapper[4790]: I0406 12:32:26.994862 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hpx85"]
Apr 06 12:32:27 crc kubenswrapper[4790]: I0406 12:32:27.009667 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hpx85"]
Apr 06 12:32:27 crc kubenswrapper[4790]: I0406 12:32:27.009751 4790 scope.go:117] "RemoveContainer" containerID="9696022e10eb19b7e7c752213da43a825ce02c68c9e03dcc2546bdb8501f0af0"
Apr 06 12:32:27 crc kubenswrapper[4790]: I0406 12:32:27.048110 4790 scope.go:117] "RemoveContainer" containerID="76b11ddbb3d722831f2b7e01c7e92379295af7d2ad8828daa6db9c205f2d1ff7"
Apr 06 12:32:27 crc kubenswrapper[4790]: E0406 12:32:27.048607 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76b11ddbb3d722831f2b7e01c7e92379295af7d2ad8828daa6db9c205f2d1ff7\": container with ID starting with 76b11ddbb3d722831f2b7e01c7e92379295af7d2ad8828daa6db9c205f2d1ff7 not found: ID does not exist" containerID="76b11ddbb3d722831f2b7e01c7e92379295af7d2ad8828daa6db9c205f2d1ff7"
Apr 06 12:32:27 crc kubenswrapper[4790]: I0406 12:32:27.048672 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76b11ddbb3d722831f2b7e01c7e92379295af7d2ad8828daa6db9c205f2d1ff7"} err="failed to get container status \"76b11ddbb3d722831f2b7e01c7e92379295af7d2ad8828daa6db9c205f2d1ff7\": rpc error: code = NotFound desc = could not find container \"76b11ddbb3d722831f2b7e01c7e92379295af7d2ad8828daa6db9c205f2d1ff7\": container with ID starting with 76b11ddbb3d722831f2b7e01c7e92379295af7d2ad8828daa6db9c205f2d1ff7 not found: ID does not exist"
Apr 06 12:32:27 crc kubenswrapper[4790]: I0406 12:32:27.048704 4790 scope.go:117] "RemoveContainer" containerID="0f4b1be3d4cc6e14e4d2b960544289d7bdb6d325056783bb370ee92b73d9984c"
Apr 06 12:32:27 crc kubenswrapper[4790]: E0406 12:32:27.048971 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4b1be3d4cc6e14e4d2b960544289d7bdb6d325056783bb370ee92b73d9984c\": container with ID starting with 0f4b1be3d4cc6e14e4d2b960544289d7bdb6d325056783bb370ee92b73d9984c not found: ID does not exist" containerID="0f4b1be3d4cc6e14e4d2b960544289d7bdb6d325056783bb370ee92b73d9984c"
Apr 06 12:32:27 crc kubenswrapper[4790]: I0406 12:32:27.049012 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4b1be3d4cc6e14e4d2b960544289d7bdb6d325056783bb370ee92b73d9984c"} err="failed to get container status \"0f4b1be3d4cc6e14e4d2b960544289d7bdb6d325056783bb370ee92b73d9984c\": rpc error: code = NotFound desc = could not find container \"0f4b1be3d4cc6e14e4d2b960544289d7bdb6d325056783bb370ee92b73d9984c\": container with ID starting with 0f4b1be3d4cc6e14e4d2b960544289d7bdb6d325056783bb370ee92b73d9984c not found: ID does not exist"
Apr 06 12:32:27 crc kubenswrapper[4790]: I0406 12:32:27.049032 4790 scope.go:117] "RemoveContainer" containerID="9696022e10eb19b7e7c752213da43a825ce02c68c9e03dcc2546bdb8501f0af0"
Apr 06 12:32:27 crc kubenswrapper[4790]: E0406 12:32:27.049353 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9696022e10eb19b7e7c752213da43a825ce02c68c9e03dcc2546bdb8501f0af0\": container with ID starting with 9696022e10eb19b7e7c752213da43a825ce02c68c9e03dcc2546bdb8501f0af0 not found: ID does not exist" containerID="9696022e10eb19b7e7c752213da43a825ce02c68c9e03dcc2546bdb8501f0af0"
Apr 06 12:32:27 crc kubenswrapper[4790]: I0406 12:32:27.049381 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9696022e10eb19b7e7c752213da43a825ce02c68c9e03dcc2546bdb8501f0af0"} err="failed to get container status \"9696022e10eb19b7e7c752213da43a825ce02c68c9e03dcc2546bdb8501f0af0\": rpc error: code = NotFound desc = could not find container \"9696022e10eb19b7e7c752213da43a825ce02c68c9e03dcc2546bdb8501f0af0\": container with ID starting with 9696022e10eb19b7e7c752213da43a825ce02c68c9e03dcc2546bdb8501f0af0 not found: ID does not exist"
Apr 06 12:32:27 crc kubenswrapper[4790]: I0406 12:32:27.686630 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="006aab53-adef-470a-8dc8-06e0c456f70d" path="/var/lib/kubelet/pods/006aab53-adef-470a-8dc8-06e0c456f70d/volumes"
Apr 06 12:32:43 crc kubenswrapper[4790]: I0406 12:32:43.121400 4790 generic.go:334] "Generic (PLEG): container finished" podID="eb888063-8b8f-43f5-ba22-d7fc374a1bbf" containerID="30532ae6f3ec152263a467eb66ab1384a6f578c30bb13a673e51d6520e9490bd" exitCode=0
Apr 06 12:32:43 crc kubenswrapper[4790]: I0406 12:32:43.122008 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9" event={"ID":"eb888063-8b8f-43f5-ba22-d7fc374a1bbf","Type":"ContainerDied","Data":"30532ae6f3ec152263a467eb66ab1384a6f578c30bb13a673e51d6520e9490bd"}
Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.558808 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9"
Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.713756 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") "
Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.713853 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-telemetry-combined-ca-bundle\") pod \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") "
Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.713933 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-inventory\") pod \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") "
Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.714365 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-ovn-default-certs-0\") pod \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") "
Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.714450 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-nova-combined-ca-bundle\") pod \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\" (UID: 
\"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.714498 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.714522 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9btpf\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-kube-api-access-9btpf\") pod \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.714595 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-ovn-combined-ca-bundle\") pod \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.714638 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.714654 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-ssh-key-openstack-edpm-ipam\") pod \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " 
Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.714730 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-neutron-metadata-combined-ca-bundle\") pod \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.714784 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-bootstrap-combined-ca-bundle\") pod \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.714814 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-repo-setup-combined-ca-bundle\") pod \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.714917 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-libvirt-combined-ca-bundle\") pod \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\" (UID: \"eb888063-8b8f-43f5-ba22-d7fc374a1bbf\") " Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.719162 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "eb888063-8b8f-43f5-ba22-d7fc374a1bbf" (UID: "eb888063-8b8f-43f5-ba22-d7fc374a1bbf"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.719600 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-kube-api-access-9btpf" (OuterVolumeSpecName: "kube-api-access-9btpf") pod "eb888063-8b8f-43f5-ba22-d7fc374a1bbf" (UID: "eb888063-8b8f-43f5-ba22-d7fc374a1bbf"). InnerVolumeSpecName "kube-api-access-9btpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.720573 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "eb888063-8b8f-43f5-ba22-d7fc374a1bbf" (UID: "eb888063-8b8f-43f5-ba22-d7fc374a1bbf"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.721359 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "eb888063-8b8f-43f5-ba22-d7fc374a1bbf" (UID: "eb888063-8b8f-43f5-ba22-d7fc374a1bbf"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.722030 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "eb888063-8b8f-43f5-ba22-d7fc374a1bbf" (UID: "eb888063-8b8f-43f5-ba22-d7fc374a1bbf"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.722358 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "eb888063-8b8f-43f5-ba22-d7fc374a1bbf" (UID: "eb888063-8b8f-43f5-ba22-d7fc374a1bbf"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.722363 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "eb888063-8b8f-43f5-ba22-d7fc374a1bbf" (UID: "eb888063-8b8f-43f5-ba22-d7fc374a1bbf"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.723118 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "eb888063-8b8f-43f5-ba22-d7fc374a1bbf" (UID: "eb888063-8b8f-43f5-ba22-d7fc374a1bbf"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.724441 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "eb888063-8b8f-43f5-ba22-d7fc374a1bbf" (UID: "eb888063-8b8f-43f5-ba22-d7fc374a1bbf"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.726817 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "eb888063-8b8f-43f5-ba22-d7fc374a1bbf" (UID: "eb888063-8b8f-43f5-ba22-d7fc374a1bbf"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.728067 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "eb888063-8b8f-43f5-ba22-d7fc374a1bbf" (UID: "eb888063-8b8f-43f5-ba22-d7fc374a1bbf"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.738186 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "eb888063-8b8f-43f5-ba22-d7fc374a1bbf" (UID: "eb888063-8b8f-43f5-ba22-d7fc374a1bbf"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.750016 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-inventory" (OuterVolumeSpecName: "inventory") pod "eb888063-8b8f-43f5-ba22-d7fc374a1bbf" (UID: "eb888063-8b8f-43f5-ba22-d7fc374a1bbf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.754507 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eb888063-8b8f-43f5-ba22-d7fc374a1bbf" (UID: "eb888063-8b8f-43f5-ba22-d7fc374a1bbf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.816566 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.816612 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.816631 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.816645 4790 reconciler_common.go:293] "Volume detached for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.816658 4790 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.816667 4790 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.816678 4790 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.816686 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.816695 4790 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.816704 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 
12:32:44.816711 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.816719 4790 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.816730 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:44 crc kubenswrapper[4790]: I0406 12:32:44.816740 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9btpf\" (UniqueName: \"kubernetes.io/projected/eb888063-8b8f-43f5-ba22-d7fc374a1bbf-kube-api-access-9btpf\") on node \"crc\" DevicePath \"\"" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.141534 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9" event={"ID":"eb888063-8b8f-43f5-ba22-d7fc374a1bbf","Type":"ContainerDied","Data":"f8369969ba8b3a9d4aa8bd2994f8837dfc92a138a53891fb044d0eb252730b94"} Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.141581 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8369969ba8b3a9d4aa8bd2994f8837dfc92a138a53891fb044d0eb252730b94" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.141994 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.257924 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5"] Apr 06 12:32:45 crc kubenswrapper[4790]: E0406 12:32:45.258659 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb888063-8b8f-43f5-ba22-d7fc374a1bbf" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.258754 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb888063-8b8f-43f5-ba22-d7fc374a1bbf" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Apr 06 12:32:45 crc kubenswrapper[4790]: E0406 12:32:45.258900 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="006aab53-adef-470a-8dc8-06e0c456f70d" containerName="extract-utilities" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.258979 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="006aab53-adef-470a-8dc8-06e0c456f70d" containerName="extract-utilities" Apr 06 12:32:45 crc kubenswrapper[4790]: E0406 12:32:45.259068 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="006aab53-adef-470a-8dc8-06e0c456f70d" containerName="extract-content" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.259700 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="006aab53-adef-470a-8dc8-06e0c456f70d" containerName="extract-content" Apr 06 12:32:45 crc kubenswrapper[4790]: E0406 12:32:45.259802 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="006aab53-adef-470a-8dc8-06e0c456f70d" containerName="registry-server" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.259894 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="006aab53-adef-470a-8dc8-06e0c456f70d" containerName="registry-server" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.260260 
4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="006aab53-adef-470a-8dc8-06e0c456f70d" containerName="registry-server" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.260365 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb888063-8b8f-43f5-ba22-d7fc374a1bbf" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.261311 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.264135 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.264343 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.268532 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.269047 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.269427 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.284104 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5"] Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.326617 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxvnc\" (UniqueName: \"kubernetes.io/projected/7809a0fa-81df-4e08-8a8f-84e070582795-kube-api-access-kxvnc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: 
\"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.326680 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.326704 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.326971 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.327123 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7809a0fa-81df-4e08-8a8f-84e070582795-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.428518 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.428872 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7809a0fa-81df-4e08-8a8f-84e070582795-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.428937 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxvnc\" (UniqueName: \"kubernetes.io/projected/7809a0fa-81df-4e08-8a8f-84e070582795-kube-api-access-kxvnc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.428989 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.429024 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: 
\"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.429865 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7809a0fa-81df-4e08-8a8f-84e070582795-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.434747 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.434920 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.438974 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.448746 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxvnc\" (UniqueName: 
\"kubernetes.io/projected/7809a0fa-81df-4e08-8a8f-84e070582795-kube-api-access-kxvnc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-nbzm5\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:45 crc kubenswrapper[4790]: I0406 12:32:45.589361 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:32:46 crc kubenswrapper[4790]: I0406 12:32:46.094639 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5"] Apr 06 12:32:46 crc kubenswrapper[4790]: I0406 12:32:46.150853 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" event={"ID":"7809a0fa-81df-4e08-8a8f-84e070582795","Type":"ContainerStarted","Data":"4acba59f9007de9dd1824a559f9954450d1cf5747d7502889785694fab620db0"} Apr 06 12:32:47 crc kubenswrapper[4790]: I0406 12:32:47.162155 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" event={"ID":"7809a0fa-81df-4e08-8a8f-84e070582795","Type":"ContainerStarted","Data":"4835e4df5a008000ec45e6ecce6c3105c6cc2e9affb95184aa4f132f04b0130b"} Apr 06 12:32:47 crc kubenswrapper[4790]: I0406 12:32:47.195782 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" podStartSLOduration=1.829243309 podStartE2EDuration="2.195764829s" podCreationTimestamp="2026-04-06 12:32:45 +0000 UTC" firstStartedPulling="2026-04-06 12:32:46.115643479 +0000 UTC m=+2145.103386345" lastFinishedPulling="2026-04-06 12:32:46.482164999 +0000 UTC m=+2145.469907865" observedRunningTime="2026-04-06 12:32:47.190931082 +0000 UTC m=+2146.178673948" watchObservedRunningTime="2026-04-06 12:32:47.195764829 +0000 UTC m=+2146.183507695" Apr 06 12:33:05 crc kubenswrapper[4790]: I0406 
12:33:05.998178 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qb6t9"] Apr 06 12:33:06 crc kubenswrapper[4790]: I0406 12:33:06.001509 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:06 crc kubenswrapper[4790]: I0406 12:33:06.010553 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qb6t9"] Apr 06 12:33:06 crc kubenswrapper[4790]: I0406 12:33:06.078351 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f4e993c-c669-4a24-9745-397c723c1fa3-catalog-content\") pod \"community-operators-qb6t9\" (UID: \"2f4e993c-c669-4a24-9745-397c723c1fa3\") " pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:06 crc kubenswrapper[4790]: I0406 12:33:06.078564 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f4e993c-c669-4a24-9745-397c723c1fa3-utilities\") pod \"community-operators-qb6t9\" (UID: \"2f4e993c-c669-4a24-9745-397c723c1fa3\") " pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:06 crc kubenswrapper[4790]: I0406 12:33:06.078667 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlkr4\" (UniqueName: \"kubernetes.io/projected/2f4e993c-c669-4a24-9745-397c723c1fa3-kube-api-access-mlkr4\") pod \"community-operators-qb6t9\" (UID: \"2f4e993c-c669-4a24-9745-397c723c1fa3\") " pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:06 crc kubenswrapper[4790]: I0406 12:33:06.180396 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f4e993c-c669-4a24-9745-397c723c1fa3-utilities\") pod 
\"community-operators-qb6t9\" (UID: \"2f4e993c-c669-4a24-9745-397c723c1fa3\") " pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:06 crc kubenswrapper[4790]: I0406 12:33:06.180505 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlkr4\" (UniqueName: \"kubernetes.io/projected/2f4e993c-c669-4a24-9745-397c723c1fa3-kube-api-access-mlkr4\") pod \"community-operators-qb6t9\" (UID: \"2f4e993c-c669-4a24-9745-397c723c1fa3\") " pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:06 crc kubenswrapper[4790]: I0406 12:33:06.180558 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f4e993c-c669-4a24-9745-397c723c1fa3-catalog-content\") pod \"community-operators-qb6t9\" (UID: \"2f4e993c-c669-4a24-9745-397c723c1fa3\") " pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:06 crc kubenswrapper[4790]: I0406 12:33:06.180906 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f4e993c-c669-4a24-9745-397c723c1fa3-utilities\") pod \"community-operators-qb6t9\" (UID: \"2f4e993c-c669-4a24-9745-397c723c1fa3\") " pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:06 crc kubenswrapper[4790]: I0406 12:33:06.181038 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f4e993c-c669-4a24-9745-397c723c1fa3-catalog-content\") pod \"community-operators-qb6t9\" (UID: \"2f4e993c-c669-4a24-9745-397c723c1fa3\") " pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:06 crc kubenswrapper[4790]: I0406 12:33:06.200664 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlkr4\" (UniqueName: \"kubernetes.io/projected/2f4e993c-c669-4a24-9745-397c723c1fa3-kube-api-access-mlkr4\") pod 
\"community-operators-qb6t9\" (UID: \"2f4e993c-c669-4a24-9745-397c723c1fa3\") " pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:06 crc kubenswrapper[4790]: I0406 12:33:06.330018 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:06 crc kubenswrapper[4790]: I0406 12:33:06.947741 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qb6t9"] Apr 06 12:33:07 crc kubenswrapper[4790]: I0406 12:33:07.330455 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb6t9" event={"ID":"2f4e993c-c669-4a24-9745-397c723c1fa3","Type":"ContainerStarted","Data":"2ecd5d6fa742f252c0b2f077fa1493aed680255311d7e68811b07b6e6ed406b3"} Apr 06 12:33:08 crc kubenswrapper[4790]: I0406 12:33:08.351880 4790 generic.go:334] "Generic (PLEG): container finished" podID="2f4e993c-c669-4a24-9745-397c723c1fa3" containerID="892aaf857dee40cf5155c60e320e1b3b8140b904e9b09b472b9b515d0dd9ae43" exitCode=0 Apr 06 12:33:08 crc kubenswrapper[4790]: I0406 12:33:08.351952 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb6t9" event={"ID":"2f4e993c-c669-4a24-9745-397c723c1fa3","Type":"ContainerDied","Data":"892aaf857dee40cf5155c60e320e1b3b8140b904e9b09b472b9b515d0dd9ae43"} Apr 06 12:33:09 crc kubenswrapper[4790]: I0406 12:33:09.361760 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb6t9" event={"ID":"2f4e993c-c669-4a24-9745-397c723c1fa3","Type":"ContainerStarted","Data":"2a10d506eea8dee409a57244455f3da806e9d6afbf50e53bc9814cc29ff7651e"} Apr 06 12:33:09 crc kubenswrapper[4790]: I0406 12:33:09.753171 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:33:09 crc kubenswrapper[4790]: I0406 12:33:09.753254 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:33:10 crc kubenswrapper[4790]: I0406 12:33:10.374606 4790 generic.go:334] "Generic (PLEG): container finished" podID="2f4e993c-c669-4a24-9745-397c723c1fa3" containerID="2a10d506eea8dee409a57244455f3da806e9d6afbf50e53bc9814cc29ff7651e" exitCode=0 Apr 06 12:33:10 crc kubenswrapper[4790]: I0406 12:33:10.374655 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb6t9" event={"ID":"2f4e993c-c669-4a24-9745-397c723c1fa3","Type":"ContainerDied","Data":"2a10d506eea8dee409a57244455f3da806e9d6afbf50e53bc9814cc29ff7651e"} Apr 06 12:33:11 crc kubenswrapper[4790]: I0406 12:33:11.389061 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb6t9" event={"ID":"2f4e993c-c669-4a24-9745-397c723c1fa3","Type":"ContainerStarted","Data":"6cc08f81af719c5e9b163a419f5ee98ad479ed50d7b5edd00dc338ded367935a"} Apr 06 12:33:16 crc kubenswrapper[4790]: I0406 12:33:16.330820 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:16 crc kubenswrapper[4790]: I0406 12:33:16.331413 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:16 crc kubenswrapper[4790]: I0406 12:33:16.378547 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:16 crc 
kubenswrapper[4790]: I0406 12:33:16.403659 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qb6t9" podStartSLOduration=9.023281182 podStartE2EDuration="11.403639608s" podCreationTimestamp="2026-04-06 12:33:05 +0000 UTC" firstStartedPulling="2026-04-06 12:33:08.353920783 +0000 UTC m=+2167.341663649" lastFinishedPulling="2026-04-06 12:33:10.734279209 +0000 UTC m=+2169.722022075" observedRunningTime="2026-04-06 12:33:11.411361154 +0000 UTC m=+2170.399104030" watchObservedRunningTime="2026-04-06 12:33:16.403639608 +0000 UTC m=+2175.391382474" Apr 06 12:33:16 crc kubenswrapper[4790]: I0406 12:33:16.479942 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:16 crc kubenswrapper[4790]: I0406 12:33:16.619932 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qb6t9"] Apr 06 12:33:18 crc kubenswrapper[4790]: I0406 12:33:18.454817 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qb6t9" podUID="2f4e993c-c669-4a24-9745-397c723c1fa3" containerName="registry-server" containerID="cri-o://6cc08f81af719c5e9b163a419f5ee98ad479ed50d7b5edd00dc338ded367935a" gracePeriod=2 Apr 06 12:33:18 crc kubenswrapper[4790]: I0406 12:33:18.937350 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.047072 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f4e993c-c669-4a24-9745-397c723c1fa3-utilities\") pod \"2f4e993c-c669-4a24-9745-397c723c1fa3\" (UID: \"2f4e993c-c669-4a24-9745-397c723c1fa3\") " Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.047300 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlkr4\" (UniqueName: \"kubernetes.io/projected/2f4e993c-c669-4a24-9745-397c723c1fa3-kube-api-access-mlkr4\") pod \"2f4e993c-c669-4a24-9745-397c723c1fa3\" (UID: \"2f4e993c-c669-4a24-9745-397c723c1fa3\") " Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.047343 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f4e993c-c669-4a24-9745-397c723c1fa3-catalog-content\") pod \"2f4e993c-c669-4a24-9745-397c723c1fa3\" (UID: \"2f4e993c-c669-4a24-9745-397c723c1fa3\") " Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.048271 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f4e993c-c669-4a24-9745-397c723c1fa3-utilities" (OuterVolumeSpecName: "utilities") pod "2f4e993c-c669-4a24-9745-397c723c1fa3" (UID: "2f4e993c-c669-4a24-9745-397c723c1fa3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.062971 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f4e993c-c669-4a24-9745-397c723c1fa3-kube-api-access-mlkr4" (OuterVolumeSpecName: "kube-api-access-mlkr4") pod "2f4e993c-c669-4a24-9745-397c723c1fa3" (UID: "2f4e993c-c669-4a24-9745-397c723c1fa3"). InnerVolumeSpecName "kube-api-access-mlkr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.101000 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f4e993c-c669-4a24-9745-397c723c1fa3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f4e993c-c669-4a24-9745-397c723c1fa3" (UID: "2f4e993c-c669-4a24-9745-397c723c1fa3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.150030 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlkr4\" (UniqueName: \"kubernetes.io/projected/2f4e993c-c669-4a24-9745-397c723c1fa3-kube-api-access-mlkr4\") on node \"crc\" DevicePath \"\"" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.150082 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f4e993c-c669-4a24-9745-397c723c1fa3-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.150091 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f4e993c-c669-4a24-9745-397c723c1fa3-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.464951 4790 generic.go:334] "Generic (PLEG): container finished" podID="2f4e993c-c669-4a24-9745-397c723c1fa3" containerID="6cc08f81af719c5e9b163a419f5ee98ad479ed50d7b5edd00dc338ded367935a" exitCode=0 Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.465009 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qb6t9" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.465024 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb6t9" event={"ID":"2f4e993c-c669-4a24-9745-397c723c1fa3","Type":"ContainerDied","Data":"6cc08f81af719c5e9b163a419f5ee98ad479ed50d7b5edd00dc338ded367935a"} Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.465148 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qb6t9" event={"ID":"2f4e993c-c669-4a24-9745-397c723c1fa3","Type":"ContainerDied","Data":"2ecd5d6fa742f252c0b2f077fa1493aed680255311d7e68811b07b6e6ed406b3"} Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.465189 4790 scope.go:117] "RemoveContainer" containerID="6cc08f81af719c5e9b163a419f5ee98ad479ed50d7b5edd00dc338ded367935a" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.486131 4790 scope.go:117] "RemoveContainer" containerID="2a10d506eea8dee409a57244455f3da806e9d6afbf50e53bc9814cc29ff7651e" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.508919 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qb6t9"] Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.518785 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qb6t9"] Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.535531 4790 scope.go:117] "RemoveContainer" containerID="892aaf857dee40cf5155c60e320e1b3b8140b904e9b09b472b9b515d0dd9ae43" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.565237 4790 scope.go:117] "RemoveContainer" containerID="6cc08f81af719c5e9b163a419f5ee98ad479ed50d7b5edd00dc338ded367935a" Apr 06 12:33:19 crc kubenswrapper[4790]: E0406 12:33:19.565749 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6cc08f81af719c5e9b163a419f5ee98ad479ed50d7b5edd00dc338ded367935a\": container with ID starting with 6cc08f81af719c5e9b163a419f5ee98ad479ed50d7b5edd00dc338ded367935a not found: ID does not exist" containerID="6cc08f81af719c5e9b163a419f5ee98ad479ed50d7b5edd00dc338ded367935a" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.565793 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cc08f81af719c5e9b163a419f5ee98ad479ed50d7b5edd00dc338ded367935a"} err="failed to get container status \"6cc08f81af719c5e9b163a419f5ee98ad479ed50d7b5edd00dc338ded367935a\": rpc error: code = NotFound desc = could not find container \"6cc08f81af719c5e9b163a419f5ee98ad479ed50d7b5edd00dc338ded367935a\": container with ID starting with 6cc08f81af719c5e9b163a419f5ee98ad479ed50d7b5edd00dc338ded367935a not found: ID does not exist" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.565838 4790 scope.go:117] "RemoveContainer" containerID="2a10d506eea8dee409a57244455f3da806e9d6afbf50e53bc9814cc29ff7651e" Apr 06 12:33:19 crc kubenswrapper[4790]: E0406 12:33:19.566434 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a10d506eea8dee409a57244455f3da806e9d6afbf50e53bc9814cc29ff7651e\": container with ID starting with 2a10d506eea8dee409a57244455f3da806e9d6afbf50e53bc9814cc29ff7651e not found: ID does not exist" containerID="2a10d506eea8dee409a57244455f3da806e9d6afbf50e53bc9814cc29ff7651e" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.566473 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a10d506eea8dee409a57244455f3da806e9d6afbf50e53bc9814cc29ff7651e"} err="failed to get container status \"2a10d506eea8dee409a57244455f3da806e9d6afbf50e53bc9814cc29ff7651e\": rpc error: code = NotFound desc = could not find container \"2a10d506eea8dee409a57244455f3da806e9d6afbf50e53bc9814cc29ff7651e\": container with ID 
starting with 2a10d506eea8dee409a57244455f3da806e9d6afbf50e53bc9814cc29ff7651e not found: ID does not exist" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.566496 4790 scope.go:117] "RemoveContainer" containerID="892aaf857dee40cf5155c60e320e1b3b8140b904e9b09b472b9b515d0dd9ae43" Apr 06 12:33:19 crc kubenswrapper[4790]: E0406 12:33:19.566785 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"892aaf857dee40cf5155c60e320e1b3b8140b904e9b09b472b9b515d0dd9ae43\": container with ID starting with 892aaf857dee40cf5155c60e320e1b3b8140b904e9b09b472b9b515d0dd9ae43 not found: ID does not exist" containerID="892aaf857dee40cf5155c60e320e1b3b8140b904e9b09b472b9b515d0dd9ae43" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.566812 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"892aaf857dee40cf5155c60e320e1b3b8140b904e9b09b472b9b515d0dd9ae43"} err="failed to get container status \"892aaf857dee40cf5155c60e320e1b3b8140b904e9b09b472b9b515d0dd9ae43\": rpc error: code = NotFound desc = could not find container \"892aaf857dee40cf5155c60e320e1b3b8140b904e9b09b472b9b515d0dd9ae43\": container with ID starting with 892aaf857dee40cf5155c60e320e1b3b8140b904e9b09b472b9b515d0dd9ae43 not found: ID does not exist" Apr 06 12:33:19 crc kubenswrapper[4790]: I0406 12:33:19.686670 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f4e993c-c669-4a24-9745-397c723c1fa3" path="/var/lib/kubelet/pods/2f4e993c-c669-4a24-9745-397c723c1fa3/volumes" Apr 06 12:33:39 crc kubenswrapper[4790]: I0406 12:33:39.753637 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:33:39 crc kubenswrapper[4790]: I0406 
12:33:39.754617 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:33:49 crc kubenswrapper[4790]: I0406 12:33:49.791586 4790 generic.go:334] "Generic (PLEG): container finished" podID="7809a0fa-81df-4e08-8a8f-84e070582795" containerID="4835e4df5a008000ec45e6ecce6c3105c6cc2e9affb95184aa4f132f04b0130b" exitCode=0 Apr 06 12:33:49 crc kubenswrapper[4790]: I0406 12:33:49.791712 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" event={"ID":"7809a0fa-81df-4e08-8a8f-84e070582795","Type":"ContainerDied","Data":"4835e4df5a008000ec45e6ecce6c3105c6cc2e9affb95184aa4f132f04b0130b"} Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.295259 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.380899 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7809a0fa-81df-4e08-8a8f-84e070582795-ovncontroller-config-0\") pod \"7809a0fa-81df-4e08-8a8f-84e070582795\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.381043 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxvnc\" (UniqueName: \"kubernetes.io/projected/7809a0fa-81df-4e08-8a8f-84e070582795-kube-api-access-kxvnc\") pod \"7809a0fa-81df-4e08-8a8f-84e070582795\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.381096 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-ovn-combined-ca-bundle\") pod \"7809a0fa-81df-4e08-8a8f-84e070582795\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.381178 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-ssh-key-openstack-edpm-ipam\") pod \"7809a0fa-81df-4e08-8a8f-84e070582795\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.381211 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-inventory\") pod \"7809a0fa-81df-4e08-8a8f-84e070582795\" (UID: \"7809a0fa-81df-4e08-8a8f-84e070582795\") " Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.389939 4790 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7809a0fa-81df-4e08-8a8f-84e070582795" (UID: "7809a0fa-81df-4e08-8a8f-84e070582795"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.390245 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7809a0fa-81df-4e08-8a8f-84e070582795-kube-api-access-kxvnc" (OuterVolumeSpecName: "kube-api-access-kxvnc") pod "7809a0fa-81df-4e08-8a8f-84e070582795" (UID: "7809a0fa-81df-4e08-8a8f-84e070582795"). InnerVolumeSpecName "kube-api-access-kxvnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.417776 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7809a0fa-81df-4e08-8a8f-84e070582795-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "7809a0fa-81df-4e08-8a8f-84e070582795" (UID: "7809a0fa-81df-4e08-8a8f-84e070582795"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.423070 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7809a0fa-81df-4e08-8a8f-84e070582795" (UID: "7809a0fa-81df-4e08-8a8f-84e070582795"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.426510 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-inventory" (OuterVolumeSpecName: "inventory") pod "7809a0fa-81df-4e08-8a8f-84e070582795" (UID: "7809a0fa-81df-4e08-8a8f-84e070582795"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.483049 4790 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7809a0fa-81df-4e08-8a8f-84e070582795-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.483074 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxvnc\" (UniqueName: \"kubernetes.io/projected/7809a0fa-81df-4e08-8a8f-84e070582795-kube-api-access-kxvnc\") on node \"crc\" DevicePath \"\"" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.483083 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.483092 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.483101 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7809a0fa-81df-4e08-8a8f-84e070582795-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.819767 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" event={"ID":"7809a0fa-81df-4e08-8a8f-84e070582795","Type":"ContainerDied","Data":"4acba59f9007de9dd1824a559f9954450d1cf5747d7502889785694fab620db0"} Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.820150 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4acba59f9007de9dd1824a559f9954450d1cf5747d7502889785694fab620db0" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.819861 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-nbzm5" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.998250 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"] Apr 06 12:33:51 crc kubenswrapper[4790]: E0406 12:33:51.998742 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4e993c-c669-4a24-9745-397c723c1fa3" containerName="extract-utilities" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.998767 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4e993c-c669-4a24-9745-397c723c1fa3" containerName="extract-utilities" Apr 06 12:33:51 crc kubenswrapper[4790]: E0406 12:33:51.998795 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4e993c-c669-4a24-9745-397c723c1fa3" containerName="registry-server" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.998804 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4e993c-c669-4a24-9745-397c723c1fa3" containerName="registry-server" Apr 06 12:33:51 crc kubenswrapper[4790]: E0406 12:33:51.998847 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4e993c-c669-4a24-9745-397c723c1fa3" containerName="extract-content" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.998855 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4e993c-c669-4a24-9745-397c723c1fa3" 
containerName="extract-content" Apr 06 12:33:51 crc kubenswrapper[4790]: E0406 12:33:51.998875 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7809a0fa-81df-4e08-8a8f-84e070582795" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.998882 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7809a0fa-81df-4e08-8a8f-84e070582795" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.999106 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7809a0fa-81df-4e08-8a8f-84e070582795" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Apr 06 12:33:51 crc kubenswrapper[4790]: I0406 12:33:51.999124 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f4e993c-c669-4a24-9745-397c723c1fa3" containerName="registry-server" Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:51.999955 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4" Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.002109 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.002254 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.003463 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.003489 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.003549 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.003472 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.019125 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"] Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.095126 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4" Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.095176 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.095212 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.095495 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.095685 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.095742 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45547\" (UniqueName: \"kubernetes.io/projected/9a366552-b5b9-4a9c-92a2-8b63981f5520-kube-api-access-45547\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.197213 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.197262 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.197309 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.197404 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.197487 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.197525 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45547\" (UniqueName: \"kubernetes.io/projected/9a366552-b5b9-4a9c-92a2-8b63981f5520-kube-api-access-45547\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.202337 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.203409 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.204237 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.207438 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.207526 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.213801 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45547\" (UniqueName: \"kubernetes.io/projected/9a366552-b5b9-4a9c-92a2-8b63981f5520-kube-api-access-45547\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.323742 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:33:52 crc kubenswrapper[4790]: I0406 12:33:52.885134 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"]
Apr 06 12:33:53 crc kubenswrapper[4790]: I0406 12:33:53.852696 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4" event={"ID":"9a366552-b5b9-4a9c-92a2-8b63981f5520","Type":"ContainerStarted","Data":"8b9976bada092fed82cdefdca82ef0dd6e6d1be0abffdf1a752ba137de7688c0"}
Apr 06 12:33:53 crc kubenswrapper[4790]: I0406 12:33:53.853160 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4" event={"ID":"9a366552-b5b9-4a9c-92a2-8b63981f5520","Type":"ContainerStarted","Data":"961da23755254fad934ee10b61aa50d95c9dc9a60e2037193e9468dda4d1a47d"}
Apr 06 12:33:53 crc kubenswrapper[4790]: I0406 12:33:53.881899 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4" podStartSLOduration=2.357230776 podStartE2EDuration="2.881876427s" podCreationTimestamp="2026-04-06 12:33:51 +0000 UTC" firstStartedPulling="2026-04-06 12:33:52.893694519 +0000 UTC m=+2211.881437395" lastFinishedPulling="2026-04-06 12:33:53.41834018 +0000 UTC m=+2212.406083046" observedRunningTime="2026-04-06 12:33:53.874089752 +0000 UTC m=+2212.861832628" watchObservedRunningTime="2026-04-06 12:33:53.881876427 +0000 UTC m=+2212.869619303"
Apr 06 12:34:00 crc kubenswrapper[4790]: I0406 12:34:00.154060 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591314-9xsqr"]
Apr 06 12:34:00 crc kubenswrapper[4790]: I0406 12:34:00.156186 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591314-9xsqr"
Apr 06 12:34:00 crc kubenswrapper[4790]: I0406 12:34:00.159236 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6"
Apr 06 12:34:00 crc kubenswrapper[4790]: I0406 12:34:00.159532 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 06 12:34:00 crc kubenswrapper[4790]: I0406 12:34:00.159696 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48hm8\" (UniqueName: \"kubernetes.io/projected/baafe0bb-787b-4172-8a37-1aff17823994-kube-api-access-48hm8\") pod \"auto-csr-approver-29591314-9xsqr\" (UID: \"baafe0bb-787b-4172-8a37-1aff17823994\") " pod="openshift-infra/auto-csr-approver-29591314-9xsqr"
Apr 06 12:34:00 crc kubenswrapper[4790]: I0406 12:34:00.162012 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 06 12:34:00 crc kubenswrapper[4790]: I0406 12:34:00.170532 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591314-9xsqr"]
Apr 06 12:34:00 crc kubenswrapper[4790]: I0406 12:34:00.261893 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48hm8\" (UniqueName: \"kubernetes.io/projected/baafe0bb-787b-4172-8a37-1aff17823994-kube-api-access-48hm8\") pod \"auto-csr-approver-29591314-9xsqr\" (UID: \"baafe0bb-787b-4172-8a37-1aff17823994\") " pod="openshift-infra/auto-csr-approver-29591314-9xsqr"
Apr 06 12:34:00 crc kubenswrapper[4790]: I0406 12:34:00.283499 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48hm8\" (UniqueName: \"kubernetes.io/projected/baafe0bb-787b-4172-8a37-1aff17823994-kube-api-access-48hm8\") pod \"auto-csr-approver-29591314-9xsqr\" (UID: \"baafe0bb-787b-4172-8a37-1aff17823994\") " pod="openshift-infra/auto-csr-approver-29591314-9xsqr"
Apr 06 12:34:00 crc kubenswrapper[4790]: I0406 12:34:00.484512 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591314-9xsqr"
Apr 06 12:34:00 crc kubenswrapper[4790]: I0406 12:34:00.919992 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591314-9xsqr"]
Apr 06 12:34:01 crc kubenswrapper[4790]: I0406 12:34:01.962590 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591314-9xsqr" event={"ID":"baafe0bb-787b-4172-8a37-1aff17823994","Type":"ContainerStarted","Data":"3d5ee0d2aa728ec2e839f4a4090c12b5ea4d91a3a9c41f64511b5c13717005b5"}
Apr 06 12:34:02 crc kubenswrapper[4790]: I0406 12:34:02.972965 4790 generic.go:334] "Generic (PLEG): container finished" podID="baafe0bb-787b-4172-8a37-1aff17823994" containerID="667cd17bd0dd7101687e51f74ca3f2a573d033f91e3e2e5a6a28c7dea6b0431d" exitCode=0
Apr 06 12:34:02 crc kubenswrapper[4790]: I0406 12:34:02.973157 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591314-9xsqr" event={"ID":"baafe0bb-787b-4172-8a37-1aff17823994","Type":"ContainerDied","Data":"667cd17bd0dd7101687e51f74ca3f2a573d033f91e3e2e5a6a28c7dea6b0431d"}
Apr 06 12:34:04 crc kubenswrapper[4790]: I0406 12:34:04.374994 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591314-9xsqr"
Apr 06 12:34:04 crc kubenswrapper[4790]: I0406 12:34:04.560352 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48hm8\" (UniqueName: \"kubernetes.io/projected/baafe0bb-787b-4172-8a37-1aff17823994-kube-api-access-48hm8\") pod \"baafe0bb-787b-4172-8a37-1aff17823994\" (UID: \"baafe0bb-787b-4172-8a37-1aff17823994\") "
Apr 06 12:34:04 crc kubenswrapper[4790]: I0406 12:34:04.565950 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baafe0bb-787b-4172-8a37-1aff17823994-kube-api-access-48hm8" (OuterVolumeSpecName: "kube-api-access-48hm8") pod "baafe0bb-787b-4172-8a37-1aff17823994" (UID: "baafe0bb-787b-4172-8a37-1aff17823994"). InnerVolumeSpecName "kube-api-access-48hm8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:34:04 crc kubenswrapper[4790]: I0406 12:34:04.663139 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48hm8\" (UniqueName: \"kubernetes.io/projected/baafe0bb-787b-4172-8a37-1aff17823994-kube-api-access-48hm8\") on node \"crc\" DevicePath \"\""
Apr 06 12:34:04 crc kubenswrapper[4790]: I0406 12:34:04.993205 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591314-9xsqr" event={"ID":"baafe0bb-787b-4172-8a37-1aff17823994","Type":"ContainerDied","Data":"3d5ee0d2aa728ec2e839f4a4090c12b5ea4d91a3a9c41f64511b5c13717005b5"}
Apr 06 12:34:04 crc kubenswrapper[4790]: I0406 12:34:04.993259 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d5ee0d2aa728ec2e839f4a4090c12b5ea4d91a3a9c41f64511b5c13717005b5"
Apr 06 12:34:04 crc kubenswrapper[4790]: I0406 12:34:04.993334 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591314-9xsqr"
Apr 06 12:34:05 crc kubenswrapper[4790]: I0406 12:34:05.491857 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591308-sl5rd"]
Apr 06 12:34:05 crc kubenswrapper[4790]: I0406 12:34:05.503874 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591308-sl5rd"]
Apr 06 12:34:05 crc kubenswrapper[4790]: I0406 12:34:05.691129 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38044cee-b6ff-4431-9681-cd498da4823c" path="/var/lib/kubelet/pods/38044cee-b6ff-4431-9681-cd498da4823c/volumes"
Apr 06 12:34:09 crc kubenswrapper[4790]: I0406 12:34:09.753752 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 06 12:34:09 crc kubenswrapper[4790]: I0406 12:34:09.754173 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 06 12:34:09 crc kubenswrapper[4790]: I0406 12:34:09.754245 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t"
Apr 06 12:34:09 crc kubenswrapper[4790]: I0406 12:34:09.755312 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"68fbfcd66871d7ee71b97cab5c0acf8682c2b58d064fbd4573573516aaae2e79"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Apr 06 12:34:09 crc kubenswrapper[4790]: I0406 12:34:09.755410 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://68fbfcd66871d7ee71b97cab5c0acf8682c2b58d064fbd4573573516aaae2e79" gracePeriod=600
Apr 06 12:34:10 crc kubenswrapper[4790]: I0406 12:34:10.302459 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="68fbfcd66871d7ee71b97cab5c0acf8682c2b58d064fbd4573573516aaae2e79" exitCode=0
Apr 06 12:34:10 crc kubenswrapper[4790]: I0406 12:34:10.303034 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"68fbfcd66871d7ee71b97cab5c0acf8682c2b58d064fbd4573573516aaae2e79"}
Apr 06 12:34:10 crc kubenswrapper[4790]: I0406 12:34:10.303064 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab"}
Apr 06 12:34:10 crc kubenswrapper[4790]: I0406 12:34:10.303083 4790 scope.go:117] "RemoveContainer" containerID="3a66133504d8ceb550a1ca6eacf010dc8ef82e8a9a3e52a736a83b59a8824435"
Apr 06 12:34:15 crc kubenswrapper[4790]: I0406 12:34:15.382643 4790 scope.go:117] "RemoveContainer" containerID="7150faec12f09400984e5d5cd9c429cd2d07623f925eb434687a8b511868e3dc"
Apr 06 12:34:41 crc kubenswrapper[4790]: I0406 12:34:41.625701 4790 generic.go:334] "Generic (PLEG): container finished" podID="9a366552-b5b9-4a9c-92a2-8b63981f5520" containerID="8b9976bada092fed82cdefdca82ef0dd6e6d1be0abffdf1a752ba137de7688c0" exitCode=0
Apr 06 12:34:41 crc kubenswrapper[4790]: I0406 12:34:41.625784 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4" event={"ID":"9a366552-b5b9-4a9c-92a2-8b63981f5520","Type":"ContainerDied","Data":"8b9976bada092fed82cdefdca82ef0dd6e6d1be0abffdf1a752ba137de7688c0"}
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.082975 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.095684 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-neutron-metadata-combined-ca-bundle\") pod \"9a366552-b5b9-4a9c-92a2-8b63981f5520\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") "
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.095750 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45547\" (UniqueName: \"kubernetes.io/projected/9a366552-b5b9-4a9c-92a2-8b63981f5520-kube-api-access-45547\") pod \"9a366552-b5b9-4a9c-92a2-8b63981f5520\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") "
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.095780 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-inventory\") pod \"9a366552-b5b9-4a9c-92a2-8b63981f5520\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") "
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.095841 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9a366552-b5b9-4a9c-92a2-8b63981f5520\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") "
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.095890 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-nova-metadata-neutron-config-0\") pod \"9a366552-b5b9-4a9c-92a2-8b63981f5520\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") "
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.095939 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-ssh-key-openstack-edpm-ipam\") pod \"9a366552-b5b9-4a9c-92a2-8b63981f5520\" (UID: \"9a366552-b5b9-4a9c-92a2-8b63981f5520\") "
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.102650 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a366552-b5b9-4a9c-92a2-8b63981f5520-kube-api-access-45547" (OuterVolumeSpecName: "kube-api-access-45547") pod "9a366552-b5b9-4a9c-92a2-8b63981f5520" (UID: "9a366552-b5b9-4a9c-92a2-8b63981f5520"). InnerVolumeSpecName "kube-api-access-45547". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.102868 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9a366552-b5b9-4a9c-92a2-8b63981f5520" (UID: "9a366552-b5b9-4a9c-92a2-8b63981f5520"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.132025 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9a366552-b5b9-4a9c-92a2-8b63981f5520" (UID: "9a366552-b5b9-4a9c-92a2-8b63981f5520"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.133635 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9a366552-b5b9-4a9c-92a2-8b63981f5520" (UID: "9a366552-b5b9-4a9c-92a2-8b63981f5520"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.146365 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-inventory" (OuterVolumeSpecName: "inventory") pod "9a366552-b5b9-4a9c-92a2-8b63981f5520" (UID: "9a366552-b5b9-4a9c-92a2-8b63981f5520"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.146976 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9a366552-b5b9-4a9c-92a2-8b63981f5520" (UID: "9a366552-b5b9-4a9c-92a2-8b63981f5520"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.197351 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45547\" (UniqueName: \"kubernetes.io/projected/9a366552-b5b9-4a9c-92a2-8b63981f5520-kube-api-access-45547\") on node \"crc\" DevicePath \"\""
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.197400 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-inventory\") on node \"crc\" DevicePath \"\""
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.197414 4790 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.197425 4790 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.197434 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.197444 4790 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a366552-b5b9-4a9c-92a2-8b63981f5520-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.649151 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4" event={"ID":"9a366552-b5b9-4a9c-92a2-8b63981f5520","Type":"ContainerDied","Data":"961da23755254fad934ee10b61aa50d95c9dc9a60e2037193e9468dda4d1a47d"}
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.649206 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="961da23755254fad934ee10b61aa50d95c9dc9a60e2037193e9468dda4d1a47d"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.649223 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.756179 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"]
Apr 06 12:34:43 crc kubenswrapper[4790]: E0406 12:34:43.756673 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baafe0bb-787b-4172-8a37-1aff17823994" containerName="oc"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.756695 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="baafe0bb-787b-4172-8a37-1aff17823994" containerName="oc"
Apr 06 12:34:43 crc kubenswrapper[4790]: E0406 12:34:43.756736 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a366552-b5b9-4a9c-92a2-8b63981f5520" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.756746 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a366552-b5b9-4a9c-92a2-8b63981f5520" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.756986 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="baafe0bb-787b-4172-8a37-1aff17823994" containerName="oc"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.757013 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a366552-b5b9-4a9c-92a2-8b63981f5520" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.757792 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.762466 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.762679 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.763312 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.763490 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.767618 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.783065 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"]
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.808133 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.808599 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.808680 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.808782 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.809035 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f2vg\" (UniqueName: \"kubernetes.io/projected/75d5d9f7-8482-4fb7-a536-55656709bec2-kube-api-access-8f2vg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.911116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f2vg\" (UniqueName: \"kubernetes.io/projected/75d5d9f7-8482-4fb7-a536-55656709bec2-kube-api-access-8f2vg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.911209 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.911310 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.911395 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.911449 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.919452 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.919485 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.919938 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.920568 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:43 crc kubenswrapper[4790]: I0406 12:34:43.929649 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f2vg\" (UniqueName: \"kubernetes.io/projected/75d5d9f7-8482-4fb7-a536-55656709bec2-kube-api-access-8f2vg\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-48qfz\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:44 crc kubenswrapper[4790]: I0406 12:34:44.077444 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"
Apr 06 12:34:44 crc kubenswrapper[4790]: I0406 12:34:44.643101 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz"]
Apr 06 12:34:44 crc kubenswrapper[4790]: I0406 12:34:44.658320 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz" event={"ID":"75d5d9f7-8482-4fb7-a536-55656709bec2","Type":"ContainerStarted","Data":"ce98702e5e0b20344c23f0fba729f511eca47f956e1b13bb9db3c0f49a5941c5"}
Apr 06 12:34:45 crc kubenswrapper[4790]: I0406 12:34:45.671307 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz" event={"ID":"75d5d9f7-8482-4fb7-a536-55656709bec2","Type":"ContainerStarted","Data":"616947385a0604863e0841214e22470b85a2db71ec3d9d4a12db28bbb59b21f4"}
Apr 06 12:34:45 crc kubenswrapper[4790]: I0406 12:34:45.697033 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz" podStartSLOduration=2.292614577 podStartE2EDuration="2.69700857s" podCreationTimestamp="2026-04-06 12:34:43 +0000 UTC" firstStartedPulling="2026-04-06 12:34:44.643126186 +0000 UTC m=+2263.630869052" lastFinishedPulling="2026-04-06 12:34:45.047520179 +0000 UTC m=+2264.035263045" observedRunningTime="2026-04-06 12:34:45.692472941 +0000 UTC m=+2264.680215817" watchObservedRunningTime="2026-04-06 12:34:45.69700857 +0000 UTC m=+2264.684751436"
Apr 06 12:36:00 crc kubenswrapper[4790]: I0406 12:36:00.141424 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591316-29nr6"]
Apr 06 12:36:00 crc kubenswrapper[4790]: I0406 12:36:00.143680 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591316-29nr6"
Apr 06 12:36:00 crc kubenswrapper[4790]: I0406 12:36:00.145857 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 06 12:36:00 crc kubenswrapper[4790]: I0406 12:36:00.146169 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 06 12:36:00 crc kubenswrapper[4790]: I0406 12:36:00.146755 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6"
Apr 06 12:36:00 crc kubenswrapper[4790]: I0406 12:36:00.159708 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591316-29nr6"]
Apr 06 12:36:00 crc kubenswrapper[4790]: I0406 12:36:00.320355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpvwx\" (UniqueName: \"kubernetes.io/projected/3585fa13-fcd0-4414-8b98-2f34f7716f85-kube-api-access-xpvwx\") pod \"auto-csr-approver-29591316-29nr6\" (UID: \"3585fa13-fcd0-4414-8b98-2f34f7716f85\") " pod="openshift-infra/auto-csr-approver-29591316-29nr6"
Apr 06 12:36:00 crc kubenswrapper[4790]: I0406 12:36:00.423357 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpvwx\" (UniqueName: \"kubernetes.io/projected/3585fa13-fcd0-4414-8b98-2f34f7716f85-kube-api-access-xpvwx\") pod \"auto-csr-approver-29591316-29nr6\" (UID: \"3585fa13-fcd0-4414-8b98-2f34f7716f85\") " pod="openshift-infra/auto-csr-approver-29591316-29nr6"
Apr 06 12:36:00 crc kubenswrapper[4790]: I0406 12:36:00.448745 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpvwx\" (UniqueName: 
\"kubernetes.io/projected/3585fa13-fcd0-4414-8b98-2f34f7716f85-kube-api-access-xpvwx\") pod \"auto-csr-approver-29591316-29nr6\" (UID: \"3585fa13-fcd0-4414-8b98-2f34f7716f85\") " pod="openshift-infra/auto-csr-approver-29591316-29nr6" Apr 06 12:36:00 crc kubenswrapper[4790]: I0406 12:36:00.470789 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591316-29nr6" Apr 06 12:36:00 crc kubenswrapper[4790]: I0406 12:36:00.935186 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591316-29nr6"] Apr 06 12:36:01 crc kubenswrapper[4790]: I0406 12:36:01.354318 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591316-29nr6" event={"ID":"3585fa13-fcd0-4414-8b98-2f34f7716f85","Type":"ContainerStarted","Data":"ba90c5b0c20a041e9c8557e18c40f6f7da2503f3eae56cc3d30ae88c603b6abb"} Apr 06 12:36:03 crc kubenswrapper[4790]: I0406 12:36:03.394152 4790 generic.go:334] "Generic (PLEG): container finished" podID="3585fa13-fcd0-4414-8b98-2f34f7716f85" containerID="47e66bc24e2538f5d11cb18b13e5512b059a5cbe4070cf1dac04302c457712df" exitCode=0 Apr 06 12:36:03 crc kubenswrapper[4790]: I0406 12:36:03.394436 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591316-29nr6" event={"ID":"3585fa13-fcd0-4414-8b98-2f34f7716f85","Type":"ContainerDied","Data":"47e66bc24e2538f5d11cb18b13e5512b059a5cbe4070cf1dac04302c457712df"} Apr 06 12:36:04 crc kubenswrapper[4790]: I0406 12:36:04.777040 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591316-29nr6" Apr 06 12:36:04 crc kubenswrapper[4790]: I0406 12:36:04.818538 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpvwx\" (UniqueName: \"kubernetes.io/projected/3585fa13-fcd0-4414-8b98-2f34f7716f85-kube-api-access-xpvwx\") pod \"3585fa13-fcd0-4414-8b98-2f34f7716f85\" (UID: \"3585fa13-fcd0-4414-8b98-2f34f7716f85\") " Apr 06 12:36:04 crc kubenswrapper[4790]: I0406 12:36:04.824691 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3585fa13-fcd0-4414-8b98-2f34f7716f85-kube-api-access-xpvwx" (OuterVolumeSpecName: "kube-api-access-xpvwx") pod "3585fa13-fcd0-4414-8b98-2f34f7716f85" (UID: "3585fa13-fcd0-4414-8b98-2f34f7716f85"). InnerVolumeSpecName "kube-api-access-xpvwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:36:04 crc kubenswrapper[4790]: I0406 12:36:04.920294 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpvwx\" (UniqueName: \"kubernetes.io/projected/3585fa13-fcd0-4414-8b98-2f34f7716f85-kube-api-access-xpvwx\") on node \"crc\" DevicePath \"\"" Apr 06 12:36:05 crc kubenswrapper[4790]: I0406 12:36:05.414600 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591316-29nr6" event={"ID":"3585fa13-fcd0-4414-8b98-2f34f7716f85","Type":"ContainerDied","Data":"ba90c5b0c20a041e9c8557e18c40f6f7da2503f3eae56cc3d30ae88c603b6abb"} Apr 06 12:36:05 crc kubenswrapper[4790]: I0406 12:36:05.414636 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba90c5b0c20a041e9c8557e18c40f6f7da2503f3eae56cc3d30ae88c603b6abb" Apr 06 12:36:05 crc kubenswrapper[4790]: I0406 12:36:05.414671 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591316-29nr6" Apr 06 12:36:05 crc kubenswrapper[4790]: I0406 12:36:05.858136 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591310-ws5pl"] Apr 06 12:36:05 crc kubenswrapper[4790]: I0406 12:36:05.866766 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591310-ws5pl"] Apr 06 12:36:07 crc kubenswrapper[4790]: I0406 12:36:07.689443 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e2f37ed-b06f-41fd-a7c8-75dc97797570" path="/var/lib/kubelet/pods/4e2f37ed-b06f-41fd-a7c8-75dc97797570/volumes" Apr 06 12:36:15 crc kubenswrapper[4790]: I0406 12:36:15.526536 4790 scope.go:117] "RemoveContainer" containerID="84e0b799ed805c54b484b868eb01db86ef3ee6fad0c0bf2a7dc9f559fc3ae4b6" Apr 06 12:36:39 crc kubenswrapper[4790]: I0406 12:36:39.753450 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:36:39 crc kubenswrapper[4790]: I0406 12:36:39.753898 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:37:09 crc kubenswrapper[4790]: I0406 12:37:09.753337 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:37:09 crc kubenswrapper[4790]: 
I0406 12:37:09.754053 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:37:39 crc kubenswrapper[4790]: I0406 12:37:39.753412 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:37:39 crc kubenswrapper[4790]: I0406 12:37:39.754897 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:37:39 crc kubenswrapper[4790]: I0406 12:37:39.755039 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 12:37:39 crc kubenswrapper[4790]: I0406 12:37:39.756000 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 12:37:39 crc kubenswrapper[4790]: I0406 12:37:39.756199 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" 
containerName="machine-config-daemon" containerID="cri-o://b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" gracePeriod=600 Apr 06 12:37:39 crc kubenswrapper[4790]: E0406 12:37:39.874869 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:37:40 crc kubenswrapper[4790]: I0406 12:37:40.347727 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" exitCode=0 Apr 06 12:37:40 crc kubenswrapper[4790]: I0406 12:37:40.347771 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab"} Apr 06 12:37:40 crc kubenswrapper[4790]: I0406 12:37:40.347809 4790 scope.go:117] "RemoveContainer" containerID="68fbfcd66871d7ee71b97cab5c0acf8682c2b58d064fbd4573573516aaae2e79" Apr 06 12:37:40 crc kubenswrapper[4790]: I0406 12:37:40.348468 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:37:40 crc kubenswrapper[4790]: E0406 12:37:40.348949 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:37:51 crc kubenswrapper[4790]: I0406 12:37:51.681240 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:37:51 crc kubenswrapper[4790]: E0406 12:37:51.682886 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:38:00 crc kubenswrapper[4790]: I0406 12:38:00.151969 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591318-47zgd"] Apr 06 12:38:00 crc kubenswrapper[4790]: E0406 12:38:00.152902 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3585fa13-fcd0-4414-8b98-2f34f7716f85" containerName="oc" Apr 06 12:38:00 crc kubenswrapper[4790]: I0406 12:38:00.152917 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3585fa13-fcd0-4414-8b98-2f34f7716f85" containerName="oc" Apr 06 12:38:00 crc kubenswrapper[4790]: I0406 12:38:00.153198 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3585fa13-fcd0-4414-8b98-2f34f7716f85" containerName="oc" Apr 06 12:38:00 crc kubenswrapper[4790]: I0406 12:38:00.155140 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591318-47zgd" Apr 06 12:38:00 crc kubenswrapper[4790]: I0406 12:38:00.157806 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:38:00 crc kubenswrapper[4790]: I0406 12:38:00.158201 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:38:00 crc kubenswrapper[4790]: I0406 12:38:00.160744 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:38:00 crc kubenswrapper[4790]: I0406 12:38:00.166382 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591318-47zgd"] Apr 06 12:38:00 crc kubenswrapper[4790]: I0406 12:38:00.339785 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzj8w\" (UniqueName: \"kubernetes.io/projected/e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd-kube-api-access-mzj8w\") pod \"auto-csr-approver-29591318-47zgd\" (UID: \"e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd\") " pod="openshift-infra/auto-csr-approver-29591318-47zgd" Apr 06 12:38:00 crc kubenswrapper[4790]: I0406 12:38:00.441324 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzj8w\" (UniqueName: \"kubernetes.io/projected/e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd-kube-api-access-mzj8w\") pod \"auto-csr-approver-29591318-47zgd\" (UID: \"e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd\") " pod="openshift-infra/auto-csr-approver-29591318-47zgd" Apr 06 12:38:00 crc kubenswrapper[4790]: I0406 12:38:00.468964 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzj8w\" (UniqueName: \"kubernetes.io/projected/e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd-kube-api-access-mzj8w\") pod \"auto-csr-approver-29591318-47zgd\" (UID: \"e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd\") " 
pod="openshift-infra/auto-csr-approver-29591318-47zgd" Apr 06 12:38:00 crc kubenswrapper[4790]: I0406 12:38:00.498928 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591318-47zgd" Apr 06 12:38:00 crc kubenswrapper[4790]: I0406 12:38:00.963089 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591318-47zgd"] Apr 06 12:38:00 crc kubenswrapper[4790]: I0406 12:38:00.964849 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 12:38:01 crc kubenswrapper[4790]: I0406 12:38:01.553134 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591318-47zgd" event={"ID":"e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd","Type":"ContainerStarted","Data":"52b3d812026cb98deaf5b7571ad0ad27d6816d086008ead4f0f398551b3d1f52"} Apr 06 12:38:03 crc kubenswrapper[4790]: I0406 12:38:03.582094 4790 generic.go:334] "Generic (PLEG): container finished" podID="e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd" containerID="d6df92234f7624c3fb55825bee561fe4575ae43fb8a0dac659ab3ea3c296302e" exitCode=0 Apr 06 12:38:03 crc kubenswrapper[4790]: I0406 12:38:03.582564 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591318-47zgd" event={"ID":"e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd","Type":"ContainerDied","Data":"d6df92234f7624c3fb55825bee561fe4575ae43fb8a0dac659ab3ea3c296302e"} Apr 06 12:38:04 crc kubenswrapper[4790]: I0406 12:38:04.961675 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591318-47zgd" Apr 06 12:38:05 crc kubenswrapper[4790]: I0406 12:38:05.136991 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzj8w\" (UniqueName: \"kubernetes.io/projected/e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd-kube-api-access-mzj8w\") pod \"e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd\" (UID: \"e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd\") " Apr 06 12:38:05 crc kubenswrapper[4790]: I0406 12:38:05.146218 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd-kube-api-access-mzj8w" (OuterVolumeSpecName: "kube-api-access-mzj8w") pod "e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd" (UID: "e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd"). InnerVolumeSpecName "kube-api-access-mzj8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:38:05 crc kubenswrapper[4790]: I0406 12:38:05.239072 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzj8w\" (UniqueName: \"kubernetes.io/projected/e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd-kube-api-access-mzj8w\") on node \"crc\" DevicePath \"\"" Apr 06 12:38:05 crc kubenswrapper[4790]: I0406 12:38:05.615516 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591318-47zgd" event={"ID":"e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd","Type":"ContainerDied","Data":"52b3d812026cb98deaf5b7571ad0ad27d6816d086008ead4f0f398551b3d1f52"} Apr 06 12:38:05 crc kubenswrapper[4790]: I0406 12:38:05.615556 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52b3d812026cb98deaf5b7571ad0ad27d6816d086008ead4f0f398551b3d1f52" Apr 06 12:38:05 crc kubenswrapper[4790]: I0406 12:38:05.616873 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591318-47zgd" Apr 06 12:38:05 crc kubenswrapper[4790]: I0406 12:38:05.676432 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:38:05 crc kubenswrapper[4790]: E0406 12:38:05.677096 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:38:06 crc kubenswrapper[4790]: I0406 12:38:06.028289 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591312-ppb7c"] Apr 06 12:38:06 crc kubenswrapper[4790]: I0406 12:38:06.037578 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591312-ppb7c"] Apr 06 12:38:07 crc kubenswrapper[4790]: I0406 12:38:07.705618 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6911cb4d-2dc4-40d9-8459-2b335752b5cb" path="/var/lib/kubelet/pods/6911cb4d-2dc4-40d9-8459-2b335752b5cb/volumes" Apr 06 12:38:15 crc kubenswrapper[4790]: I0406 12:38:15.640181 4790 scope.go:117] "RemoveContainer" containerID="0c80affad409c38d4bcaff0e4a853f4847312868952ecdc69d05e2a7f3e261f5" Apr 06 12:38:18 crc kubenswrapper[4790]: I0406 12:38:18.675360 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:38:18 crc kubenswrapper[4790]: E0406 12:38:18.675920 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:38:31 crc kubenswrapper[4790]: I0406 12:38:31.684088 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:38:31 crc kubenswrapper[4790]: E0406 12:38:31.685042 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:38:32 crc kubenswrapper[4790]: I0406 12:38:32.920083 4790 generic.go:334] "Generic (PLEG): container finished" podID="75d5d9f7-8482-4fb7-a536-55656709bec2" containerID="616947385a0604863e0841214e22470b85a2db71ec3d9d4a12db28bbb59b21f4" exitCode=0 Apr 06 12:38:32 crc kubenswrapper[4790]: I0406 12:38:32.920143 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz" event={"ID":"75d5d9f7-8482-4fb7-a536-55656709bec2","Type":"ContainerDied","Data":"616947385a0604863e0841214e22470b85a2db71ec3d9d4a12db28bbb59b21f4"} Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.356747 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz" Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.475711 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f2vg\" (UniqueName: \"kubernetes.io/projected/75d5d9f7-8482-4fb7-a536-55656709bec2-kube-api-access-8f2vg\") pod \"75d5d9f7-8482-4fb7-a536-55656709bec2\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.475757 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-inventory\") pod \"75d5d9f7-8482-4fb7-a536-55656709bec2\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.475812 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-ssh-key-openstack-edpm-ipam\") pod \"75d5d9f7-8482-4fb7-a536-55656709bec2\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.475858 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-libvirt-combined-ca-bundle\") pod \"75d5d9f7-8482-4fb7-a536-55656709bec2\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.475940 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-libvirt-secret-0\") pod \"75d5d9f7-8482-4fb7-a536-55656709bec2\" (UID: \"75d5d9f7-8482-4fb7-a536-55656709bec2\") " Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.481913 4790 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "75d5d9f7-8482-4fb7-a536-55656709bec2" (UID: "75d5d9f7-8482-4fb7-a536-55656709bec2"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.491905 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d5d9f7-8482-4fb7-a536-55656709bec2-kube-api-access-8f2vg" (OuterVolumeSpecName: "kube-api-access-8f2vg") pod "75d5d9f7-8482-4fb7-a536-55656709bec2" (UID: "75d5d9f7-8482-4fb7-a536-55656709bec2"). InnerVolumeSpecName "kube-api-access-8f2vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.506578 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "75d5d9f7-8482-4fb7-a536-55656709bec2" (UID: "75d5d9f7-8482-4fb7-a536-55656709bec2"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.507135 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "75d5d9f7-8482-4fb7-a536-55656709bec2" (UID: "75d5d9f7-8482-4fb7-a536-55656709bec2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.508001 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-inventory" (OuterVolumeSpecName: "inventory") pod "75d5d9f7-8482-4fb7-a536-55656709bec2" (UID: "75d5d9f7-8482-4fb7-a536-55656709bec2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.580430 4790 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.580468 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f2vg\" (UniqueName: \"kubernetes.io/projected/75d5d9f7-8482-4fb7-a536-55656709bec2-kube-api-access-8f2vg\") on node \"crc\" DevicePath \"\"" Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.580479 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.580487 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.580501 4790 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d5d9f7-8482-4fb7-a536-55656709bec2-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.945984 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz" event={"ID":"75d5d9f7-8482-4fb7-a536-55656709bec2","Type":"ContainerDied","Data":"ce98702e5e0b20344c23f0fba729f511eca47f956e1b13bb9db3c0f49a5941c5"} Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.946237 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce98702e5e0b20344c23f0fba729f511eca47f956e1b13bb9db3c0f49a5941c5" Apr 06 12:38:34 crc kubenswrapper[4790]: I0406 12:38:34.946077 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-48qfz" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.051999 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v"] Apr 06 12:38:35 crc kubenswrapper[4790]: E0406 12:38:35.052435 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d5d9f7-8482-4fb7-a536-55656709bec2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.052455 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d5d9f7-8482-4fb7-a536-55656709bec2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Apr 06 12:38:35 crc kubenswrapper[4790]: E0406 12:38:35.052464 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd" containerName="oc" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.052471 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd" containerName="oc" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.052666 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d5d9f7-8482-4fb7-a536-55656709bec2" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.052679 4790 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd" containerName="oc" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.053307 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.054723 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.055550 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.055667 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.055958 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.057714 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.057814 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.063467 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.080543 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v"] Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.193332 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-migration-ssh-key-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.193413 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.193444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.193484 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.193531 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 
12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.193582 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.193633 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjnz4\" (UniqueName: \"kubernetes.io/projected/9be120eb-568b-4ab4-af61-b92818e7e6ad-kube-api-access-mjnz4\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.193677 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.193719 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.193741 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.193757 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.295416 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.295786 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjnz4\" (UniqueName: \"kubernetes.io/projected/9be120eb-568b-4ab4-af61-b92818e7e6ad-kube-api-access-mjnz4\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.295921 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: 
\"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.296034 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.296158 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.296250 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.296918 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.297462 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.297577 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.297678 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.297778 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.297579 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: 
\"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.302612 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.302794 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.303222 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.303439 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.303581 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.303699 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.303800 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.305456 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.313425 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjnz4\" (UniqueName: \"kubernetes.io/projected/9be120eb-568b-4ab4-af61-b92818e7e6ad-kube-api-access-mjnz4\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.313951 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-w9z4v\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.378770 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.900634 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v"] Apr 06 12:38:35 crc kubenswrapper[4790]: I0406 12:38:35.956921 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" event={"ID":"9be120eb-568b-4ab4-af61-b92818e7e6ad","Type":"ContainerStarted","Data":"ab87f262a8034f4b64c8088411ee295e1768a0de93481b1aa6b39efd8724109d"} Apr 06 12:38:36 crc kubenswrapper[4790]: I0406 12:38:36.968437 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" event={"ID":"9be120eb-568b-4ab4-af61-b92818e7e6ad","Type":"ContainerStarted","Data":"9b117d3923aef01fff9295b10e9a6e7216daaa05a6c7b06dcbc6b65f0ff9f49c"} Apr 06 12:38:36 crc kubenswrapper[4790]: I0406 12:38:36.997481 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" podStartSLOduration=1.617505371 podStartE2EDuration="1.997458016s" podCreationTimestamp="2026-04-06 12:38:35 +0000 UTC" firstStartedPulling="2026-04-06 12:38:35.905453871 +0000 UTC m=+2494.893196737" lastFinishedPulling="2026-04-06 
12:38:36.285406516 +0000 UTC m=+2495.273149382" observedRunningTime="2026-04-06 12:38:36.99166451 +0000 UTC m=+2495.979407386" watchObservedRunningTime="2026-04-06 12:38:36.997458016 +0000 UTC m=+2495.985200882" Apr 06 12:38:46 crc kubenswrapper[4790]: I0406 12:38:46.675055 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:38:46 crc kubenswrapper[4790]: E0406 12:38:46.676053 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:38:57 crc kubenswrapper[4790]: I0406 12:38:57.677565 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:38:57 crc kubenswrapper[4790]: E0406 12:38:57.678548 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:39:12 crc kubenswrapper[4790]: I0406 12:39:12.676386 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:39:12 crc kubenswrapper[4790]: E0406 12:39:12.677177 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:39:24 crc kubenswrapper[4790]: I0406 12:39:24.676275 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:39:24 crc kubenswrapper[4790]: E0406 12:39:24.677069 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:39:38 crc kubenswrapper[4790]: I0406 12:39:38.676096 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:39:38 crc kubenswrapper[4790]: E0406 12:39:38.677937 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:39:52 crc kubenswrapper[4790]: I0406 12:39:52.676142 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:39:52 crc kubenswrapper[4790]: E0406 12:39:52.678290 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:40:00 crc kubenswrapper[4790]: I0406 12:40:00.146378 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591320-flf5j"] Apr 06 12:40:00 crc kubenswrapper[4790]: I0406 12:40:00.149213 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591320-flf5j" Apr 06 12:40:00 crc kubenswrapper[4790]: I0406 12:40:00.151803 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:40:00 crc kubenswrapper[4790]: I0406 12:40:00.152072 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:40:00 crc kubenswrapper[4790]: I0406 12:40:00.156142 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:40:00 crc kubenswrapper[4790]: I0406 12:40:00.160749 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591320-flf5j"] Apr 06 12:40:00 crc kubenswrapper[4790]: I0406 12:40:00.172323 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w5xx\" (UniqueName: \"kubernetes.io/projected/a6920e81-7fd5-418a-966e-6b81a157ffa1-kube-api-access-2w5xx\") pod \"auto-csr-approver-29591320-flf5j\" (UID: \"a6920e81-7fd5-418a-966e-6b81a157ffa1\") " pod="openshift-infra/auto-csr-approver-29591320-flf5j" Apr 06 12:40:00 crc kubenswrapper[4790]: I0406 12:40:00.274280 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w5xx\" (UniqueName: 
\"kubernetes.io/projected/a6920e81-7fd5-418a-966e-6b81a157ffa1-kube-api-access-2w5xx\") pod \"auto-csr-approver-29591320-flf5j\" (UID: \"a6920e81-7fd5-418a-966e-6b81a157ffa1\") " pod="openshift-infra/auto-csr-approver-29591320-flf5j" Apr 06 12:40:00 crc kubenswrapper[4790]: I0406 12:40:00.298821 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w5xx\" (UniqueName: \"kubernetes.io/projected/a6920e81-7fd5-418a-966e-6b81a157ffa1-kube-api-access-2w5xx\") pod \"auto-csr-approver-29591320-flf5j\" (UID: \"a6920e81-7fd5-418a-966e-6b81a157ffa1\") " pod="openshift-infra/auto-csr-approver-29591320-flf5j" Apr 06 12:40:00 crc kubenswrapper[4790]: I0406 12:40:00.475069 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591320-flf5j" Apr 06 12:40:00 crc kubenswrapper[4790]: I0406 12:40:00.958052 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591320-flf5j"] Apr 06 12:40:01 crc kubenswrapper[4790]: I0406 12:40:01.857644 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591320-flf5j" event={"ID":"a6920e81-7fd5-418a-966e-6b81a157ffa1","Type":"ContainerStarted","Data":"5ca5936e588620772654dd15a4a863d302b3e52c7b1c3c6f6a55ac7bc3ea0a17"} Apr 06 12:40:02 crc kubenswrapper[4790]: I0406 12:40:02.869942 4790 generic.go:334] "Generic (PLEG): container finished" podID="a6920e81-7fd5-418a-966e-6b81a157ffa1" containerID="84c256f14e93386c207552bf2d69080015e53340a4a7d8d893de6b41c0794f2a" exitCode=0 Apr 06 12:40:02 crc kubenswrapper[4790]: I0406 12:40:02.870192 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591320-flf5j" event={"ID":"a6920e81-7fd5-418a-966e-6b81a157ffa1","Type":"ContainerDied","Data":"84c256f14e93386c207552bf2d69080015e53340a4a7d8d893de6b41c0794f2a"} Apr 06 12:40:04 crc kubenswrapper[4790]: I0406 12:40:04.296363 4790 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591320-flf5j" Apr 06 12:40:04 crc kubenswrapper[4790]: I0406 12:40:04.370293 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w5xx\" (UniqueName: \"kubernetes.io/projected/a6920e81-7fd5-418a-966e-6b81a157ffa1-kube-api-access-2w5xx\") pod \"a6920e81-7fd5-418a-966e-6b81a157ffa1\" (UID: \"a6920e81-7fd5-418a-966e-6b81a157ffa1\") " Apr 06 12:40:04 crc kubenswrapper[4790]: I0406 12:40:04.377120 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6920e81-7fd5-418a-966e-6b81a157ffa1-kube-api-access-2w5xx" (OuterVolumeSpecName: "kube-api-access-2w5xx") pod "a6920e81-7fd5-418a-966e-6b81a157ffa1" (UID: "a6920e81-7fd5-418a-966e-6b81a157ffa1"). InnerVolumeSpecName "kube-api-access-2w5xx". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:40:04 crc kubenswrapper[4790]: I0406 12:40:04.472881 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w5xx\" (UniqueName: \"kubernetes.io/projected/a6920e81-7fd5-418a-966e-6b81a157ffa1-kube-api-access-2w5xx\") on node \"crc\" DevicePath \"\"" Apr 06 12:40:04 crc kubenswrapper[4790]: I0406 12:40:04.892535 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591320-flf5j" event={"ID":"a6920e81-7fd5-418a-966e-6b81a157ffa1","Type":"ContainerDied","Data":"5ca5936e588620772654dd15a4a863d302b3e52c7b1c3c6f6a55ac7bc3ea0a17"} Apr 06 12:40:04 crc kubenswrapper[4790]: I0406 12:40:04.892912 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ca5936e588620772654dd15a4a863d302b3e52c7b1c3c6f6a55ac7bc3ea0a17" Apr 06 12:40:04 crc kubenswrapper[4790]: I0406 12:40:04.892620 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591320-flf5j" Apr 06 12:40:05 crc kubenswrapper[4790]: I0406 12:40:05.364576 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591314-9xsqr"] Apr 06 12:40:05 crc kubenswrapper[4790]: I0406 12:40:05.375155 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591314-9xsqr"] Apr 06 12:40:05 crc kubenswrapper[4790]: I0406 12:40:05.676377 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:40:05 crc kubenswrapper[4790]: E0406 12:40:05.676663 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:40:05 crc kubenswrapper[4790]: I0406 12:40:05.688518 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baafe0bb-787b-4172-8a37-1aff17823994" path="/var/lib/kubelet/pods/baafe0bb-787b-4172-8a37-1aff17823994/volumes" Apr 06 12:40:15 crc kubenswrapper[4790]: I0406 12:40:15.767498 4790 scope.go:117] "RemoveContainer" containerID="667cd17bd0dd7101687e51f74ca3f2a573d033f91e3e2e5a6a28c7dea6b0431d" Apr 06 12:40:17 crc kubenswrapper[4790]: I0406 12:40:17.676067 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:40:17 crc kubenswrapper[4790]: E0406 12:40:17.676538 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:40:28 crc kubenswrapper[4790]: I0406 12:40:28.675026 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:40:28 crc kubenswrapper[4790]: E0406 12:40:28.675716 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:40:43 crc kubenswrapper[4790]: I0406 12:40:43.675788 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:40:43 crc kubenswrapper[4790]: E0406 12:40:43.676599 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:40:54 crc kubenswrapper[4790]: I0406 12:40:54.386023 4790 generic.go:334] "Generic (PLEG): container finished" podID="9be120eb-568b-4ab4-af61-b92818e7e6ad" containerID="9b117d3923aef01fff9295b10e9a6e7216daaa05a6c7b06dcbc6b65f0ff9f49c" exitCode=0 Apr 06 12:40:54 crc kubenswrapper[4790]: I0406 12:40:54.386522 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" event={"ID":"9be120eb-568b-4ab4-af61-b92818e7e6ad","Type":"ContainerDied","Data":"9b117d3923aef01fff9295b10e9a6e7216daaa05a6c7b06dcbc6b65f0ff9f49c"} Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.675572 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:40:55 crc kubenswrapper[4790]: E0406 12:40:55.676097 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.889300 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.957382 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-extra-config-0\") pod \"9be120eb-568b-4ab4-af61-b92818e7e6ad\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.957774 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-migration-ssh-key-1\") pod \"9be120eb-568b-4ab4-af61-b92818e7e6ad\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.957798 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-combined-ca-bundle\") pod \"9be120eb-568b-4ab4-af61-b92818e7e6ad\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.957843 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-2\") pod \"9be120eb-568b-4ab4-af61-b92818e7e6ad\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.957894 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjnz4\" (UniqueName: \"kubernetes.io/projected/9be120eb-568b-4ab4-af61-b92818e7e6ad-kube-api-access-mjnz4\") pod \"9be120eb-568b-4ab4-af61-b92818e7e6ad\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.957912 
4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-migration-ssh-key-0\") pod \"9be120eb-568b-4ab4-af61-b92818e7e6ad\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.957942 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-1\") pod \"9be120eb-568b-4ab4-af61-b92818e7e6ad\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.958922 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-inventory\") pod \"9be120eb-568b-4ab4-af61-b92818e7e6ad\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.958965 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-0\") pod \"9be120eb-568b-4ab4-af61-b92818e7e6ad\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.959044 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-ssh-key-openstack-edpm-ipam\") pod \"9be120eb-568b-4ab4-af61-b92818e7e6ad\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.959093 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-3\") pod \"9be120eb-568b-4ab4-af61-b92818e7e6ad\" (UID: \"9be120eb-568b-4ab4-af61-b92818e7e6ad\") " Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.981311 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9be120eb-568b-4ab4-af61-b92818e7e6ad" (UID: "9be120eb-568b-4ab4-af61-b92818e7e6ad"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.981363 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9be120eb-568b-4ab4-af61-b92818e7e6ad-kube-api-access-mjnz4" (OuterVolumeSpecName: "kube-api-access-mjnz4") pod "9be120eb-568b-4ab4-af61-b92818e7e6ad" (UID: "9be120eb-568b-4ab4-af61-b92818e7e6ad"). InnerVolumeSpecName "kube-api-access-mjnz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.988211 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "9be120eb-568b-4ab4-af61-b92818e7e6ad" (UID: "9be120eb-568b-4ab4-af61-b92818e7e6ad"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.988381 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "9be120eb-568b-4ab4-af61-b92818e7e6ad" (UID: "9be120eb-568b-4ab4-af61-b92818e7e6ad"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.993536 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "9be120eb-568b-4ab4-af61-b92818e7e6ad" (UID: "9be120eb-568b-4ab4-af61-b92818e7e6ad"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.994727 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "9be120eb-568b-4ab4-af61-b92818e7e6ad" (UID: "9be120eb-568b-4ab4-af61-b92818e7e6ad"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.994785 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "9be120eb-568b-4ab4-af61-b92818e7e6ad" (UID: "9be120eb-568b-4ab4-af61-b92818e7e6ad"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.995667 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "9be120eb-568b-4ab4-af61-b92818e7e6ad" (UID: "9be120eb-568b-4ab4-af61-b92818e7e6ad"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:40:55 crc kubenswrapper[4790]: I0406 12:40:55.996090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "9be120eb-568b-4ab4-af61-b92818e7e6ad" (UID: "9be120eb-568b-4ab4-af61-b92818e7e6ad"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.007588 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9be120eb-568b-4ab4-af61-b92818e7e6ad" (UID: "9be120eb-568b-4ab4-af61-b92818e7e6ad"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.014860 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-inventory" (OuterVolumeSpecName: "inventory") pod "9be120eb-568b-4ab4-af61-b92818e7e6ad" (UID: "9be120eb-568b-4ab4-af61-b92818e7e6ad"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.061379 4790 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.061425 4790 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.061442 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.061457 4790 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.061473 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.061486 4790 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.061501 4790 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.061512 4790 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.061525 4790 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.061536 4790 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/9be120eb-568b-4ab4-af61-b92818e7e6ad-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.061547 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjnz4\" (UniqueName: \"kubernetes.io/projected/9be120eb-568b-4ab4-af61-b92818e7e6ad-kube-api-access-mjnz4\") on node \"crc\" DevicePath \"\"" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.335215 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zp9dd"] Apr 06 12:40:56 crc kubenswrapper[4790]: E0406 12:40:56.336030 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6920e81-7fd5-418a-966e-6b81a157ffa1" containerName="oc" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.336054 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6920e81-7fd5-418a-966e-6b81a157ffa1" containerName="oc" Apr 06 12:40:56 crc kubenswrapper[4790]: E0406 12:40:56.336083 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9be120eb-568b-4ab4-af61-b92818e7e6ad" 
containerName="nova-edpm-deployment-openstack-edpm-ipam" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.336116 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be120eb-568b-4ab4-af61-b92818e7e6ad" containerName="nova-edpm-deployment-openstack-edpm-ipam" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.336477 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6920e81-7fd5-418a-966e-6b81a157ffa1" containerName="oc" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.336515 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9be120eb-568b-4ab4-af61-b92818e7e6ad" containerName="nova-edpm-deployment-openstack-edpm-ipam" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.338339 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.367213 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d3be496-8472-4397-aa55-91af9e207903-catalog-content\") pod \"certified-operators-zp9dd\" (UID: \"7d3be496-8472-4397-aa55-91af9e207903\") " pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.367317 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdzhc\" (UniqueName: \"kubernetes.io/projected/7d3be496-8472-4397-aa55-91af9e207903-kube-api-access-bdzhc\") pod \"certified-operators-zp9dd\" (UID: \"7d3be496-8472-4397-aa55-91af9e207903\") " pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.367381 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d3be496-8472-4397-aa55-91af9e207903-utilities\") pod 
\"certified-operators-zp9dd\" (UID: \"7d3be496-8472-4397-aa55-91af9e207903\") " pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.369194 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zp9dd"] Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.404134 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" event={"ID":"9be120eb-568b-4ab4-af61-b92818e7e6ad","Type":"ContainerDied","Data":"ab87f262a8034f4b64c8088411ee295e1768a0de93481b1aa6b39efd8724109d"} Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.404183 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab87f262a8034f4b64c8088411ee295e1768a0de93481b1aa6b39efd8724109d" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.404194 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-w9z4v" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.468771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdzhc\" (UniqueName: \"kubernetes.io/projected/7d3be496-8472-4397-aa55-91af9e207903-kube-api-access-bdzhc\") pod \"certified-operators-zp9dd\" (UID: \"7d3be496-8472-4397-aa55-91af9e207903\") " pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.468871 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d3be496-8472-4397-aa55-91af9e207903-utilities\") pod \"certified-operators-zp9dd\" (UID: \"7d3be496-8472-4397-aa55-91af9e207903\") " pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.468947 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d3be496-8472-4397-aa55-91af9e207903-catalog-content\") pod \"certified-operators-zp9dd\" (UID: \"7d3be496-8472-4397-aa55-91af9e207903\") " pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.469420 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d3be496-8472-4397-aa55-91af9e207903-catalog-content\") pod \"certified-operators-zp9dd\" (UID: \"7d3be496-8472-4397-aa55-91af9e207903\") " pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.469634 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d3be496-8472-4397-aa55-91af9e207903-utilities\") pod \"certified-operators-zp9dd\" (UID: \"7d3be496-8472-4397-aa55-91af9e207903\") " pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.489052 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdzhc\" (UniqueName: \"kubernetes.io/projected/7d3be496-8472-4397-aa55-91af9e207903-kube-api-access-bdzhc\") pod \"certified-operators-zp9dd\" (UID: \"7d3be496-8472-4397-aa55-91af9e207903\") " pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.514019 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg"] Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.515866 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.522087 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.522174 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.522218 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.522843 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-sdq29" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.524264 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.527625 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg"] Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.571499 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.571581 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: 
\"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.571676 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.571774 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67hx4\" (UniqueName: \"kubernetes.io/projected/b1da0dd0-f14d-4b72-8308-a256f237732f-kube-api-access-67hx4\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.571795 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.571834 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.572126 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.666217 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.674174 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.674410 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.674522 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: 
\"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.674606 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.674717 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67hx4\" (UniqueName: \"kubernetes.io/projected/b1da0dd0-f14d-4b72-8308-a256f237732f-kube-api-access-67hx4\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.674797 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.674904 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 
12:40:56.679403 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.679768 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.680612 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.685550 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.686342 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.687090 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.698421 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67hx4\" (UniqueName: \"kubernetes.io/projected/b1da0dd0-f14d-4b72-8308-a256f237732f-kube-api-access-67hx4\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-clrlg\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:56 crc kubenswrapper[4790]: I0406 12:40:56.870275 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:40:57 crc kubenswrapper[4790]: I0406 12:40:57.221538 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zp9dd"] Apr 06 12:40:57 crc kubenswrapper[4790]: I0406 12:40:57.414031 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zp9dd" event={"ID":"7d3be496-8472-4397-aa55-91af9e207903","Type":"ContainerStarted","Data":"aec0af56dc195663bc662adce2eda553b11fd408b5ab244d90b0b4d385f960a6"} Apr 06 12:40:57 crc kubenswrapper[4790]: I0406 12:40:57.451349 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg"] Apr 06 12:40:57 crc kubenswrapper[4790]: W0406 12:40:57.511084 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1da0dd0_f14d_4b72_8308_a256f237732f.slice/crio-1013f32d1859172659ff0712f23280e473bbbb1437874a872dcc66cb8d818f8c WatchSource:0}: Error finding container 1013f32d1859172659ff0712f23280e473bbbb1437874a872dcc66cb8d818f8c: Status 404 returned error can't find the container with id 1013f32d1859172659ff0712f23280e473bbbb1437874a872dcc66cb8d818f8c Apr 06 12:40:58 crc kubenswrapper[4790]: I0406 12:40:58.427249 4790 generic.go:334] "Generic (PLEG): container finished" podID="7d3be496-8472-4397-aa55-91af9e207903" containerID="f1c9ae65bff9694b61f754d729620b0c14aa5be6f61c5ee7f5687cf0aea6f9a1" exitCode=0 Apr 06 12:40:58 crc kubenswrapper[4790]: I0406 12:40:58.427433 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zp9dd" event={"ID":"7d3be496-8472-4397-aa55-91af9e207903","Type":"ContainerDied","Data":"f1c9ae65bff9694b61f754d729620b0c14aa5be6f61c5ee7f5687cf0aea6f9a1"} Apr 06 12:40:58 crc kubenswrapper[4790]: I0406 12:40:58.433230 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" event={"ID":"b1da0dd0-f14d-4b72-8308-a256f237732f","Type":"ContainerStarted","Data":"14d477ce8204b4186eea243bcf515e1c6b8fb71f7709202f6e79b66065bc05bf"} Apr 06 12:40:58 crc kubenswrapper[4790]: I0406 12:40:58.433287 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" event={"ID":"b1da0dd0-f14d-4b72-8308-a256f237732f","Type":"ContainerStarted","Data":"1013f32d1859172659ff0712f23280e473bbbb1437874a872dcc66cb8d818f8c"} Apr 06 12:40:58 crc kubenswrapper[4790]: I0406 12:40:58.478240 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" podStartSLOduration=2.074756516 podStartE2EDuration="2.478213024s" podCreationTimestamp="2026-04-06 12:40:56 +0000 UTC" firstStartedPulling="2026-04-06 12:40:57.513338096 +0000 UTC m=+2636.501080962" lastFinishedPulling="2026-04-06 12:40:57.916794604 +0000 UTC m=+2636.904537470" observedRunningTime="2026-04-06 12:40:58.463703826 +0000 UTC m=+2637.451446692" watchObservedRunningTime="2026-04-06 12:40:58.478213024 +0000 UTC m=+2637.465955900" Apr 06 12:40:59 crc kubenswrapper[4790]: I0406 12:40:59.444108 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zp9dd" event={"ID":"7d3be496-8472-4397-aa55-91af9e207903","Type":"ContainerStarted","Data":"5ec11a9fcc7ed6c372ce0f12efa09ae2d73b8b87d8df51646c03cf4c12ba7f75"} Apr 06 12:41:00 crc kubenswrapper[4790]: I0406 12:41:00.457854 4790 generic.go:334] "Generic (PLEG): container finished" podID="7d3be496-8472-4397-aa55-91af9e207903" containerID="5ec11a9fcc7ed6c372ce0f12efa09ae2d73b8b87d8df51646c03cf4c12ba7f75" exitCode=0 Apr 06 12:41:00 crc kubenswrapper[4790]: I0406 12:41:00.457964 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zp9dd" 
event={"ID":"7d3be496-8472-4397-aa55-91af9e207903","Type":"ContainerDied","Data":"5ec11a9fcc7ed6c372ce0f12efa09ae2d73b8b87d8df51646c03cf4c12ba7f75"} Apr 06 12:41:01 crc kubenswrapper[4790]: I0406 12:41:01.468300 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zp9dd" event={"ID":"7d3be496-8472-4397-aa55-91af9e207903","Type":"ContainerStarted","Data":"dc31d5af37c110416095234014693b153aba6013fbe724c8acc0cff3cbcba847"} Apr 06 12:41:01 crc kubenswrapper[4790]: I0406 12:41:01.490114 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zp9dd" podStartSLOduration=3.011528891 podStartE2EDuration="5.490096741s" podCreationTimestamp="2026-04-06 12:40:56 +0000 UTC" firstStartedPulling="2026-04-06 12:40:58.432377896 +0000 UTC m=+2637.420120762" lastFinishedPulling="2026-04-06 12:41:00.910945746 +0000 UTC m=+2639.898688612" observedRunningTime="2026-04-06 12:41:01.483541725 +0000 UTC m=+2640.471284581" watchObservedRunningTime="2026-04-06 12:41:01.490096741 +0000 UTC m=+2640.477839607" Apr 06 12:41:06 crc kubenswrapper[4790]: I0406 12:41:06.666997 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:41:06 crc kubenswrapper[4790]: I0406 12:41:06.667489 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:41:06 crc kubenswrapper[4790]: I0406 12:41:06.675480 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:41:06 crc kubenswrapper[4790]: E0406 12:41:06.675715 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:41:06 crc kubenswrapper[4790]: I0406 12:41:06.708101 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:41:07 crc kubenswrapper[4790]: I0406 12:41:07.793746 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:41:07 crc kubenswrapper[4790]: I0406 12:41:07.858242 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zp9dd"] Apr 06 12:41:09 crc kubenswrapper[4790]: I0406 12:41:09.759667 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zp9dd" podUID="7d3be496-8472-4397-aa55-91af9e207903" containerName="registry-server" containerID="cri-o://dc31d5af37c110416095234014693b153aba6013fbe724c8acc0cff3cbcba847" gracePeriod=2 Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.266306 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.292647 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d3be496-8472-4397-aa55-91af9e207903-utilities\") pod \"7d3be496-8472-4397-aa55-91af9e207903\" (UID: \"7d3be496-8472-4397-aa55-91af9e207903\") " Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.292741 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdzhc\" (UniqueName: \"kubernetes.io/projected/7d3be496-8472-4397-aa55-91af9e207903-kube-api-access-bdzhc\") pod \"7d3be496-8472-4397-aa55-91af9e207903\" (UID: \"7d3be496-8472-4397-aa55-91af9e207903\") " Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.292869 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d3be496-8472-4397-aa55-91af9e207903-catalog-content\") pod \"7d3be496-8472-4397-aa55-91af9e207903\" (UID: \"7d3be496-8472-4397-aa55-91af9e207903\") " Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.294786 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d3be496-8472-4397-aa55-91af9e207903-utilities" (OuterVolumeSpecName: "utilities") pod "7d3be496-8472-4397-aa55-91af9e207903" (UID: "7d3be496-8472-4397-aa55-91af9e207903"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.301302 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3be496-8472-4397-aa55-91af9e207903-kube-api-access-bdzhc" (OuterVolumeSpecName: "kube-api-access-bdzhc") pod "7d3be496-8472-4397-aa55-91af9e207903" (UID: "7d3be496-8472-4397-aa55-91af9e207903"). InnerVolumeSpecName "kube-api-access-bdzhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.395469 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d3be496-8472-4397-aa55-91af9e207903-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.395496 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdzhc\" (UniqueName: \"kubernetes.io/projected/7d3be496-8472-4397-aa55-91af9e207903-kube-api-access-bdzhc\") on node \"crc\" DevicePath \"\"" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.671727 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d3be496-8472-4397-aa55-91af9e207903-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d3be496-8472-4397-aa55-91af9e207903" (UID: "7d3be496-8472-4397-aa55-91af9e207903"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.700127 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d3be496-8472-4397-aa55-91af9e207903-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.771050 4790 generic.go:334] "Generic (PLEG): container finished" podID="7d3be496-8472-4397-aa55-91af9e207903" containerID="dc31d5af37c110416095234014693b153aba6013fbe724c8acc0cff3cbcba847" exitCode=0 Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.771102 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zp9dd" event={"ID":"7d3be496-8472-4397-aa55-91af9e207903","Type":"ContainerDied","Data":"dc31d5af37c110416095234014693b153aba6013fbe724c8acc0cff3cbcba847"} Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.771136 4790 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-zp9dd" event={"ID":"7d3be496-8472-4397-aa55-91af9e207903","Type":"ContainerDied","Data":"aec0af56dc195663bc662adce2eda553b11fd408b5ab244d90b0b4d385f960a6"} Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.771157 4790 scope.go:117] "RemoveContainer" containerID="dc31d5af37c110416095234014693b153aba6013fbe724c8acc0cff3cbcba847" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.771517 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zp9dd" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.809663 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zp9dd"] Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.820274 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zp9dd"] Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.827269 4790 scope.go:117] "RemoveContainer" containerID="5ec11a9fcc7ed6c372ce0f12efa09ae2d73b8b87d8df51646c03cf4c12ba7f75" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.857893 4790 scope.go:117] "RemoveContainer" containerID="f1c9ae65bff9694b61f754d729620b0c14aa5be6f61c5ee7f5687cf0aea6f9a1" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.900219 4790 scope.go:117] "RemoveContainer" containerID="dc31d5af37c110416095234014693b153aba6013fbe724c8acc0cff3cbcba847" Apr 06 12:41:10 crc kubenswrapper[4790]: E0406 12:41:10.900776 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc31d5af37c110416095234014693b153aba6013fbe724c8acc0cff3cbcba847\": container with ID starting with dc31d5af37c110416095234014693b153aba6013fbe724c8acc0cff3cbcba847 not found: ID does not exist" containerID="dc31d5af37c110416095234014693b153aba6013fbe724c8acc0cff3cbcba847" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 
12:41:10.900835 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc31d5af37c110416095234014693b153aba6013fbe724c8acc0cff3cbcba847"} err="failed to get container status \"dc31d5af37c110416095234014693b153aba6013fbe724c8acc0cff3cbcba847\": rpc error: code = NotFound desc = could not find container \"dc31d5af37c110416095234014693b153aba6013fbe724c8acc0cff3cbcba847\": container with ID starting with dc31d5af37c110416095234014693b153aba6013fbe724c8acc0cff3cbcba847 not found: ID does not exist" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.900868 4790 scope.go:117] "RemoveContainer" containerID="5ec11a9fcc7ed6c372ce0f12efa09ae2d73b8b87d8df51646c03cf4c12ba7f75" Apr 06 12:41:10 crc kubenswrapper[4790]: E0406 12:41:10.901467 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ec11a9fcc7ed6c372ce0f12efa09ae2d73b8b87d8df51646c03cf4c12ba7f75\": container with ID starting with 5ec11a9fcc7ed6c372ce0f12efa09ae2d73b8b87d8df51646c03cf4c12ba7f75 not found: ID does not exist" containerID="5ec11a9fcc7ed6c372ce0f12efa09ae2d73b8b87d8df51646c03cf4c12ba7f75" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.901505 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ec11a9fcc7ed6c372ce0f12efa09ae2d73b8b87d8df51646c03cf4c12ba7f75"} err="failed to get container status \"5ec11a9fcc7ed6c372ce0f12efa09ae2d73b8b87d8df51646c03cf4c12ba7f75\": rpc error: code = NotFound desc = could not find container \"5ec11a9fcc7ed6c372ce0f12efa09ae2d73b8b87d8df51646c03cf4c12ba7f75\": container with ID starting with 5ec11a9fcc7ed6c372ce0f12efa09ae2d73b8b87d8df51646c03cf4c12ba7f75 not found: ID does not exist" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.901528 4790 scope.go:117] "RemoveContainer" containerID="f1c9ae65bff9694b61f754d729620b0c14aa5be6f61c5ee7f5687cf0aea6f9a1" Apr 06 12:41:10 crc 
kubenswrapper[4790]: E0406 12:41:10.901786 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1c9ae65bff9694b61f754d729620b0c14aa5be6f61c5ee7f5687cf0aea6f9a1\": container with ID starting with f1c9ae65bff9694b61f754d729620b0c14aa5be6f61c5ee7f5687cf0aea6f9a1 not found: ID does not exist" containerID="f1c9ae65bff9694b61f754d729620b0c14aa5be6f61c5ee7f5687cf0aea6f9a1" Apr 06 12:41:10 crc kubenswrapper[4790]: I0406 12:41:10.901850 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c9ae65bff9694b61f754d729620b0c14aa5be6f61c5ee7f5687cf0aea6f9a1"} err="failed to get container status \"f1c9ae65bff9694b61f754d729620b0c14aa5be6f61c5ee7f5687cf0aea6f9a1\": rpc error: code = NotFound desc = could not find container \"f1c9ae65bff9694b61f754d729620b0c14aa5be6f61c5ee7f5687cf0aea6f9a1\": container with ID starting with f1c9ae65bff9694b61f754d729620b0c14aa5be6f61c5ee7f5687cf0aea6f9a1 not found: ID does not exist" Apr 06 12:41:11 crc kubenswrapper[4790]: I0406 12:41:11.687111 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d3be496-8472-4397-aa55-91af9e207903" path="/var/lib/kubelet/pods/7d3be496-8472-4397-aa55-91af9e207903/volumes" Apr 06 12:41:20 crc kubenswrapper[4790]: I0406 12:41:20.676245 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:41:20 crc kubenswrapper[4790]: E0406 12:41:20.677087 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:41:35 crc 
kubenswrapper[4790]: I0406 12:41:35.676070 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:41:35 crc kubenswrapper[4790]: E0406 12:41:35.677032 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:41:49 crc kubenswrapper[4790]: I0406 12:41:49.676252 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:41:49 crc kubenswrapper[4790]: E0406 12:41:49.676958 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.155888 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591322-jn5fr"] Apr 06 12:42:00 crc kubenswrapper[4790]: E0406 12:42:00.157474 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3be496-8472-4397-aa55-91af9e207903" containerName="registry-server" Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.157492 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3be496-8472-4397-aa55-91af9e207903" containerName="registry-server" Apr 06 12:42:00 crc kubenswrapper[4790]: E0406 12:42:00.157522 4790 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="7d3be496-8472-4397-aa55-91af9e207903" containerName="extract-content" Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.157529 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3be496-8472-4397-aa55-91af9e207903" containerName="extract-content" Apr 06 12:42:00 crc kubenswrapper[4790]: E0406 12:42:00.157555 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3be496-8472-4397-aa55-91af9e207903" containerName="extract-utilities" Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.157564 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3be496-8472-4397-aa55-91af9e207903" containerName="extract-utilities" Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.158038 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3be496-8472-4397-aa55-91af9e207903" containerName="registry-server" Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.161505 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591322-jn5fr" Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.167606 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.167727 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.167885 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.170882 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591322-jn5fr"] Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.239319 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wxbk\" (UniqueName: \"kubernetes.io/projected/ceddc141-ac18-43f1-992b-c926d228d935-kube-api-access-5wxbk\") pod \"auto-csr-approver-29591322-jn5fr\" (UID: \"ceddc141-ac18-43f1-992b-c926d228d935\") " pod="openshift-infra/auto-csr-approver-29591322-jn5fr" Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.342156 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wxbk\" (UniqueName: \"kubernetes.io/projected/ceddc141-ac18-43f1-992b-c926d228d935-kube-api-access-5wxbk\") pod \"auto-csr-approver-29591322-jn5fr\" (UID: \"ceddc141-ac18-43f1-992b-c926d228d935\") " pod="openshift-infra/auto-csr-approver-29591322-jn5fr" Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.398147 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wxbk\" (UniqueName: \"kubernetes.io/projected/ceddc141-ac18-43f1-992b-c926d228d935-kube-api-access-5wxbk\") pod \"auto-csr-approver-29591322-jn5fr\" (UID: \"ceddc141-ac18-43f1-992b-c926d228d935\") " 
pod="openshift-infra/auto-csr-approver-29591322-jn5fr" Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.492232 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591322-jn5fr" Apr 06 12:42:00 crc kubenswrapper[4790]: I0406 12:42:00.992450 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591322-jn5fr"] Apr 06 12:42:01 crc kubenswrapper[4790]: I0406 12:42:01.312486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591322-jn5fr" event={"ID":"ceddc141-ac18-43f1-992b-c926d228d935","Type":"ContainerStarted","Data":"7f06c6eaac81d652fc995cfac0c479e817b26f60bf75597e368b99c82928aba4"} Apr 06 12:42:03 crc kubenswrapper[4790]: I0406 12:42:03.330687 4790 generic.go:334] "Generic (PLEG): container finished" podID="ceddc141-ac18-43f1-992b-c926d228d935" containerID="d0ac2cb5794a4855b19caf0710434fbf2f549022d5da222fbe496e7b7350adf5" exitCode=0 Apr 06 12:42:03 crc kubenswrapper[4790]: I0406 12:42:03.331016 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591322-jn5fr" event={"ID":"ceddc141-ac18-43f1-992b-c926d228d935","Type":"ContainerDied","Data":"d0ac2cb5794a4855b19caf0710434fbf2f549022d5da222fbe496e7b7350adf5"} Apr 06 12:42:04 crc kubenswrapper[4790]: I0406 12:42:04.675488 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:42:04 crc kubenswrapper[4790]: E0406 12:42:04.676016 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" 
Apr 06 12:42:04 crc kubenswrapper[4790]: I0406 12:42:04.738236 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591322-jn5fr" Apr 06 12:42:04 crc kubenswrapper[4790]: I0406 12:42:04.842408 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wxbk\" (UniqueName: \"kubernetes.io/projected/ceddc141-ac18-43f1-992b-c926d228d935-kube-api-access-5wxbk\") pod \"ceddc141-ac18-43f1-992b-c926d228d935\" (UID: \"ceddc141-ac18-43f1-992b-c926d228d935\") " Apr 06 12:42:04 crc kubenswrapper[4790]: I0406 12:42:04.849124 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceddc141-ac18-43f1-992b-c926d228d935-kube-api-access-5wxbk" (OuterVolumeSpecName: "kube-api-access-5wxbk") pod "ceddc141-ac18-43f1-992b-c926d228d935" (UID: "ceddc141-ac18-43f1-992b-c926d228d935"). InnerVolumeSpecName "kube-api-access-5wxbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:42:04 crc kubenswrapper[4790]: I0406 12:42:04.945023 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wxbk\" (UniqueName: \"kubernetes.io/projected/ceddc141-ac18-43f1-992b-c926d228d935-kube-api-access-5wxbk\") on node \"crc\" DevicePath \"\"" Apr 06 12:42:05 crc kubenswrapper[4790]: I0406 12:42:05.351337 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591322-jn5fr" event={"ID":"ceddc141-ac18-43f1-992b-c926d228d935","Type":"ContainerDied","Data":"7f06c6eaac81d652fc995cfac0c479e817b26f60bf75597e368b99c82928aba4"} Apr 06 12:42:05 crc kubenswrapper[4790]: I0406 12:42:05.351373 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f06c6eaac81d652fc995cfac0c479e817b26f60bf75597e368b99c82928aba4" Apr 06 12:42:05 crc kubenswrapper[4790]: I0406 12:42:05.351423 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591322-jn5fr" Apr 06 12:42:05 crc kubenswrapper[4790]: I0406 12:42:05.819674 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591316-29nr6"] Apr 06 12:42:05 crc kubenswrapper[4790]: I0406 12:42:05.831797 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591316-29nr6"] Apr 06 12:42:07 crc kubenswrapper[4790]: I0406 12:42:07.687416 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3585fa13-fcd0-4414-8b98-2f34f7716f85" path="/var/lib/kubelet/pods/3585fa13-fcd0-4414-8b98-2f34f7716f85/volumes" Apr 06 12:42:15 crc kubenswrapper[4790]: I0406 12:42:15.889247 4790 scope.go:117] "RemoveContainer" containerID="47e66bc24e2538f5d11cb18b13e5512b059a5cbe4070cf1dac04302c457712df" Apr 06 12:42:16 crc kubenswrapper[4790]: I0406 12:42:16.675776 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:42:16 crc kubenswrapper[4790]: E0406 12:42:16.676054 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:42:21 crc kubenswrapper[4790]: I0406 12:42:21.790028 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qn6lh"] Apr 06 12:42:21 crc kubenswrapper[4790]: E0406 12:42:21.791256 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceddc141-ac18-43f1-992b-c926d228d935" containerName="oc" Apr 06 12:42:21 crc kubenswrapper[4790]: I0406 12:42:21.791272 4790 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ceddc141-ac18-43f1-992b-c926d228d935" containerName="oc" Apr 06 12:42:21 crc kubenswrapper[4790]: I0406 12:42:21.791497 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceddc141-ac18-43f1-992b-c926d228d935" containerName="oc" Apr 06 12:42:21 crc kubenswrapper[4790]: I0406 12:42:21.793394 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:21 crc kubenswrapper[4790]: I0406 12:42:21.834978 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qn6lh"] Apr 06 12:42:21 crc kubenswrapper[4790]: I0406 12:42:21.897934 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e1a238-34db-4178-b2dd-6d6992b24071-catalog-content\") pod \"redhat-operators-qn6lh\" (UID: \"a0e1a238-34db-4178-b2dd-6d6992b24071\") " pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:21 crc kubenswrapper[4790]: I0406 12:42:21.898193 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6mz8\" (UniqueName: \"kubernetes.io/projected/a0e1a238-34db-4178-b2dd-6d6992b24071-kube-api-access-p6mz8\") pod \"redhat-operators-qn6lh\" (UID: \"a0e1a238-34db-4178-b2dd-6d6992b24071\") " pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:21 crc kubenswrapper[4790]: I0406 12:42:21.898541 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e1a238-34db-4178-b2dd-6d6992b24071-utilities\") pod \"redhat-operators-qn6lh\" (UID: \"a0e1a238-34db-4178-b2dd-6d6992b24071\") " pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:22 crc kubenswrapper[4790]: I0406 12:42:22.000879 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e1a238-34db-4178-b2dd-6d6992b24071-utilities\") pod \"redhat-operators-qn6lh\" (UID: \"a0e1a238-34db-4178-b2dd-6d6992b24071\") " pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:22 crc kubenswrapper[4790]: I0406 12:42:22.000987 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e1a238-34db-4178-b2dd-6d6992b24071-catalog-content\") pod \"redhat-operators-qn6lh\" (UID: \"a0e1a238-34db-4178-b2dd-6d6992b24071\") " pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:22 crc kubenswrapper[4790]: I0406 12:42:22.001066 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6mz8\" (UniqueName: \"kubernetes.io/projected/a0e1a238-34db-4178-b2dd-6d6992b24071-kube-api-access-p6mz8\") pod \"redhat-operators-qn6lh\" (UID: \"a0e1a238-34db-4178-b2dd-6d6992b24071\") " pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:22 crc kubenswrapper[4790]: I0406 12:42:22.001652 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e1a238-34db-4178-b2dd-6d6992b24071-catalog-content\") pod \"redhat-operators-qn6lh\" (UID: \"a0e1a238-34db-4178-b2dd-6d6992b24071\") " pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:22 crc kubenswrapper[4790]: I0406 12:42:22.001701 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e1a238-34db-4178-b2dd-6d6992b24071-utilities\") pod \"redhat-operators-qn6lh\" (UID: \"a0e1a238-34db-4178-b2dd-6d6992b24071\") " pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:22 crc kubenswrapper[4790]: I0406 12:42:22.021050 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6mz8\" (UniqueName: 
\"kubernetes.io/projected/a0e1a238-34db-4178-b2dd-6d6992b24071-kube-api-access-p6mz8\") pod \"redhat-operators-qn6lh\" (UID: \"a0e1a238-34db-4178-b2dd-6d6992b24071\") " pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:22 crc kubenswrapper[4790]: I0406 12:42:22.117995 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:22 crc kubenswrapper[4790]: I0406 12:42:22.662688 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qn6lh"] Apr 06 12:42:23 crc kubenswrapper[4790]: I0406 12:42:23.518907 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0e1a238-34db-4178-b2dd-6d6992b24071" containerID="532adc74096758f7b66a6c02002f344bf9b376ed0452de22b390eaf4ca44f358" exitCode=0 Apr 06 12:42:23 crc kubenswrapper[4790]: I0406 12:42:23.519089 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn6lh" event={"ID":"a0e1a238-34db-4178-b2dd-6d6992b24071","Type":"ContainerDied","Data":"532adc74096758f7b66a6c02002f344bf9b376ed0452de22b390eaf4ca44f358"} Apr 06 12:42:23 crc kubenswrapper[4790]: I0406 12:42:23.519189 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn6lh" event={"ID":"a0e1a238-34db-4178-b2dd-6d6992b24071","Type":"ContainerStarted","Data":"96cc14ccfa74d68c5e43c07490cd9544ae3c442b32194be00c0615dc057f4b48"} Apr 06 12:42:25 crc kubenswrapper[4790]: I0406 12:42:25.540891 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0e1a238-34db-4178-b2dd-6d6992b24071" containerID="18554885c489f96013a32d6f2907557f51a590b70e91ef692723aab637a4494e" exitCode=0 Apr 06 12:42:25 crc kubenswrapper[4790]: I0406 12:42:25.541071 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn6lh" 
event={"ID":"a0e1a238-34db-4178-b2dd-6d6992b24071","Type":"ContainerDied","Data":"18554885c489f96013a32d6f2907557f51a590b70e91ef692723aab637a4494e"} Apr 06 12:42:26 crc kubenswrapper[4790]: I0406 12:42:26.566342 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn6lh" event={"ID":"a0e1a238-34db-4178-b2dd-6d6992b24071","Type":"ContainerStarted","Data":"2766e67a063015dee7194525304954e5dd87b5806cc691f285264a15d754c3e4"} Apr 06 12:42:26 crc kubenswrapper[4790]: I0406 12:42:26.594927 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qn6lh" podStartSLOduration=3.177105071 podStartE2EDuration="5.594907918s" podCreationTimestamp="2026-04-06 12:42:21 +0000 UTC" firstStartedPulling="2026-04-06 12:42:23.520655288 +0000 UTC m=+2722.508398154" lastFinishedPulling="2026-04-06 12:42:25.938458115 +0000 UTC m=+2724.926201001" observedRunningTime="2026-04-06 12:42:26.592288268 +0000 UTC m=+2725.580031134" watchObservedRunningTime="2026-04-06 12:42:26.594907918 +0000 UTC m=+2725.582650784" Apr 06 12:42:27 crc kubenswrapper[4790]: I0406 12:42:27.675625 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:42:27 crc kubenswrapper[4790]: E0406 12:42:27.676156 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:42:32 crc kubenswrapper[4790]: I0406 12:42:32.118407 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:32 crc kubenswrapper[4790]: 
I0406 12:42:32.119123 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:32 crc kubenswrapper[4790]: I0406 12:42:32.167648 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:32 crc kubenswrapper[4790]: I0406 12:42:32.664932 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:33 crc kubenswrapper[4790]: I0406 12:42:33.963053 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qn6lh"] Apr 06 12:42:34 crc kubenswrapper[4790]: I0406 12:42:34.634232 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qn6lh" podUID="a0e1a238-34db-4178-b2dd-6d6992b24071" containerName="registry-server" containerID="cri-o://2766e67a063015dee7194525304954e5dd87b5806cc691f285264a15d754c3e4" gracePeriod=2 Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.136517 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.267155 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6mz8\" (UniqueName: \"kubernetes.io/projected/a0e1a238-34db-4178-b2dd-6d6992b24071-kube-api-access-p6mz8\") pod \"a0e1a238-34db-4178-b2dd-6d6992b24071\" (UID: \"a0e1a238-34db-4178-b2dd-6d6992b24071\") " Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.267255 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e1a238-34db-4178-b2dd-6d6992b24071-utilities\") pod \"a0e1a238-34db-4178-b2dd-6d6992b24071\" (UID: \"a0e1a238-34db-4178-b2dd-6d6992b24071\") " Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.267282 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e1a238-34db-4178-b2dd-6d6992b24071-catalog-content\") pod \"a0e1a238-34db-4178-b2dd-6d6992b24071\" (UID: \"a0e1a238-34db-4178-b2dd-6d6992b24071\") " Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.268148 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0e1a238-34db-4178-b2dd-6d6992b24071-utilities" (OuterVolumeSpecName: "utilities") pod "a0e1a238-34db-4178-b2dd-6d6992b24071" (UID: "a0e1a238-34db-4178-b2dd-6d6992b24071"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.272808 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e1a238-34db-4178-b2dd-6d6992b24071-kube-api-access-p6mz8" (OuterVolumeSpecName: "kube-api-access-p6mz8") pod "a0e1a238-34db-4178-b2dd-6d6992b24071" (UID: "a0e1a238-34db-4178-b2dd-6d6992b24071"). InnerVolumeSpecName "kube-api-access-p6mz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.369283 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e1a238-34db-4178-b2dd-6d6992b24071-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.369314 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6mz8\" (UniqueName: \"kubernetes.io/projected/a0e1a238-34db-4178-b2dd-6d6992b24071-kube-api-access-p6mz8\") on node \"crc\" DevicePath \"\"" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.397511 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0e1a238-34db-4178-b2dd-6d6992b24071-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0e1a238-34db-4178-b2dd-6d6992b24071" (UID: "a0e1a238-34db-4178-b2dd-6d6992b24071"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.471318 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e1a238-34db-4178-b2dd-6d6992b24071-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.646268 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0e1a238-34db-4178-b2dd-6d6992b24071" containerID="2766e67a063015dee7194525304954e5dd87b5806cc691f285264a15d754c3e4" exitCode=0 Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.646305 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn6lh" event={"ID":"a0e1a238-34db-4178-b2dd-6d6992b24071","Type":"ContainerDied","Data":"2766e67a063015dee7194525304954e5dd87b5806cc691f285264a15d754c3e4"} Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.646333 4790 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-qn6lh" event={"ID":"a0e1a238-34db-4178-b2dd-6d6992b24071","Type":"ContainerDied","Data":"96cc14ccfa74d68c5e43c07490cd9544ae3c442b32194be00c0615dc057f4b48"} Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.646333 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qn6lh" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.646348 4790 scope.go:117] "RemoveContainer" containerID="2766e67a063015dee7194525304954e5dd87b5806cc691f285264a15d754c3e4" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.678348 4790 scope.go:117] "RemoveContainer" containerID="18554885c489f96013a32d6f2907557f51a590b70e91ef692723aab637a4494e" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.696656 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qn6lh"] Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.702674 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qn6lh"] Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.712420 4790 scope.go:117] "RemoveContainer" containerID="532adc74096758f7b66a6c02002f344bf9b376ed0452de22b390eaf4ca44f358" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.760909 4790 scope.go:117] "RemoveContainer" containerID="2766e67a063015dee7194525304954e5dd87b5806cc691f285264a15d754c3e4" Apr 06 12:42:35 crc kubenswrapper[4790]: E0406 12:42:35.761872 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2766e67a063015dee7194525304954e5dd87b5806cc691f285264a15d754c3e4\": container with ID starting with 2766e67a063015dee7194525304954e5dd87b5806cc691f285264a15d754c3e4 not found: ID does not exist" containerID="2766e67a063015dee7194525304954e5dd87b5806cc691f285264a15d754c3e4" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.761900 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2766e67a063015dee7194525304954e5dd87b5806cc691f285264a15d754c3e4"} err="failed to get container status \"2766e67a063015dee7194525304954e5dd87b5806cc691f285264a15d754c3e4\": rpc error: code = NotFound desc = could not find container \"2766e67a063015dee7194525304954e5dd87b5806cc691f285264a15d754c3e4\": container with ID starting with 2766e67a063015dee7194525304954e5dd87b5806cc691f285264a15d754c3e4 not found: ID does not exist" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.761921 4790 scope.go:117] "RemoveContainer" containerID="18554885c489f96013a32d6f2907557f51a590b70e91ef692723aab637a4494e" Apr 06 12:42:35 crc kubenswrapper[4790]: E0406 12:42:35.762373 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18554885c489f96013a32d6f2907557f51a590b70e91ef692723aab637a4494e\": container with ID starting with 18554885c489f96013a32d6f2907557f51a590b70e91ef692723aab637a4494e not found: ID does not exist" containerID="18554885c489f96013a32d6f2907557f51a590b70e91ef692723aab637a4494e" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.762401 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18554885c489f96013a32d6f2907557f51a590b70e91ef692723aab637a4494e"} err="failed to get container status \"18554885c489f96013a32d6f2907557f51a590b70e91ef692723aab637a4494e\": rpc error: code = NotFound desc = could not find container \"18554885c489f96013a32d6f2907557f51a590b70e91ef692723aab637a4494e\": container with ID starting with 18554885c489f96013a32d6f2907557f51a590b70e91ef692723aab637a4494e not found: ID does not exist" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.762424 4790 scope.go:117] "RemoveContainer" containerID="532adc74096758f7b66a6c02002f344bf9b376ed0452de22b390eaf4ca44f358" Apr 06 12:42:35 crc kubenswrapper[4790]: E0406 
12:42:35.763163 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"532adc74096758f7b66a6c02002f344bf9b376ed0452de22b390eaf4ca44f358\": container with ID starting with 532adc74096758f7b66a6c02002f344bf9b376ed0452de22b390eaf4ca44f358 not found: ID does not exist" containerID="532adc74096758f7b66a6c02002f344bf9b376ed0452de22b390eaf4ca44f358" Apr 06 12:42:35 crc kubenswrapper[4790]: I0406 12:42:35.763192 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532adc74096758f7b66a6c02002f344bf9b376ed0452de22b390eaf4ca44f358"} err="failed to get container status \"532adc74096758f7b66a6c02002f344bf9b376ed0452de22b390eaf4ca44f358\": rpc error: code = NotFound desc = could not find container \"532adc74096758f7b66a6c02002f344bf9b376ed0452de22b390eaf4ca44f358\": container with ID starting with 532adc74096758f7b66a6c02002f344bf9b376ed0452de22b390eaf4ca44f358 not found: ID does not exist" Apr 06 12:42:37 crc kubenswrapper[4790]: I0406 12:42:37.706331 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e1a238-34db-4178-b2dd-6d6992b24071" path="/var/lib/kubelet/pods/a0e1a238-34db-4178-b2dd-6d6992b24071/volumes" Apr 06 12:42:39 crc kubenswrapper[4790]: I0406 12:42:39.675912 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:42:39 crc kubenswrapper[4790]: E0406 12:42:39.676576 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.226316 
4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jnph5"] Apr 06 12:42:49 crc kubenswrapper[4790]: E0406 12:42:49.227440 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e1a238-34db-4178-b2dd-6d6992b24071" containerName="extract-content" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.227455 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e1a238-34db-4178-b2dd-6d6992b24071" containerName="extract-content" Apr 06 12:42:49 crc kubenswrapper[4790]: E0406 12:42:49.227468 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e1a238-34db-4178-b2dd-6d6992b24071" containerName="registry-server" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.227474 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e1a238-34db-4178-b2dd-6d6992b24071" containerName="registry-server" Apr 06 12:42:49 crc kubenswrapper[4790]: E0406 12:42:49.227500 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e1a238-34db-4178-b2dd-6d6992b24071" containerName="extract-utilities" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.227507 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e1a238-34db-4178-b2dd-6d6992b24071" containerName="extract-utilities" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.227701 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e1a238-34db-4178-b2dd-6d6992b24071" containerName="registry-server" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.229223 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.240604 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnph5"] Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.282235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m27dl\" (UniqueName: \"kubernetes.io/projected/eb48a358-b502-4d8c-8a4d-415bfe45fbce-kube-api-access-m27dl\") pod \"redhat-marketplace-jnph5\" (UID: \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\") " pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.282320 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb48a358-b502-4d8c-8a4d-415bfe45fbce-catalog-content\") pod \"redhat-marketplace-jnph5\" (UID: \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\") " pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.282402 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb48a358-b502-4d8c-8a4d-415bfe45fbce-utilities\") pod \"redhat-marketplace-jnph5\" (UID: \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\") " pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.384933 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m27dl\" (UniqueName: \"kubernetes.io/projected/eb48a358-b502-4d8c-8a4d-415bfe45fbce-kube-api-access-m27dl\") pod \"redhat-marketplace-jnph5\" (UID: \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\") " pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.385005 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb48a358-b502-4d8c-8a4d-415bfe45fbce-catalog-content\") pod \"redhat-marketplace-jnph5\" (UID: \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\") " pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.385065 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb48a358-b502-4d8c-8a4d-415bfe45fbce-utilities\") pod \"redhat-marketplace-jnph5\" (UID: \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\") " pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.385546 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb48a358-b502-4d8c-8a4d-415bfe45fbce-utilities\") pod \"redhat-marketplace-jnph5\" (UID: \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\") " pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.386507 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb48a358-b502-4d8c-8a4d-415bfe45fbce-catalog-content\") pod \"redhat-marketplace-jnph5\" (UID: \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\") " pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.404528 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m27dl\" (UniqueName: \"kubernetes.io/projected/eb48a358-b502-4d8c-8a4d-415bfe45fbce-kube-api-access-m27dl\") pod \"redhat-marketplace-jnph5\" (UID: \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\") " pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:42:49 crc kubenswrapper[4790]: I0406 12:42:49.596196 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:42:50 crc kubenswrapper[4790]: I0406 12:42:50.117901 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnph5"] Apr 06 12:42:50 crc kubenswrapper[4790]: I0406 12:42:50.792867 4790 generic.go:334] "Generic (PLEG): container finished" podID="eb48a358-b502-4d8c-8a4d-415bfe45fbce" containerID="2ff4912c2b9e4316c80bb917ec5a357fe45cbc6defb56f8990540066a3d1bf97" exitCode=0 Apr 06 12:42:50 crc kubenswrapper[4790]: I0406 12:42:50.792916 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnph5" event={"ID":"eb48a358-b502-4d8c-8a4d-415bfe45fbce","Type":"ContainerDied","Data":"2ff4912c2b9e4316c80bb917ec5a357fe45cbc6defb56f8990540066a3d1bf97"} Apr 06 12:42:50 crc kubenswrapper[4790]: I0406 12:42:50.792941 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnph5" event={"ID":"eb48a358-b502-4d8c-8a4d-415bfe45fbce","Type":"ContainerStarted","Data":"f392f04354bcad2da1eb366faf06e72a734fb0f294dee94e53bf75ae584fd68a"} Apr 06 12:42:51 crc kubenswrapper[4790]: I0406 12:42:51.804399 4790 generic.go:334] "Generic (PLEG): container finished" podID="eb48a358-b502-4d8c-8a4d-415bfe45fbce" containerID="2c1b56cc96647b2f040b89bce9e0ab110f95b94f3af3f667534bdc36e0092f5e" exitCode=0 Apr 06 12:42:51 crc kubenswrapper[4790]: I0406 12:42:51.804504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnph5" event={"ID":"eb48a358-b502-4d8c-8a4d-415bfe45fbce","Type":"ContainerDied","Data":"2c1b56cc96647b2f040b89bce9e0ab110f95b94f3af3f667534bdc36e0092f5e"} Apr 06 12:42:52 crc kubenswrapper[4790]: I0406 12:42:52.826817 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnph5" 
event={"ID":"eb48a358-b502-4d8c-8a4d-415bfe45fbce","Type":"ContainerStarted","Data":"43d37f30acdb4124b2751fb24c0d6366e49fc53abcc372f26cab6ff2bf0eed1e"} Apr 06 12:42:52 crc kubenswrapper[4790]: I0406 12:42:52.831264 4790 generic.go:334] "Generic (PLEG): container finished" podID="b1da0dd0-f14d-4b72-8308-a256f237732f" containerID="14d477ce8204b4186eea243bcf515e1c6b8fb71f7709202f6e79b66065bc05bf" exitCode=0 Apr 06 12:42:52 crc kubenswrapper[4790]: I0406 12:42:52.831300 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" event={"ID":"b1da0dd0-f14d-4b72-8308-a256f237732f","Type":"ContainerDied","Data":"14d477ce8204b4186eea243bcf515e1c6b8fb71f7709202f6e79b66065bc05bf"} Apr 06 12:42:52 crc kubenswrapper[4790]: I0406 12:42:52.852557 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jnph5" podStartSLOduration=2.482669422 podStartE2EDuration="3.852531942s" podCreationTimestamp="2026-04-06 12:42:49 +0000 UTC" firstStartedPulling="2026-04-06 12:42:50.794738436 +0000 UTC m=+2749.782481302" lastFinishedPulling="2026-04-06 12:42:52.164600916 +0000 UTC m=+2751.152343822" observedRunningTime="2026-04-06 12:42:52.844803915 +0000 UTC m=+2751.832546791" watchObservedRunningTime="2026-04-06 12:42:52.852531942 +0000 UTC m=+2751.840274808" Apr 06 12:42:53 crc kubenswrapper[4790]: I0406 12:42:53.677846 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.322975 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.381527 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-inventory\") pod \"b1da0dd0-f14d-4b72-8308-a256f237732f\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.381597 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-2\") pod \"b1da0dd0-f14d-4b72-8308-a256f237732f\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.381689 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67hx4\" (UniqueName: \"kubernetes.io/projected/b1da0dd0-f14d-4b72-8308-a256f237732f-kube-api-access-67hx4\") pod \"b1da0dd0-f14d-4b72-8308-a256f237732f\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.381708 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-0\") pod \"b1da0dd0-f14d-4b72-8308-a256f237732f\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.381738 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ssh-key-openstack-edpm-ipam\") pod \"b1da0dd0-f14d-4b72-8308-a256f237732f\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 
12:42:54.381796 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-1\") pod \"b1da0dd0-f14d-4b72-8308-a256f237732f\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.381819 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-telemetry-combined-ca-bundle\") pod \"b1da0dd0-f14d-4b72-8308-a256f237732f\" (UID: \"b1da0dd0-f14d-4b72-8308-a256f237732f\") " Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.389352 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1da0dd0-f14d-4b72-8308-a256f237732f-kube-api-access-67hx4" (OuterVolumeSpecName: "kube-api-access-67hx4") pod "b1da0dd0-f14d-4b72-8308-a256f237732f" (UID: "b1da0dd0-f14d-4b72-8308-a256f237732f"). InnerVolumeSpecName "kube-api-access-67hx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.401701 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b1da0dd0-f14d-4b72-8308-a256f237732f" (UID: "b1da0dd0-f14d-4b72-8308-a256f237732f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.419808 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "b1da0dd0-f14d-4b72-8308-a256f237732f" (UID: "b1da0dd0-f14d-4b72-8308-a256f237732f"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.421135 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "b1da0dd0-f14d-4b72-8308-a256f237732f" (UID: "b1da0dd0-f14d-4b72-8308-a256f237732f"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.422594 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-inventory" (OuterVolumeSpecName: "inventory") pod "b1da0dd0-f14d-4b72-8308-a256f237732f" (UID: "b1da0dd0-f14d-4b72-8308-a256f237732f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.441303 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "b1da0dd0-f14d-4b72-8308-a256f237732f" (UID: "b1da0dd0-f14d-4b72-8308-a256f237732f"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.442401 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b1da0dd0-f14d-4b72-8308-a256f237732f" (UID: "b1da0dd0-f14d-4b72-8308-a256f237732f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.484558 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.484613 4790 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.484629 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-inventory\") on node \"crc\" DevicePath \"\"" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.484640 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.484650 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67hx4\" (UniqueName: \"kubernetes.io/projected/b1da0dd0-f14d-4b72-8308-a256f237732f-kube-api-access-67hx4\") on node \"crc\" DevicePath \"\"" Apr 06 12:42:54 crc 
kubenswrapper[4790]: I0406 12:42:54.484660 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.484671 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1da0dd0-f14d-4b72-8308-a256f237732f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.851693 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.851672 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-clrlg" event={"ID":"b1da0dd0-f14d-4b72-8308-a256f237732f","Type":"ContainerDied","Data":"1013f32d1859172659ff0712f23280e473bbbb1437874a872dcc66cb8d818f8c"} Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.851776 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1013f32d1859172659ff0712f23280e473bbbb1437874a872dcc66cb8d818f8c" Apr 06 12:42:54 crc kubenswrapper[4790]: I0406 12:42:54.854326 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"a02feda4773bfe4b216d2a90bd93f174a216aa7afd6f0bbb85f52142698b47a8"} Apr 06 12:42:59 crc kubenswrapper[4790]: I0406 12:42:59.596414 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:42:59 crc kubenswrapper[4790]: I0406 12:42:59.597219 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:42:59 crc kubenswrapper[4790]: I0406 12:42:59.644265 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:42:59 crc kubenswrapper[4790]: I0406 12:42:59.964428 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:43:00 crc kubenswrapper[4790]: I0406 12:43:00.028780 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnph5"] Apr 06 12:43:01 crc kubenswrapper[4790]: I0406 12:43:01.925864 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jnph5" podUID="eb48a358-b502-4d8c-8a4d-415bfe45fbce" containerName="registry-server" containerID="cri-o://43d37f30acdb4124b2751fb24c0d6366e49fc53abcc372f26cab6ff2bf0eed1e" gracePeriod=2 Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.389686 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.480970 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m27dl\" (UniqueName: \"kubernetes.io/projected/eb48a358-b502-4d8c-8a4d-415bfe45fbce-kube-api-access-m27dl\") pod \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\" (UID: \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\") " Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.481144 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb48a358-b502-4d8c-8a4d-415bfe45fbce-utilities\") pod \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\" (UID: \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\") " Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.481257 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb48a358-b502-4d8c-8a4d-415bfe45fbce-catalog-content\") pod \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\" (UID: \"eb48a358-b502-4d8c-8a4d-415bfe45fbce\") " Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.482075 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb48a358-b502-4d8c-8a4d-415bfe45fbce-utilities" (OuterVolumeSpecName: "utilities") pod "eb48a358-b502-4d8c-8a4d-415bfe45fbce" (UID: "eb48a358-b502-4d8c-8a4d-415bfe45fbce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.486570 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb48a358-b502-4d8c-8a4d-415bfe45fbce-kube-api-access-m27dl" (OuterVolumeSpecName: "kube-api-access-m27dl") pod "eb48a358-b502-4d8c-8a4d-415bfe45fbce" (UID: "eb48a358-b502-4d8c-8a4d-415bfe45fbce"). InnerVolumeSpecName "kube-api-access-m27dl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.509214 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb48a358-b502-4d8c-8a4d-415bfe45fbce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb48a358-b502-4d8c-8a4d-415bfe45fbce" (UID: "eb48a358-b502-4d8c-8a4d-415bfe45fbce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.583714 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb48a358-b502-4d8c-8a4d-415bfe45fbce-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.583743 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb48a358-b502-4d8c-8a4d-415bfe45fbce-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.583755 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m27dl\" (UniqueName: \"kubernetes.io/projected/eb48a358-b502-4d8c-8a4d-415bfe45fbce-kube-api-access-m27dl\") on node \"crc\" DevicePath \"\"" Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.939406 4790 generic.go:334] "Generic (PLEG): container finished" podID="eb48a358-b502-4d8c-8a4d-415bfe45fbce" containerID="43d37f30acdb4124b2751fb24c0d6366e49fc53abcc372f26cab6ff2bf0eed1e" exitCode=0 Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.939449 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnph5" event={"ID":"eb48a358-b502-4d8c-8a4d-415bfe45fbce","Type":"ContainerDied","Data":"43d37f30acdb4124b2751fb24c0d6366e49fc53abcc372f26cab6ff2bf0eed1e"} Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.939478 4790 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnph5" Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.939504 4790 scope.go:117] "RemoveContainer" containerID="43d37f30acdb4124b2751fb24c0d6366e49fc53abcc372f26cab6ff2bf0eed1e" Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.939492 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnph5" event={"ID":"eb48a358-b502-4d8c-8a4d-415bfe45fbce","Type":"ContainerDied","Data":"f392f04354bcad2da1eb366faf06e72a734fb0f294dee94e53bf75ae584fd68a"} Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.962189 4790 scope.go:117] "RemoveContainer" containerID="2c1b56cc96647b2f040b89bce9e0ab110f95b94f3af3f667534bdc36e0092f5e" Apr 06 12:43:02 crc kubenswrapper[4790]: I0406 12:43:02.988000 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnph5"] Apr 06 12:43:03 crc kubenswrapper[4790]: I0406 12:43:03.000664 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnph5"] Apr 06 12:43:03 crc kubenswrapper[4790]: I0406 12:43:03.013414 4790 scope.go:117] "RemoveContainer" containerID="2ff4912c2b9e4316c80bb917ec5a357fe45cbc6defb56f8990540066a3d1bf97" Apr 06 12:43:03 crc kubenswrapper[4790]: I0406 12:43:03.041222 4790 scope.go:117] "RemoveContainer" containerID="43d37f30acdb4124b2751fb24c0d6366e49fc53abcc372f26cab6ff2bf0eed1e" Apr 06 12:43:03 crc kubenswrapper[4790]: E0406 12:43:03.041787 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d37f30acdb4124b2751fb24c0d6366e49fc53abcc372f26cab6ff2bf0eed1e\": container with ID starting with 43d37f30acdb4124b2751fb24c0d6366e49fc53abcc372f26cab6ff2bf0eed1e not found: ID does not exist" containerID="43d37f30acdb4124b2751fb24c0d6366e49fc53abcc372f26cab6ff2bf0eed1e" Apr 06 12:43:03 crc kubenswrapper[4790]: I0406 12:43:03.041819 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d37f30acdb4124b2751fb24c0d6366e49fc53abcc372f26cab6ff2bf0eed1e"} err="failed to get container status \"43d37f30acdb4124b2751fb24c0d6366e49fc53abcc372f26cab6ff2bf0eed1e\": rpc error: code = NotFound desc = could not find container \"43d37f30acdb4124b2751fb24c0d6366e49fc53abcc372f26cab6ff2bf0eed1e\": container with ID starting with 43d37f30acdb4124b2751fb24c0d6366e49fc53abcc372f26cab6ff2bf0eed1e not found: ID does not exist" Apr 06 12:43:03 crc kubenswrapper[4790]: I0406 12:43:03.041856 4790 scope.go:117] "RemoveContainer" containerID="2c1b56cc96647b2f040b89bce9e0ab110f95b94f3af3f667534bdc36e0092f5e" Apr 06 12:43:03 crc kubenswrapper[4790]: E0406 12:43:03.042173 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c1b56cc96647b2f040b89bce9e0ab110f95b94f3af3f667534bdc36e0092f5e\": container with ID starting with 2c1b56cc96647b2f040b89bce9e0ab110f95b94f3af3f667534bdc36e0092f5e not found: ID does not exist" containerID="2c1b56cc96647b2f040b89bce9e0ab110f95b94f3af3f667534bdc36e0092f5e" Apr 06 12:43:03 crc kubenswrapper[4790]: I0406 12:43:03.042219 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c1b56cc96647b2f040b89bce9e0ab110f95b94f3af3f667534bdc36e0092f5e"} err="failed to get container status \"2c1b56cc96647b2f040b89bce9e0ab110f95b94f3af3f667534bdc36e0092f5e\": rpc error: code = NotFound desc = could not find container \"2c1b56cc96647b2f040b89bce9e0ab110f95b94f3af3f667534bdc36e0092f5e\": container with ID starting with 2c1b56cc96647b2f040b89bce9e0ab110f95b94f3af3f667534bdc36e0092f5e not found: ID does not exist" Apr 06 12:43:03 crc kubenswrapper[4790]: I0406 12:43:03.042252 4790 scope.go:117] "RemoveContainer" containerID="2ff4912c2b9e4316c80bb917ec5a357fe45cbc6defb56f8990540066a3d1bf97" Apr 06 12:43:03 crc kubenswrapper[4790]: E0406 
12:43:03.042712 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ff4912c2b9e4316c80bb917ec5a357fe45cbc6defb56f8990540066a3d1bf97\": container with ID starting with 2ff4912c2b9e4316c80bb917ec5a357fe45cbc6defb56f8990540066a3d1bf97 not found: ID does not exist" containerID="2ff4912c2b9e4316c80bb917ec5a357fe45cbc6defb56f8990540066a3d1bf97" Apr 06 12:43:03 crc kubenswrapper[4790]: I0406 12:43:03.042738 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff4912c2b9e4316c80bb917ec5a357fe45cbc6defb56f8990540066a3d1bf97"} err="failed to get container status \"2ff4912c2b9e4316c80bb917ec5a357fe45cbc6defb56f8990540066a3d1bf97\": rpc error: code = NotFound desc = could not find container \"2ff4912c2b9e4316c80bb917ec5a357fe45cbc6defb56f8990540066a3d1bf97\": container with ID starting with 2ff4912c2b9e4316c80bb917ec5a357fe45cbc6defb56f8990540066a3d1bf97 not found: ID does not exist" Apr 06 12:43:03 crc kubenswrapper[4790]: I0406 12:43:03.690899 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb48a358-b502-4d8c-8a4d-415bfe45fbce" path="/var/lib/kubelet/pods/eb48a358-b502-4d8c-8a4d-415bfe45fbce/volumes" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.537221 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Apr 06 12:43:27 crc kubenswrapper[4790]: E0406 12:43:27.538392 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb48a358-b502-4d8c-8a4d-415bfe45fbce" containerName="extract-utilities" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.538411 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb48a358-b502-4d8c-8a4d-415bfe45fbce" containerName="extract-utilities" Apr 06 12:43:27 crc kubenswrapper[4790]: E0406 12:43:27.538436 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1da0dd0-f14d-4b72-8308-a256f237732f" 
containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.538446 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1da0dd0-f14d-4b72-8308-a256f237732f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Apr 06 12:43:27 crc kubenswrapper[4790]: E0406 12:43:27.538465 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb48a358-b502-4d8c-8a4d-415bfe45fbce" containerName="extract-content" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.538473 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb48a358-b502-4d8c-8a4d-415bfe45fbce" containerName="extract-content" Apr 06 12:43:27 crc kubenswrapper[4790]: E0406 12:43:27.538486 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb48a358-b502-4d8c-8a4d-415bfe45fbce" containerName="registry-server" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.538495 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb48a358-b502-4d8c-8a4d-415bfe45fbce" containerName="registry-server" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.538748 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1da0dd0-f14d-4b72-8308-a256f237732f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.538771 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb48a358-b502-4d8c-8a4d-415bfe45fbce" containerName="registry-server" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.540173 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.543814 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.557283 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.629917 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd529dba-04e1-45bf-9a0a-69fd93502cd9-scripts\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.630004 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd529dba-04e1-45bf-9a0a-69fd93502cd9-config-data\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.630039 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.630097 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-etc-nvme\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.630126 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.630252 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.630307 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-run\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.630378 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.630408 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.630466 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-sys\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.630502 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkpsx\" (UniqueName: \"kubernetes.io/projected/cd529dba-04e1-45bf-9a0a-69fd93502cd9-kube-api-access-vkpsx\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.630570 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd529dba-04e1-45bf-9a0a-69fd93502cd9-config-data-custom\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.630644 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-lib-modules\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.630679 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd529dba-04e1-45bf-9a0a-69fd93502cd9-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.630731 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-dev\") pod \"cinder-backup-0\" 
(UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.642310 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.646966 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.658666 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.696377 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.700156 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.704880 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.708041 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.733413 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.733821 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.733971 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.734077 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-run\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.734213 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-var-lib-cinder\") pod \"cinder-backup-0\" 
(UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.734417 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f77070f6-164c-4bec-aafa-6126ca005702-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.734553 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.735174 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.735311 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-run\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.735423 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-sys\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 
12:43:27.735639 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkpsx\" (UniqueName: \"kubernetes.io/projected/cd529dba-04e1-45bf-9a0a-69fd93502cd9-kube-api-access-vkpsx\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.735776 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f77070f6-164c-4bec-aafa-6126ca005702-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.735976 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.736098 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd529dba-04e1-45bf-9a0a-69fd93502cd9-config-data-custom\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.736929 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-run\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.736945 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.736999 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-sys\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.737085 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.737157 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.740734 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjldn\" (UniqueName: \"kubernetes.io/projected/f77070f6-164c-4bec-aafa-6126ca005702-kube-api-access-jjldn\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.740982 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f77070f6-164c-4bec-aafa-6126ca005702-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " 
pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.741174 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-lib-modules\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.741337 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd529dba-04e1-45bf-9a0a-69fd93502cd9-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.741586 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-dev\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.750355 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-dev\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.750593 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.750754 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f77070f6-164c-4bec-aafa-6126ca005702-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.752019 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd529dba-04e1-45bf-9a0a-69fd93502cd9-scripts\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.752156 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd529dba-04e1-45bf-9a0a-69fd93502cd9-config-data\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.752257 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-sys\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.752391 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.752509 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: 
\"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.752614 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.752744 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-etc-nvme\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.752867 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.753383 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.746528 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd529dba-04e1-45bf-9a0a-69fd93502cd9-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.750929 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-dev\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.746093 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.754189 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.746498 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd529dba-04e1-45bf-9a0a-69fd93502cd9-config-data-custom\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.754364 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-etc-nvme\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.746058 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd529dba-04e1-45bf-9a0a-69fd93502cd9-lib-modules\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.759873 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cd529dba-04e1-45bf-9a0a-69fd93502cd9-scripts\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.761472 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkpsx\" (UniqueName: \"kubernetes.io/projected/cd529dba-04e1-45bf-9a0a-69fd93502cd9-kube-api-access-vkpsx\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.770306 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd529dba-04e1-45bf-9a0a-69fd93502cd9-config-data\") pod \"cinder-backup-0\" (UID: \"cd529dba-04e1-45bf-9a0a-69fd93502cd9\") " pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.855292 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.855681 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f77070f6-164c-4bec-aafa-6126ca005702-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.855785 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " 
pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.855891 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjldn\" (UniqueName: \"kubernetes.io/projected/f77070f6-164c-4bec-aafa-6126ca005702-kube-api-access-jjldn\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.856023 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.856029 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f77070f6-164c-4bec-aafa-6126ca005702-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.856202 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.856361 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-dev\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.856440 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.856512 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f77070f6-164c-4bec-aafa-6126ca005702-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.856539 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.856551 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5cnr\" (UniqueName: \"kubernetes.io/projected/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-kube-api-access-t5cnr\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.856668 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.856683 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-dev\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.856767 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.856970 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.857056 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-sys\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.857137 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.857213 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: 
\"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.857305 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.857423 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.857517 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.857650 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.857736 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 
12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.857865 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.857927 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-sys\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.858047 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.858143 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.858267 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.858324 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.858799 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.858857 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.859242 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.859450 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f77070f6-164c-4bec-aafa-6126ca005702-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.859512 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.859569 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.859607 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.859652 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-run\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.859765 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.860621 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f77070f6-164c-4bec-aafa-6126ca005702-run\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.862340 4790 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/cinder-backup-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.862493 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f77070f6-164c-4bec-aafa-6126ca005702-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.864302 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f77070f6-164c-4bec-aafa-6126ca005702-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.864587 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f77070f6-164c-4bec-aafa-6126ca005702-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.866228 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f77070f6-164c-4bec-aafa-6126ca005702-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.883594 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjldn\" (UniqueName: \"kubernetes.io/projected/f77070f6-164c-4bec-aafa-6126ca005702-kube-api-access-jjldn\") pod \"cinder-volume-nfs-0\" (UID: \"f77070f6-164c-4bec-aafa-6126ca005702\") " pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.963526 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.963588 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.963630 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.963655 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.963706 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.963754 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.963809 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5cnr\" (UniqueName: \"kubernetes.io/projected/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-kube-api-access-t5cnr\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.963882 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.963923 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.964131 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.964165 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.964203 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.964226 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.964281 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.964307 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.964794 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.964873 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.964905 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.964931 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.964941 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.965008 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.965073 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.965309 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.965367 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.965531 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.968604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.968629 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.968849 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.971856 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.988944 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Apr 06 12:43:27 crc kubenswrapper[4790]: I0406 12:43:27.990129 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5cnr\" (UniqueName: \"kubernetes.io/projected/da98578a-8aaa-403a-8f8e-4c7115cfa2cb-kube-api-access-t5cnr\") pod \"cinder-volume-nfs-2-0\" (UID: \"da98578a-8aaa-403a-8f8e-4c7115cfa2cb\") " pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:28 crc kubenswrapper[4790]: I0406 12:43:28.038867 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Apr 06 12:43:28 crc kubenswrapper[4790]: I0406 12:43:28.475500 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Apr 06 12:43:28 crc kubenswrapper[4790]: I0406 12:43:28.492752 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 12:43:28 crc kubenswrapper[4790]: I0406 12:43:28.595489 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Apr 06 12:43:28 crc kubenswrapper[4790]: W0406 12:43:28.756946 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf77070f6_164c_4bec_aafa_6126ca005702.slice/crio-4bd61df3b0e9a166a028412c01642d38d881a1051d752e3d4293a43a19c2b119 WatchSource:0}: Error finding container 4bd61df3b0e9a166a028412c01642d38d881a1051d752e3d4293a43a19c2b119: Status 404 returned error can't find the container with id 4bd61df3b0e9a166a028412c01642d38d881a1051d752e3d4293a43a19c2b119 Apr 06 12:43:28 crc kubenswrapper[4790]: I0406 12:43:28.857293 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Apr 06 12:43:28 crc kubenswrapper[4790]: W0406 12:43:28.966593 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda98578a_8aaa_403a_8f8e_4c7115cfa2cb.slice/crio-83e018e9bca28f6d23d494213f29c8dd79bddf4d0777ffe9a71bcc26263a6377 WatchSource:0}: Error finding container 83e018e9bca28f6d23d494213f29c8dd79bddf4d0777ffe9a71bcc26263a6377: Status 404 returned error can't find the container with id 83e018e9bca28f6d23d494213f29c8dd79bddf4d0777ffe9a71bcc26263a6377 Apr 06 12:43:29 crc kubenswrapper[4790]: I0406 12:43:29.226876 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" 
event={"ID":"cd529dba-04e1-45bf-9a0a-69fd93502cd9","Type":"ContainerStarted","Data":"239f56b2b1b9c4b7d934240bd81969e96944ea77fd22a0e0de84040332363c0d"} Apr 06 12:43:29 crc kubenswrapper[4790]: I0406 12:43:29.228975 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"da98578a-8aaa-403a-8f8e-4c7115cfa2cb","Type":"ContainerStarted","Data":"83e018e9bca28f6d23d494213f29c8dd79bddf4d0777ffe9a71bcc26263a6377"} Apr 06 12:43:29 crc kubenswrapper[4790]: I0406 12:43:29.230545 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"f77070f6-164c-4bec-aafa-6126ca005702","Type":"ContainerStarted","Data":"4bd61df3b0e9a166a028412c01642d38d881a1051d752e3d4293a43a19c2b119"} Apr 06 12:43:30 crc kubenswrapper[4790]: I0406 12:43:30.284067 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"da98578a-8aaa-403a-8f8e-4c7115cfa2cb","Type":"ContainerStarted","Data":"84dd395fe61c6139eee71d4565c705e34df5fbb26fe938d84bb7999d733ffbbf"} Apr 06 12:43:30 crc kubenswrapper[4790]: I0406 12:43:30.284623 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"da98578a-8aaa-403a-8f8e-4c7115cfa2cb","Type":"ContainerStarted","Data":"a88aab4781aa75db51418a6b568d16d2c075b3299a07e0ce6fce31f30f53cc71"} Apr 06 12:43:30 crc kubenswrapper[4790]: I0406 12:43:30.293317 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"f77070f6-164c-4bec-aafa-6126ca005702","Type":"ContainerStarted","Data":"ae82a8b1dcfcbfc142f560ff33c63b7645845a916133def9ddaef45e4fdd2c1f"} Apr 06 12:43:30 crc kubenswrapper[4790]: I0406 12:43:30.293364 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"f77070f6-164c-4bec-aafa-6126ca005702","Type":"ContainerStarted","Data":"de4ffab7f5937046fb3a897732191a74ee7bcb2d70e3403e80e43b24e3352cb0"} Apr 
06 12:43:30 crc kubenswrapper[4790]: I0406 12:43:30.304446 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"cd529dba-04e1-45bf-9a0a-69fd93502cd9","Type":"ContainerStarted","Data":"12d15406d5c4c8985eeb3f3dc2be56b52abf7140c3d2d47d4c0534c05ab464eb"}
Apr 06 12:43:30 crc kubenswrapper[4790]: I0406 12:43:30.304527 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"cd529dba-04e1-45bf-9a0a-69fd93502cd9","Type":"ContainerStarted","Data":"fec7b40805bca116c266ffb136a254af0bbad2012cc187e706f424e4a6720f1c"}
Apr 06 12:43:30 crc kubenswrapper[4790]: I0406 12:43:30.334099 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=3.246423822 podStartE2EDuration="3.334077837s" podCreationTimestamp="2026-04-06 12:43:27 +0000 UTC" firstStartedPulling="2026-04-06 12:43:28.970139746 +0000 UTC m=+2787.957882612" lastFinishedPulling="2026-04-06 12:43:29.057793761 +0000 UTC m=+2788.045536627" observedRunningTime="2026-04-06 12:43:30.314548984 +0000 UTC m=+2789.302291860" watchObservedRunningTime="2026-04-06 12:43:30.334077837 +0000 UTC m=+2789.321820703"
Apr 06 12:43:30 crc kubenswrapper[4790]: I0406 12:43:30.356708 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.024684778 podStartE2EDuration="3.356682241s" podCreationTimestamp="2026-04-06 12:43:27 +0000 UTC" firstStartedPulling="2026-04-06 12:43:28.492534287 +0000 UTC m=+2787.480277143" lastFinishedPulling="2026-04-06 12:43:28.82453174 +0000 UTC m=+2787.812274606" observedRunningTime="2026-04-06 12:43:30.338664339 +0000 UTC m=+2789.326407205" watchObservedRunningTime="2026-04-06 12:43:30.356682241 +0000 UTC m=+2789.344425107"
Apr 06 12:43:30 crc kubenswrapper[4790]: I0406 12:43:30.380536 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=3.113380561 podStartE2EDuration="3.380513949s" podCreationTimestamp="2026-04-06 12:43:27 +0000 UTC" firstStartedPulling="2026-04-06 12:43:28.759034177 +0000 UTC m=+2787.746777043" lastFinishedPulling="2026-04-06 12:43:29.026167565 +0000 UTC m=+2788.013910431" observedRunningTime="2026-04-06 12:43:30.3689671 +0000 UTC m=+2789.356709976" watchObservedRunningTime="2026-04-06 12:43:30.380513949 +0000 UTC m=+2789.368256815"
Apr 06 12:43:32 crc kubenswrapper[4790]: I0406 12:43:32.863497 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0"
Apr 06 12:43:32 crc kubenswrapper[4790]: I0406 12:43:32.990044 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0"
Apr 06 12:43:33 crc kubenswrapper[4790]: I0406 12:43:33.041346 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0"
Apr 06 12:43:38 crc kubenswrapper[4790]: I0406 12:43:38.034614 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0"
Apr 06 12:43:38 crc kubenswrapper[4790]: I0406 12:43:38.268437 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0"
Apr 06 12:43:38 crc kubenswrapper[4790]: I0406 12:43:38.294209 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0"
Apr 06 12:44:00 crc kubenswrapper[4790]: I0406 12:44:00.160051 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591324-mbr26"]
Apr 06 12:44:00 crc kubenswrapper[4790]: I0406 12:44:00.161952 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591324-mbr26"
Apr 06 12:44:00 crc kubenswrapper[4790]: I0406 12:44:00.165062 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 06 12:44:00 crc kubenswrapper[4790]: I0406 12:44:00.165598 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 06 12:44:00 crc kubenswrapper[4790]: I0406 12:44:00.165771 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6"
Apr 06 12:44:00 crc kubenswrapper[4790]: I0406 12:44:00.189389 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591324-mbr26"]
Apr 06 12:44:00 crc kubenswrapper[4790]: I0406 12:44:00.270594 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9xpc\" (UniqueName: \"kubernetes.io/projected/bb0cc9f5-3236-42b5-adc2-85cf616a156f-kube-api-access-p9xpc\") pod \"auto-csr-approver-29591324-mbr26\" (UID: \"bb0cc9f5-3236-42b5-adc2-85cf616a156f\") " pod="openshift-infra/auto-csr-approver-29591324-mbr26"
Apr 06 12:44:00 crc kubenswrapper[4790]: I0406 12:44:00.375154 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9xpc\" (UniqueName: \"kubernetes.io/projected/bb0cc9f5-3236-42b5-adc2-85cf616a156f-kube-api-access-p9xpc\") pod \"auto-csr-approver-29591324-mbr26\" (UID: \"bb0cc9f5-3236-42b5-adc2-85cf616a156f\") " pod="openshift-infra/auto-csr-approver-29591324-mbr26"
Apr 06 12:44:00 crc kubenswrapper[4790]: I0406 12:44:00.395447 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9xpc\" (UniqueName: \"kubernetes.io/projected/bb0cc9f5-3236-42b5-adc2-85cf616a156f-kube-api-access-p9xpc\") pod \"auto-csr-approver-29591324-mbr26\" (UID: \"bb0cc9f5-3236-42b5-adc2-85cf616a156f\") " pod="openshift-infra/auto-csr-approver-29591324-mbr26"
Apr 06 12:44:00 crc kubenswrapper[4790]: I0406 12:44:00.486758 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591324-mbr26"
Apr 06 12:44:00 crc kubenswrapper[4790]: I0406 12:44:00.973428 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591324-mbr26"]
Apr 06 12:44:01 crc kubenswrapper[4790]: I0406 12:44:01.624278 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591324-mbr26" event={"ID":"bb0cc9f5-3236-42b5-adc2-85cf616a156f","Type":"ContainerStarted","Data":"336cd7983077856ae6a8c6fe430adfce77a53fbf8d08358415e7b653c921e304"}
Apr 06 12:44:02 crc kubenswrapper[4790]: I0406 12:44:02.634109 4790 generic.go:334] "Generic (PLEG): container finished" podID="bb0cc9f5-3236-42b5-adc2-85cf616a156f" containerID="5feafff1976253487b3166ec4252b7c533c32366ca9bc0a11eec2c80a8ae8e2b" exitCode=0
Apr 06 12:44:02 crc kubenswrapper[4790]: I0406 12:44:02.634297 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591324-mbr26" event={"ID":"bb0cc9f5-3236-42b5-adc2-85cf616a156f","Type":"ContainerDied","Data":"5feafff1976253487b3166ec4252b7c533c32366ca9bc0a11eec2c80a8ae8e2b"}
Apr 06 12:44:03 crc kubenswrapper[4790]: I0406 12:44:03.994012 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591324-mbr26"
Apr 06 12:44:04 crc kubenswrapper[4790]: I0406 12:44:04.151561 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9xpc\" (UniqueName: \"kubernetes.io/projected/bb0cc9f5-3236-42b5-adc2-85cf616a156f-kube-api-access-p9xpc\") pod \"bb0cc9f5-3236-42b5-adc2-85cf616a156f\" (UID: \"bb0cc9f5-3236-42b5-adc2-85cf616a156f\") "
Apr 06 12:44:04 crc kubenswrapper[4790]: I0406 12:44:04.159056 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0cc9f5-3236-42b5-adc2-85cf616a156f-kube-api-access-p9xpc" (OuterVolumeSpecName: "kube-api-access-p9xpc") pod "bb0cc9f5-3236-42b5-adc2-85cf616a156f" (UID: "bb0cc9f5-3236-42b5-adc2-85cf616a156f"). InnerVolumeSpecName "kube-api-access-p9xpc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:44:04 crc kubenswrapper[4790]: I0406 12:44:04.253926 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9xpc\" (UniqueName: \"kubernetes.io/projected/bb0cc9f5-3236-42b5-adc2-85cf616a156f-kube-api-access-p9xpc\") on node \"crc\" DevicePath \"\""
Apr 06 12:44:04 crc kubenswrapper[4790]: I0406 12:44:04.653606 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591324-mbr26" event={"ID":"bb0cc9f5-3236-42b5-adc2-85cf616a156f","Type":"ContainerDied","Data":"336cd7983077856ae6a8c6fe430adfce77a53fbf8d08358415e7b653c921e304"}
Apr 06 12:44:04 crc kubenswrapper[4790]: I0406 12:44:04.653646 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="336cd7983077856ae6a8c6fe430adfce77a53fbf8d08358415e7b653c921e304"
Apr 06 12:44:04 crc kubenswrapper[4790]: I0406 12:44:04.653886 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591324-mbr26"
Apr 06 12:44:05 crc kubenswrapper[4790]: I0406 12:44:05.067307 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591318-47zgd"]
Apr 06 12:44:05 crc kubenswrapper[4790]: I0406 12:44:05.076842 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591318-47zgd"]
Apr 06 12:44:05 crc kubenswrapper[4790]: I0406 12:44:05.685278 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd" path="/var/lib/kubelet/pods/e0d8b67d-567d-4bd4-a43f-6f1a8ca038bd/volumes"
Apr 06 12:44:16 crc kubenswrapper[4790]: I0406 12:44:16.014373 4790 scope.go:117] "RemoveContainer" containerID="d6df92234f7624c3fb55825bee561fe4575ae43fb8a0dac659ab3ea3c296302e"
Apr 06 12:44:32 crc kubenswrapper[4790]: I0406 12:44:32.848142 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tcvcm"]
Apr 06 12:44:32 crc kubenswrapper[4790]: E0406 12:44:32.849317 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0cc9f5-3236-42b5-adc2-85cf616a156f" containerName="oc"
Apr 06 12:44:32 crc kubenswrapper[4790]: I0406 12:44:32.849337 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0cc9f5-3236-42b5-adc2-85cf616a156f" containerName="oc"
Apr 06 12:44:32 crc kubenswrapper[4790]: I0406 12:44:32.849612 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb0cc9f5-3236-42b5-adc2-85cf616a156f" containerName="oc"
Apr 06 12:44:32 crc kubenswrapper[4790]: I0406 12:44:32.851529 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tcvcm"
Apr 06 12:44:32 crc kubenswrapper[4790]: I0406 12:44:32.872363 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tcvcm"]
Apr 06 12:44:32 crc kubenswrapper[4790]: I0406 12:44:32.962974 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sf6r\" (UniqueName: \"kubernetes.io/projected/120b71a4-fdc1-4685-a273-c611cab64bbc-kube-api-access-7sf6r\") pod \"community-operators-tcvcm\" (UID: \"120b71a4-fdc1-4685-a273-c611cab64bbc\") " pod="openshift-marketplace/community-operators-tcvcm"
Apr 06 12:44:32 crc kubenswrapper[4790]: I0406 12:44:32.963294 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/120b71a4-fdc1-4685-a273-c611cab64bbc-utilities\") pod \"community-operators-tcvcm\" (UID: \"120b71a4-fdc1-4685-a273-c611cab64bbc\") " pod="openshift-marketplace/community-operators-tcvcm"
Apr 06 12:44:32 crc kubenswrapper[4790]: I0406 12:44:32.963452 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/120b71a4-fdc1-4685-a273-c611cab64bbc-catalog-content\") pod \"community-operators-tcvcm\" (UID: \"120b71a4-fdc1-4685-a273-c611cab64bbc\") " pod="openshift-marketplace/community-operators-tcvcm"
Apr 06 12:44:33 crc kubenswrapper[4790]: I0406 12:44:33.065681 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sf6r\" (UniqueName: \"kubernetes.io/projected/120b71a4-fdc1-4685-a273-c611cab64bbc-kube-api-access-7sf6r\") pod \"community-operators-tcvcm\" (UID: \"120b71a4-fdc1-4685-a273-c611cab64bbc\") " pod="openshift-marketplace/community-operators-tcvcm"
Apr 06 12:44:33 crc kubenswrapper[4790]: I0406 12:44:33.065864 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/120b71a4-fdc1-4685-a273-c611cab64bbc-utilities\") pod \"community-operators-tcvcm\" (UID: \"120b71a4-fdc1-4685-a273-c611cab64bbc\") " pod="openshift-marketplace/community-operators-tcvcm"
Apr 06 12:44:33 crc kubenswrapper[4790]: I0406 12:44:33.065913 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/120b71a4-fdc1-4685-a273-c611cab64bbc-catalog-content\") pod \"community-operators-tcvcm\" (UID: \"120b71a4-fdc1-4685-a273-c611cab64bbc\") " pod="openshift-marketplace/community-operators-tcvcm"
Apr 06 12:44:33 crc kubenswrapper[4790]: I0406 12:44:33.066517 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/120b71a4-fdc1-4685-a273-c611cab64bbc-catalog-content\") pod \"community-operators-tcvcm\" (UID: \"120b71a4-fdc1-4685-a273-c611cab64bbc\") " pod="openshift-marketplace/community-operators-tcvcm"
Apr 06 12:44:33 crc kubenswrapper[4790]: I0406 12:44:33.066678 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/120b71a4-fdc1-4685-a273-c611cab64bbc-utilities\") pod \"community-operators-tcvcm\" (UID: \"120b71a4-fdc1-4685-a273-c611cab64bbc\") " pod="openshift-marketplace/community-operators-tcvcm"
Apr 06 12:44:33 crc kubenswrapper[4790]: I0406 12:44:33.087169 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sf6r\" (UniqueName: \"kubernetes.io/projected/120b71a4-fdc1-4685-a273-c611cab64bbc-kube-api-access-7sf6r\") pod \"community-operators-tcvcm\" (UID: \"120b71a4-fdc1-4685-a273-c611cab64bbc\") " pod="openshift-marketplace/community-operators-tcvcm"
Apr 06 12:44:33 crc kubenswrapper[4790]: I0406 12:44:33.179593 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tcvcm"
Apr 06 12:44:33 crc kubenswrapper[4790]: I0406 12:44:33.838545 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tcvcm"]
Apr 06 12:44:33 crc kubenswrapper[4790]: I0406 12:44:33.952202 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcvcm" event={"ID":"120b71a4-fdc1-4685-a273-c611cab64bbc","Type":"ContainerStarted","Data":"e0b395c08c73005f0f5949ca53b274bd8021dcf7b84638787a564dcb10474be1"}
Apr 06 12:44:34 crc kubenswrapper[4790]: I0406 12:44:34.968570 4790 generic.go:334] "Generic (PLEG): container finished" podID="120b71a4-fdc1-4685-a273-c611cab64bbc" containerID="67eb77fa96b48da9a08016b7155f0759ec301775e9929f6a65d78ee6408bfe4c" exitCode=0
Apr 06 12:44:34 crc kubenswrapper[4790]: I0406 12:44:34.968630 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcvcm" event={"ID":"120b71a4-fdc1-4685-a273-c611cab64bbc","Type":"ContainerDied","Data":"67eb77fa96b48da9a08016b7155f0759ec301775e9929f6a65d78ee6408bfe4c"}
Apr 06 12:44:35 crc kubenswrapper[4790]: I0406 12:44:35.819368 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Apr 06 12:44:35 crc kubenswrapper[4790]: I0406 12:44:35.820344 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="prometheus" containerID="cri-o://3e5e58ce5e3203387018c09f9b79ac8e12267ee31fbaad6906077de32e228147" gracePeriod=600
Apr 06 12:44:35 crc kubenswrapper[4790]: I0406 12:44:35.820427 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="thanos-sidecar" containerID="cri-o://c66b56b84acb60a6da9863a60da3988f8114a8165524347a4ba737ef83e719ae" gracePeriod=600
Apr 06 12:44:35 crc kubenswrapper[4790]: I0406 12:44:35.820455 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="config-reloader" containerID="cri-o://b4fba8f67b8ccdd9b6ae3878be27df75fbb234e3d6ed541ce85c459f383b3e40" gracePeriod=600
Apr 06 12:44:35 crc kubenswrapper[4790]: I0406 12:44:35.987411 4790 generic.go:334] "Generic (PLEG): container finished" podID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerID="c66b56b84acb60a6da9863a60da3988f8114a8165524347a4ba737ef83e719ae" exitCode=0
Apr 06 12:44:35 crc kubenswrapper[4790]: I0406 12:44:35.987441 4790 generic.go:334] "Generic (PLEG): container finished" podID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerID="3e5e58ce5e3203387018c09f9b79ac8e12267ee31fbaad6906077de32e228147" exitCode=0
Apr 06 12:44:35 crc kubenswrapper[4790]: I0406 12:44:35.987499 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e348afd2-35e1-4cf9-b84e-a2fbd085648f","Type":"ContainerDied","Data":"c66b56b84acb60a6da9863a60da3988f8114a8165524347a4ba737ef83e719ae"}
Apr 06 12:44:35 crc kubenswrapper[4790]: I0406 12:44:35.987544 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e348afd2-35e1-4cf9-b84e-a2fbd085648f","Type":"ContainerDied","Data":"3e5e58ce5e3203387018c09f9b79ac8e12267ee31fbaad6906077de32e228147"}
Apr 06 12:44:35 crc kubenswrapper[4790]: I0406 12:44:35.989931 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcvcm" event={"ID":"120b71a4-fdc1-4685-a273-c611cab64bbc","Type":"ContainerStarted","Data":"b0ca5a356cfc9f47ba710e54073251ba7202df1d5b583bb346c5fd526d4bab97"}
Apr 06 12:44:36 crc kubenswrapper[4790]: I0406 12:44:36.251533 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.156:9090/-/ready\": dial tcp 10.217.0.156:9090: connect: connection refused"
Apr 06 12:44:36 crc kubenswrapper[4790]: I0406 12:44:36.953771 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.000169 4790 generic.go:334] "Generic (PLEG): container finished" podID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerID="b4fba8f67b8ccdd9b6ae3878be27df75fbb234e3d6ed541ce85c459f383b3e40" exitCode=0
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.001403 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.001940 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e348afd2-35e1-4cf9-b84e-a2fbd085648f","Type":"ContainerDied","Data":"b4fba8f67b8ccdd9b6ae3878be27df75fbb234e3d6ed541ce85c459f383b3e40"}
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.001983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e348afd2-35e1-4cf9-b84e-a2fbd085648f","Type":"ContainerDied","Data":"69abd3c017a97ef49f302010d66b1dd8da09d3099bc6508437c7eea9d01b8c3f"}
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.002001 4790 scope.go:117] "RemoveContainer" containerID="c66b56b84acb60a6da9863a60da3988f8114a8165524347a4ba737ef83e719ae"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.031879 4790 scope.go:117] "RemoveContainer" containerID="b4fba8f67b8ccdd9b6ae3878be27df75fbb234e3d6ed541ce85c459f383b3e40"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.048950 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") "
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.048996 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-1\") pod \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") "
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.049082 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") "
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.049136 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e348afd2-35e1-4cf9-b84e-a2fbd085648f-config-out\") pod \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") "
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.049168 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e348afd2-35e1-4cf9-b84e-a2fbd085648f-tls-assets\") pod \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") "
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.049213 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-config\") pod \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") "
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.049283 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") "
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.049350 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-secret-combined-ca-bundle\") pod \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") "
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.049365 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-0\") pod \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") "
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.049409 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config\") pod \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") "
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.049437 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfvsp\" (UniqueName: \"kubernetes.io/projected/e348afd2-35e1-4cf9-b84e-a2fbd085648f-kube-api-access-dfvsp\") pod \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") "
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.049460 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-2\") pod \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") "
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.049485 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-thanos-prometheus-http-client-file\") pod \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\" (UID: \"e348afd2-35e1-4cf9-b84e-a2fbd085648f\") "
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.049541 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "e348afd2-35e1-4cf9-b84e-a2fbd085648f" (UID: "e348afd2-35e1-4cf9-b84e-a2fbd085648f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.049957 4790 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.050747 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "e348afd2-35e1-4cf9-b84e-a2fbd085648f" (UID: "e348afd2-35e1-4cf9-b84e-a2fbd085648f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.050877 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "e348afd2-35e1-4cf9-b84e-a2fbd085648f" (UID: "e348afd2-35e1-4cf9-b84e-a2fbd085648f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.057547 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "e348afd2-35e1-4cf9-b84e-a2fbd085648f" (UID: "e348afd2-35e1-4cf9-b84e-a2fbd085648f"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.058432 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e348afd2-35e1-4cf9-b84e-a2fbd085648f-config-out" (OuterVolumeSpecName: "config-out") pod "e348afd2-35e1-4cf9-b84e-a2fbd085648f" (UID: "e348afd2-35e1-4cf9-b84e-a2fbd085648f"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.059077 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e348afd2-35e1-4cf9-b84e-a2fbd085648f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e348afd2-35e1-4cf9-b84e-a2fbd085648f" (UID: "e348afd2-35e1-4cf9-b84e-a2fbd085648f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.060318 4790 scope.go:117] "RemoveContainer" containerID="3e5e58ce5e3203387018c09f9b79ac8e12267ee31fbaad6906077de32e228147"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.062706 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-config" (OuterVolumeSpecName: "config") pod "e348afd2-35e1-4cf9-b84e-a2fbd085648f" (UID: "e348afd2-35e1-4cf9-b84e-a2fbd085648f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.067545 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "e348afd2-35e1-4cf9-b84e-a2fbd085648f" (UID: "e348afd2-35e1-4cf9-b84e-a2fbd085648f"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.068457 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e348afd2-35e1-4cf9-b84e-a2fbd085648f" (UID: "e348afd2-35e1-4cf9-b84e-a2fbd085648f"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.069431 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e348afd2-35e1-4cf9-b84e-a2fbd085648f-kube-api-access-dfvsp" (OuterVolumeSpecName: "kube-api-access-dfvsp") pod "e348afd2-35e1-4cf9-b84e-a2fbd085648f" (UID: "e348afd2-35e1-4cf9-b84e-a2fbd085648f"). InnerVolumeSpecName "kube-api-access-dfvsp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.070063 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "e348afd2-35e1-4cf9-b84e-a2fbd085648f" (UID: "e348afd2-35e1-4cf9-b84e-a2fbd085648f"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.094578 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "e348afd2-35e1-4cf9-b84e-a2fbd085648f" (UID: "e348afd2-35e1-4cf9-b84e-a2fbd085648f"). InnerVolumeSpecName "pvc-428e608e-3b0f-419c-8722-244ca6b44799". PluginName "kubernetes.io/csi", VolumeGidValue ""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.151965 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfvsp\" (UniqueName: \"kubernetes.io/projected/e348afd2-35e1-4cf9-b84e-a2fbd085648f-kube-api-access-dfvsp\") on node \"crc\" DevicePath \"\""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.151998 4790 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.152009 4790 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.152030 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") on node \"crc\" "
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.152040 4790 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.152052 4790 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e348afd2-35e1-4cf9-b84e-a2fbd085648f-config-out\") on node \"crc\" DevicePath \"\""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.152060 4790 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e348afd2-35e1-4cf9-b84e-a2fbd085648f-tls-assets\") on node \"crc\" DevicePath \"\""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.152068 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-config\") on node \"crc\" DevicePath \"\""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.152077 4790 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.152088 4790 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.152097 4790 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e348afd2-35e1-4cf9-b84e-a2fbd085648f-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.152655 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config" (OuterVolumeSpecName: "web-config") pod "e348afd2-35e1-4cf9-b84e-a2fbd085648f" (UID: "e348afd2-35e1-4cf9-b84e-a2fbd085648f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.188206 4790 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.188353 4790 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-428e608e-3b0f-419c-8722-244ca6b44799" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799") on node "crc"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.224588 4790 scope.go:117] "RemoveContainer" containerID="d3976d1ea3e3586a9482bb8bab900be1d8cdb3b9d4cf97300daf3edc6189a512"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.244407 4790 scope.go:117] "RemoveContainer" containerID="c66b56b84acb60a6da9863a60da3988f8114a8165524347a4ba737ef83e719ae"
Apr 06 12:44:37 crc kubenswrapper[4790]: E0406 12:44:37.244980 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c66b56b84acb60a6da9863a60da3988f8114a8165524347a4ba737ef83e719ae\": container with ID starting with c66b56b84acb60a6da9863a60da3988f8114a8165524347a4ba737ef83e719ae not found: ID does not exist" containerID="c66b56b84acb60a6da9863a60da3988f8114a8165524347a4ba737ef83e719ae"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.245023 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66b56b84acb60a6da9863a60da3988f8114a8165524347a4ba737ef83e719ae"} err="failed to get container status \"c66b56b84acb60a6da9863a60da3988f8114a8165524347a4ba737ef83e719ae\": rpc error: code = NotFound desc = could not find container \"c66b56b84acb60a6da9863a60da3988f8114a8165524347a4ba737ef83e719ae\": container with ID starting with c66b56b84acb60a6da9863a60da3988f8114a8165524347a4ba737ef83e719ae not found: ID does not exist"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.245049 4790 scope.go:117] "RemoveContainer" containerID="b4fba8f67b8ccdd9b6ae3878be27df75fbb234e3d6ed541ce85c459f383b3e40"
Apr 06 12:44:37 crc kubenswrapper[4790]: E0406 12:44:37.245344 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4fba8f67b8ccdd9b6ae3878be27df75fbb234e3d6ed541ce85c459f383b3e40\": container with ID starting with b4fba8f67b8ccdd9b6ae3878be27df75fbb234e3d6ed541ce85c459f383b3e40 not found: ID does not exist" containerID="b4fba8f67b8ccdd9b6ae3878be27df75fbb234e3d6ed541ce85c459f383b3e40"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.245374 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4fba8f67b8ccdd9b6ae3878be27df75fbb234e3d6ed541ce85c459f383b3e40"} err="failed to get container status \"b4fba8f67b8ccdd9b6ae3878be27df75fbb234e3d6ed541ce85c459f383b3e40\": rpc error: code = NotFound desc = could not find container \"b4fba8f67b8ccdd9b6ae3878be27df75fbb234e3d6ed541ce85c459f383b3e40\": container with ID starting with b4fba8f67b8ccdd9b6ae3878be27df75fbb234e3d6ed541ce85c459f383b3e40 not found: ID does not exist"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.245392 4790 scope.go:117] "RemoveContainer" containerID="3e5e58ce5e3203387018c09f9b79ac8e12267ee31fbaad6906077de32e228147"
Apr 06 12:44:37 crc kubenswrapper[4790]: E0406 12:44:37.245671 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e5e58ce5e3203387018c09f9b79ac8e12267ee31fbaad6906077de32e228147\": container with ID starting with 3e5e58ce5e3203387018c09f9b79ac8e12267ee31fbaad6906077de32e228147 not found: ID does not exist" containerID="3e5e58ce5e3203387018c09f9b79ac8e12267ee31fbaad6906077de32e228147"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.245699 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e5e58ce5e3203387018c09f9b79ac8e12267ee31fbaad6906077de32e228147"} err="failed to get container status \"3e5e58ce5e3203387018c09f9b79ac8e12267ee31fbaad6906077de32e228147\": rpc error: code = NotFound desc = could not find container \"3e5e58ce5e3203387018c09f9b79ac8e12267ee31fbaad6906077de32e228147\": container with ID starting with 3e5e58ce5e3203387018c09f9b79ac8e12267ee31fbaad6906077de32e228147 not found: ID does not exist"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.245716 4790 scope.go:117] "RemoveContainer" containerID="d3976d1ea3e3586a9482bb8bab900be1d8cdb3b9d4cf97300daf3edc6189a512"
Apr 06 12:44:37 crc kubenswrapper[4790]: E0406 12:44:37.246113 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3976d1ea3e3586a9482bb8bab900be1d8cdb3b9d4cf97300daf3edc6189a512\": container with ID starting with d3976d1ea3e3586a9482bb8bab900be1d8cdb3b9d4cf97300daf3edc6189a512 not found: ID does not exist" containerID="d3976d1ea3e3586a9482bb8bab900be1d8cdb3b9d4cf97300daf3edc6189a512"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.246141 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3976d1ea3e3586a9482bb8bab900be1d8cdb3b9d4cf97300daf3edc6189a512"} err="failed to get container status \"d3976d1ea3e3586a9482bb8bab900be1d8cdb3b9d4cf97300daf3edc6189a512\": rpc error: code = NotFound desc = could not find container \"d3976d1ea3e3586a9482bb8bab900be1d8cdb3b9d4cf97300daf3edc6189a512\": container with ID starting with d3976d1ea3e3586a9482bb8bab900be1d8cdb3b9d4cf97300daf3edc6189a512 not found: ID does not exist"
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.254198 4790 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e348afd2-35e1-4cf9-b84e-a2fbd085648f-web-config\") on node \"crc\" DevicePath \"\""
Apr 06 12:44:37 crc kubenswrapper[4790]: 
I0406 12:44:37.254239 4790 reconciler_common.go:293] "Volume detached for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") on node \"crc\" DevicePath \"\"" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.336816 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.346119 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.366444 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:44:37 crc kubenswrapper[4790]: E0406 12:44:37.367083 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="config-reloader" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.367163 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="config-reloader" Apr 06 12:44:37 crc kubenswrapper[4790]: E0406 12:44:37.371072 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="init-config-reloader" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.371252 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="init-config-reloader" Apr 06 12:44:37 crc kubenswrapper[4790]: E0406 12:44:37.371368 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="thanos-sidecar" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.371426 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="thanos-sidecar" Apr 06 12:44:37 crc kubenswrapper[4790]: 
E0406 12:44:37.371501 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="prometheus" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.371555 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="prometheus" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.372001 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="prometheus" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.372099 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="thanos-sidecar" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.372168 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" containerName="config-reloader" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.374034 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.376404 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.376407 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.377040 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.377132 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.377438 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-gtk7n" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.377436 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.377695 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.393298 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.406512 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.457851 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/fc194d2b-4a4a-4745-8225-7d44efe056ef-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.457910 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.457950 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.457976 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fc194d2b-4a4a-4745-8225-7d44efe056ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.458007 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " 
pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.458029 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fc194d2b-4a4a-4745-8225-7d44efe056ef-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.458053 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.458069 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.458119 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fc194d2b-4a4a-4745-8225-7d44efe056ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.458156 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.458216 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mnlw\" (UniqueName: \"kubernetes.io/projected/fc194d2b-4a4a-4745-8225-7d44efe056ef-kube-api-access-8mnlw\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.458236 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fc194d2b-4a4a-4745-8225-7d44efe056ef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.458260 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-config\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.560492 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.560554 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.560586 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fc194d2b-4a4a-4745-8225-7d44efe056ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.560618 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.560640 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fc194d2b-4a4a-4745-8225-7d44efe056ef-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.560670 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") 
" pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.560699 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.560754 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fc194d2b-4a4a-4745-8225-7d44efe056ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.560792 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.560860 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mnlw\" (UniqueName: \"kubernetes.io/projected/fc194d2b-4a4a-4745-8225-7d44efe056ef-kube-api-access-8mnlw\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.560878 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fc194d2b-4a4a-4745-8225-7d44efe056ef-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.560903 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-config\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.560938 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fc194d2b-4a4a-4745-8225-7d44efe056ef-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.561709 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fc194d2b-4a4a-4745-8225-7d44efe056ef-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.562517 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fc194d2b-4a4a-4745-8225-7d44efe056ef-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.563496 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/fc194d2b-4a4a-4745-8225-7d44efe056ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.565588 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.566295 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.566711 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.567449 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fc194d2b-4a4a-4745-8225-7d44efe056ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.569244 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.569475 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-config\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.571666 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc194d2b-4a4a-4745-8225-7d44efe056ef-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.572687 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fc194d2b-4a4a-4745-8225-7d44efe056ef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.577103 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.577147 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2657789c2cd73a331030693f07b47cf0a3bc578270043bd3878090c8357096a6/globalmount\"" pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.579763 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mnlw\" (UniqueName: \"kubernetes.io/projected/fc194d2b-4a4a-4745-8225-7d44efe056ef-kube-api-access-8mnlw\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.614458 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-428e608e-3b0f-419c-8722-244ca6b44799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-428e608e-3b0f-419c-8722-244ca6b44799\") pod \"prometheus-metric-storage-0\" (UID: \"fc194d2b-4a4a-4745-8225-7d44efe056ef\") " pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.686234 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e348afd2-35e1-4cf9-b84e-a2fbd085648f" path="/var/lib/kubelet/pods/e348afd2-35e1-4cf9-b84e-a2fbd085648f/volumes" Apr 06 12:44:37 crc kubenswrapper[4790]: I0406 12:44:37.695421 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Apr 06 12:44:38 crc kubenswrapper[4790]: I0406 12:44:38.033648 4790 generic.go:334] "Generic (PLEG): container finished" podID="120b71a4-fdc1-4685-a273-c611cab64bbc" containerID="b0ca5a356cfc9f47ba710e54073251ba7202df1d5b583bb346c5fd526d4bab97" exitCode=0 Apr 06 12:44:38 crc kubenswrapper[4790]: I0406 12:44:38.033729 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcvcm" event={"ID":"120b71a4-fdc1-4685-a273-c611cab64bbc","Type":"ContainerDied","Data":"b0ca5a356cfc9f47ba710e54073251ba7202df1d5b583bb346c5fd526d4bab97"} Apr 06 12:44:38 crc kubenswrapper[4790]: I0406 12:44:38.327011 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Apr 06 12:44:39 crc kubenswrapper[4790]: I0406 12:44:39.045745 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcvcm" event={"ID":"120b71a4-fdc1-4685-a273-c611cab64bbc","Type":"ContainerStarted","Data":"f6db88919df3285a9a8718b7587bb1f135f45fa792b98129e89b678e15fb105c"} Apr 06 12:44:39 crc kubenswrapper[4790]: I0406 12:44:39.048579 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fc194d2b-4a4a-4745-8225-7d44efe056ef","Type":"ContainerStarted","Data":"520f08afea5cd4307d6fcf71f25ab5cacd3a4ad64ce103b8c0f13d48f68e5e21"} Apr 06 12:44:39 crc kubenswrapper[4790]: I0406 12:44:39.073054 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tcvcm" podStartSLOduration=3.599821183 podStartE2EDuration="7.073033575s" podCreationTimestamp="2026-04-06 12:44:32 +0000 UTC" firstStartedPulling="2026-04-06 12:44:34.970352316 +0000 UTC m=+2853.958095182" lastFinishedPulling="2026-04-06 12:44:38.443564708 +0000 UTC m=+2857.431307574" observedRunningTime="2026-04-06 12:44:39.065420561 +0000 UTC m=+2858.053163427" 
watchObservedRunningTime="2026-04-06 12:44:39.073033575 +0000 UTC m=+2858.060776441" Apr 06 12:44:42 crc kubenswrapper[4790]: I0406 12:44:42.091525 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fc194d2b-4a4a-4745-8225-7d44efe056ef","Type":"ContainerStarted","Data":"9438b3710e323f46998ddaaad2debef74d2a834c192dde0c637032d36087beae"} Apr 06 12:44:43 crc kubenswrapper[4790]: I0406 12:44:43.180449 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tcvcm" Apr 06 12:44:43 crc kubenswrapper[4790]: I0406 12:44:43.181230 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tcvcm" Apr 06 12:44:43 crc kubenswrapper[4790]: I0406 12:44:43.235779 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tcvcm" Apr 06 12:44:44 crc kubenswrapper[4790]: I0406 12:44:44.157943 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tcvcm" Apr 06 12:44:44 crc kubenswrapper[4790]: I0406 12:44:44.208968 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tcvcm"] Apr 06 12:44:46 crc kubenswrapper[4790]: I0406 12:44:46.130266 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tcvcm" podUID="120b71a4-fdc1-4685-a273-c611cab64bbc" containerName="registry-server" containerID="cri-o://f6db88919df3285a9a8718b7587bb1f135f45fa792b98129e89b678e15fb105c" gracePeriod=2 Apr 06 12:44:46 crc kubenswrapper[4790]: I0406 12:44:46.581109 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tcvcm" Apr 06 12:44:46 crc kubenswrapper[4790]: I0406 12:44:46.735860 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/120b71a4-fdc1-4685-a273-c611cab64bbc-utilities\") pod \"120b71a4-fdc1-4685-a273-c611cab64bbc\" (UID: \"120b71a4-fdc1-4685-a273-c611cab64bbc\") " Apr 06 12:44:46 crc kubenswrapper[4790]: I0406 12:44:46.736149 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sf6r\" (UniqueName: \"kubernetes.io/projected/120b71a4-fdc1-4685-a273-c611cab64bbc-kube-api-access-7sf6r\") pod \"120b71a4-fdc1-4685-a273-c611cab64bbc\" (UID: \"120b71a4-fdc1-4685-a273-c611cab64bbc\") " Apr 06 12:44:46 crc kubenswrapper[4790]: I0406 12:44:46.736256 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/120b71a4-fdc1-4685-a273-c611cab64bbc-catalog-content\") pod \"120b71a4-fdc1-4685-a273-c611cab64bbc\" (UID: \"120b71a4-fdc1-4685-a273-c611cab64bbc\") " Apr 06 12:44:46 crc kubenswrapper[4790]: I0406 12:44:46.737280 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/120b71a4-fdc1-4685-a273-c611cab64bbc-utilities" (OuterVolumeSpecName: "utilities") pod "120b71a4-fdc1-4685-a273-c611cab64bbc" (UID: "120b71a4-fdc1-4685-a273-c611cab64bbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:44:46 crc kubenswrapper[4790]: I0406 12:44:46.746246 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/120b71a4-fdc1-4685-a273-c611cab64bbc-kube-api-access-7sf6r" (OuterVolumeSpecName: "kube-api-access-7sf6r") pod "120b71a4-fdc1-4685-a273-c611cab64bbc" (UID: "120b71a4-fdc1-4685-a273-c611cab64bbc"). InnerVolumeSpecName "kube-api-access-7sf6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:44:46 crc kubenswrapper[4790]: I0406 12:44:46.799252 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/120b71a4-fdc1-4685-a273-c611cab64bbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "120b71a4-fdc1-4685-a273-c611cab64bbc" (UID: "120b71a4-fdc1-4685-a273-c611cab64bbc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:44:46 crc kubenswrapper[4790]: I0406 12:44:46.839216 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sf6r\" (UniqueName: \"kubernetes.io/projected/120b71a4-fdc1-4685-a273-c611cab64bbc-kube-api-access-7sf6r\") on node \"crc\" DevicePath \"\"" Apr 06 12:44:46 crc kubenswrapper[4790]: I0406 12:44:46.839466 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/120b71a4-fdc1-4685-a273-c611cab64bbc-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:44:46 crc kubenswrapper[4790]: I0406 12:44:46.839532 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/120b71a4-fdc1-4685-a273-c611cab64bbc-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.143538 4790 generic.go:334] "Generic (PLEG): container finished" podID="120b71a4-fdc1-4685-a273-c611cab64bbc" containerID="f6db88919df3285a9a8718b7587bb1f135f45fa792b98129e89b678e15fb105c" exitCode=0 Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.143584 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcvcm" event={"ID":"120b71a4-fdc1-4685-a273-c611cab64bbc","Type":"ContainerDied","Data":"f6db88919df3285a9a8718b7587bb1f135f45fa792b98129e89b678e15fb105c"} Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.143615 4790 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-tcvcm" event={"ID":"120b71a4-fdc1-4685-a273-c611cab64bbc","Type":"ContainerDied","Data":"e0b395c08c73005f0f5949ca53b274bd8021dcf7b84638787a564dcb10474be1"} Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.143632 4790 scope.go:117] "RemoveContainer" containerID="f6db88919df3285a9a8718b7587bb1f135f45fa792b98129e89b678e15fb105c" Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.145180 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tcvcm" Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.173240 4790 scope.go:117] "RemoveContainer" containerID="b0ca5a356cfc9f47ba710e54073251ba7202df1d5b583bb346c5fd526d4bab97" Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.179992 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tcvcm"] Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.189436 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tcvcm"] Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.200103 4790 scope.go:117] "RemoveContainer" containerID="67eb77fa96b48da9a08016b7155f0759ec301775e9929f6a65d78ee6408bfe4c" Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.249342 4790 scope.go:117] "RemoveContainer" containerID="f6db88919df3285a9a8718b7587bb1f135f45fa792b98129e89b678e15fb105c" Apr 06 12:44:47 crc kubenswrapper[4790]: E0406 12:44:47.249923 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6db88919df3285a9a8718b7587bb1f135f45fa792b98129e89b678e15fb105c\": container with ID starting with f6db88919df3285a9a8718b7587bb1f135f45fa792b98129e89b678e15fb105c not found: ID does not exist" containerID="f6db88919df3285a9a8718b7587bb1f135f45fa792b98129e89b678e15fb105c" Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 
12:44:47.249954 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6db88919df3285a9a8718b7587bb1f135f45fa792b98129e89b678e15fb105c"} err="failed to get container status \"f6db88919df3285a9a8718b7587bb1f135f45fa792b98129e89b678e15fb105c\": rpc error: code = NotFound desc = could not find container \"f6db88919df3285a9a8718b7587bb1f135f45fa792b98129e89b678e15fb105c\": container with ID starting with f6db88919df3285a9a8718b7587bb1f135f45fa792b98129e89b678e15fb105c not found: ID does not exist" Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.249984 4790 scope.go:117] "RemoveContainer" containerID="b0ca5a356cfc9f47ba710e54073251ba7202df1d5b583bb346c5fd526d4bab97" Apr 06 12:44:47 crc kubenswrapper[4790]: E0406 12:44:47.250492 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0ca5a356cfc9f47ba710e54073251ba7202df1d5b583bb346c5fd526d4bab97\": container with ID starting with b0ca5a356cfc9f47ba710e54073251ba7202df1d5b583bb346c5fd526d4bab97 not found: ID does not exist" containerID="b0ca5a356cfc9f47ba710e54073251ba7202df1d5b583bb346c5fd526d4bab97" Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.250542 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0ca5a356cfc9f47ba710e54073251ba7202df1d5b583bb346c5fd526d4bab97"} err="failed to get container status \"b0ca5a356cfc9f47ba710e54073251ba7202df1d5b583bb346c5fd526d4bab97\": rpc error: code = NotFound desc = could not find container \"b0ca5a356cfc9f47ba710e54073251ba7202df1d5b583bb346c5fd526d4bab97\": container with ID starting with b0ca5a356cfc9f47ba710e54073251ba7202df1d5b583bb346c5fd526d4bab97 not found: ID does not exist" Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.250573 4790 scope.go:117] "RemoveContainer" containerID="67eb77fa96b48da9a08016b7155f0759ec301775e9929f6a65d78ee6408bfe4c" Apr 06 12:44:47 crc 
kubenswrapper[4790]: E0406 12:44:47.251253 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67eb77fa96b48da9a08016b7155f0759ec301775e9929f6a65d78ee6408bfe4c\": container with ID starting with 67eb77fa96b48da9a08016b7155f0759ec301775e9929f6a65d78ee6408bfe4c not found: ID does not exist" containerID="67eb77fa96b48da9a08016b7155f0759ec301775e9929f6a65d78ee6408bfe4c" Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.251279 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67eb77fa96b48da9a08016b7155f0759ec301775e9929f6a65d78ee6408bfe4c"} err="failed to get container status \"67eb77fa96b48da9a08016b7155f0759ec301775e9929f6a65d78ee6408bfe4c\": rpc error: code = NotFound desc = could not find container \"67eb77fa96b48da9a08016b7155f0759ec301775e9929f6a65d78ee6408bfe4c\": container with ID starting with 67eb77fa96b48da9a08016b7155f0759ec301775e9929f6a65d78ee6408bfe4c not found: ID does not exist" Apr 06 12:44:47 crc kubenswrapper[4790]: I0406 12:44:47.692565 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="120b71a4-fdc1-4685-a273-c611cab64bbc" path="/var/lib/kubelet/pods/120b71a4-fdc1-4685-a273-c611cab64bbc/volumes" Apr 06 12:44:49 crc kubenswrapper[4790]: I0406 12:44:49.163090 4790 generic.go:334] "Generic (PLEG): container finished" podID="fc194d2b-4a4a-4745-8225-7d44efe056ef" containerID="9438b3710e323f46998ddaaad2debef74d2a834c192dde0c637032d36087beae" exitCode=0 Apr 06 12:44:49 crc kubenswrapper[4790]: I0406 12:44:49.163137 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fc194d2b-4a4a-4745-8225-7d44efe056ef","Type":"ContainerDied","Data":"9438b3710e323f46998ddaaad2debef74d2a834c192dde0c637032d36087beae"} Apr 06 12:44:50 crc kubenswrapper[4790]: I0406 12:44:50.175228 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"fc194d2b-4a4a-4745-8225-7d44efe056ef","Type":"ContainerStarted","Data":"d7b8150ea7b9e45d4e6da4b08513c4474aa34746ab12e3e347364ca1d8370bff"} Apr 06 12:44:53 crc kubenswrapper[4790]: I0406 12:44:53.205247 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fc194d2b-4a4a-4745-8225-7d44efe056ef","Type":"ContainerStarted","Data":"8ae047908c0e64bfd08c191676a16467f586da74b3b46db3aa123c292280be02"} Apr 06 12:44:53 crc kubenswrapper[4790]: I0406 12:44:53.205537 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fc194d2b-4a4a-4745-8225-7d44efe056ef","Type":"ContainerStarted","Data":"ba8a19081c094320d8d245905879737b9d90912a73a4c705cd7373b673ad089d"} Apr 06 12:44:53 crc kubenswrapper[4790]: I0406 12:44:53.234271 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.234255586 podStartE2EDuration="16.234255586s" podCreationTimestamp="2026-04-06 12:44:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:44:53.231532243 +0000 UTC m=+2872.219275129" watchObservedRunningTime="2026-04-06 12:44:53.234255586 +0000 UTC m=+2872.221998452" Apr 06 12:44:57 crc kubenswrapper[4790]: I0406 12:44:57.695901 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.168325 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6"] Apr 06 12:45:00 crc kubenswrapper[4790]: E0406 12:45:00.169629 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120b71a4-fdc1-4685-a273-c611cab64bbc" containerName="extract-utilities" Apr 06 12:45:00 crc 
kubenswrapper[4790]: I0406 12:45:00.169652 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="120b71a4-fdc1-4685-a273-c611cab64bbc" containerName="extract-utilities" Apr 06 12:45:00 crc kubenswrapper[4790]: E0406 12:45:00.169688 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120b71a4-fdc1-4685-a273-c611cab64bbc" containerName="registry-server" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.169727 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="120b71a4-fdc1-4685-a273-c611cab64bbc" containerName="registry-server" Apr 06 12:45:00 crc kubenswrapper[4790]: E0406 12:45:00.169794 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120b71a4-fdc1-4685-a273-c611cab64bbc" containerName="extract-content" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.169809 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="120b71a4-fdc1-4685-a273-c611cab64bbc" containerName="extract-content" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.170203 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="120b71a4-fdc1-4685-a273-c611cab64bbc" containerName="registry-server" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.171358 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.181805 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6"] Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.219535 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.219552 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.348276 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/734777b7-31b8-4736-b3cc-d322f5c3a3dc-config-volume\") pod \"collect-profiles-29591325-89jp6\" (UID: \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.348691 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hwcv\" (UniqueName: \"kubernetes.io/projected/734777b7-31b8-4736-b3cc-d322f5c3a3dc-kube-api-access-6hwcv\") pod \"collect-profiles-29591325-89jp6\" (UID: \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.348730 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/734777b7-31b8-4736-b3cc-d322f5c3a3dc-secret-volume\") pod \"collect-profiles-29591325-89jp6\" (UID: \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.451395 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/734777b7-31b8-4736-b3cc-d322f5c3a3dc-config-volume\") pod \"collect-profiles-29591325-89jp6\" (UID: \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.451476 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hwcv\" (UniqueName: \"kubernetes.io/projected/734777b7-31b8-4736-b3cc-d322f5c3a3dc-kube-api-access-6hwcv\") pod \"collect-profiles-29591325-89jp6\" (UID: \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.451494 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/734777b7-31b8-4736-b3cc-d322f5c3a3dc-secret-volume\") pod \"collect-profiles-29591325-89jp6\" (UID: \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.452320 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/734777b7-31b8-4736-b3cc-d322f5c3a3dc-config-volume\") pod \"collect-profiles-29591325-89jp6\" (UID: \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.457577 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/734777b7-31b8-4736-b3cc-d322f5c3a3dc-secret-volume\") pod \"collect-profiles-29591325-89jp6\" (UID: \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.471129 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hwcv\" (UniqueName: \"kubernetes.io/projected/734777b7-31b8-4736-b3cc-d322f5c3a3dc-kube-api-access-6hwcv\") pod \"collect-profiles-29591325-89jp6\" (UID: \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" Apr 06 12:45:00 crc kubenswrapper[4790]: I0406 12:45:00.560959 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" Apr 06 12:45:01 crc kubenswrapper[4790]: I0406 12:45:01.004068 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6"] Apr 06 12:45:01 crc kubenswrapper[4790]: W0406 12:45:01.011328 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod734777b7_31b8_4736_b3cc_d322f5c3a3dc.slice/crio-f350aa526733d86b6cdcaf6a771d8fe704226a48362ce84b401cc38aabef40e5 WatchSource:0}: Error finding container f350aa526733d86b6cdcaf6a771d8fe704226a48362ce84b401cc38aabef40e5: Status 404 returned error can't find the container with id f350aa526733d86b6cdcaf6a771d8fe704226a48362ce84b401cc38aabef40e5 Apr 06 12:45:01 crc kubenswrapper[4790]: I0406 12:45:01.288984 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" event={"ID":"734777b7-31b8-4736-b3cc-d322f5c3a3dc","Type":"ContainerStarted","Data":"e12a4003a0575b5ffbbe7c52c45509c93fa8dfec4e12db787087603be708c1c4"} Apr 06 12:45:01 crc 
kubenswrapper[4790]: I0406 12:45:01.290470 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" event={"ID":"734777b7-31b8-4736-b3cc-d322f5c3a3dc","Type":"ContainerStarted","Data":"f350aa526733d86b6cdcaf6a771d8fe704226a48362ce84b401cc38aabef40e5"} Apr 06 12:45:01 crc kubenswrapper[4790]: I0406 12:45:01.318594 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" podStartSLOduration=1.318576124 podStartE2EDuration="1.318576124s" podCreationTimestamp="2026-04-06 12:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 12:45:01.310661523 +0000 UTC m=+2880.298404429" watchObservedRunningTime="2026-04-06 12:45:01.318576124 +0000 UTC m=+2880.306318990" Apr 06 12:45:01 crc kubenswrapper[4790]: E0406 12:45:01.830346 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod734777b7_31b8_4736_b3cc_d322f5c3a3dc.slice/crio-e12a4003a0575b5ffbbe7c52c45509c93fa8dfec4e12db787087603be708c1c4.scope\": RecentStats: unable to find data in memory cache]" Apr 06 12:45:02 crc kubenswrapper[4790]: I0406 12:45:02.304887 4790 generic.go:334] "Generic (PLEG): container finished" podID="734777b7-31b8-4736-b3cc-d322f5c3a3dc" containerID="e12a4003a0575b5ffbbe7c52c45509c93fa8dfec4e12db787087603be708c1c4" exitCode=0 Apr 06 12:45:02 crc kubenswrapper[4790]: I0406 12:45:02.304935 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" event={"ID":"734777b7-31b8-4736-b3cc-d322f5c3a3dc","Type":"ContainerDied","Data":"e12a4003a0575b5ffbbe7c52c45509c93fa8dfec4e12db787087603be708c1c4"} Apr 06 12:45:03 crc kubenswrapper[4790]: I0406 12:45:03.686568 
4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" Apr 06 12:45:03 crc kubenswrapper[4790]: I0406 12:45:03.821582 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/734777b7-31b8-4736-b3cc-d322f5c3a3dc-secret-volume\") pod \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\" (UID: \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\") " Apr 06 12:45:03 crc kubenswrapper[4790]: I0406 12:45:03.821674 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/734777b7-31b8-4736-b3cc-d322f5c3a3dc-config-volume\") pod \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\" (UID: \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\") " Apr 06 12:45:03 crc kubenswrapper[4790]: I0406 12:45:03.821884 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hwcv\" (UniqueName: \"kubernetes.io/projected/734777b7-31b8-4736-b3cc-d322f5c3a3dc-kube-api-access-6hwcv\") pod \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\" (UID: \"734777b7-31b8-4736-b3cc-d322f5c3a3dc\") " Apr 06 12:45:03 crc kubenswrapper[4790]: I0406 12:45:03.824337 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/734777b7-31b8-4736-b3cc-d322f5c3a3dc-config-volume" (OuterVolumeSpecName: "config-volume") pod "734777b7-31b8-4736-b3cc-d322f5c3a3dc" (UID: "734777b7-31b8-4736-b3cc-d322f5c3a3dc"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 12:45:03 crc kubenswrapper[4790]: I0406 12:45:03.828709 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/734777b7-31b8-4736-b3cc-d322f5c3a3dc-kube-api-access-6hwcv" (OuterVolumeSpecName: "kube-api-access-6hwcv") pod "734777b7-31b8-4736-b3cc-d322f5c3a3dc" (UID: "734777b7-31b8-4736-b3cc-d322f5c3a3dc"). InnerVolumeSpecName "kube-api-access-6hwcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:45:03 crc kubenswrapper[4790]: I0406 12:45:03.830473 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734777b7-31b8-4736-b3cc-d322f5c3a3dc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "734777b7-31b8-4736-b3cc-d322f5c3a3dc" (UID: "734777b7-31b8-4736-b3cc-d322f5c3a3dc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 12:45:03 crc kubenswrapper[4790]: I0406 12:45:03.924849 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hwcv\" (UniqueName: \"kubernetes.io/projected/734777b7-31b8-4736-b3cc-d322f5c3a3dc-kube-api-access-6hwcv\") on node \"crc\" DevicePath \"\"" Apr 06 12:45:03 crc kubenswrapper[4790]: I0406 12:45:03.924887 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/734777b7-31b8-4736-b3cc-d322f5c3a3dc-secret-volume\") on node \"crc\" DevicePath \"\"" Apr 06 12:45:03 crc kubenswrapper[4790]: I0406 12:45:03.924901 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/734777b7-31b8-4736-b3cc-d322f5c3a3dc-config-volume\") on node \"crc\" DevicePath \"\"" Apr 06 12:45:04 crc kubenswrapper[4790]: I0406 12:45:04.330901 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" 
event={"ID":"734777b7-31b8-4736-b3cc-d322f5c3a3dc","Type":"ContainerDied","Data":"f350aa526733d86b6cdcaf6a771d8fe704226a48362ce84b401cc38aabef40e5"} Apr 06 12:45:04 crc kubenswrapper[4790]: I0406 12:45:04.330956 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f350aa526733d86b6cdcaf6a771d8fe704226a48362ce84b401cc38aabef40e5" Apr 06 12:45:04 crc kubenswrapper[4790]: I0406 12:45:04.331052 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6" Apr 06 12:45:04 crc kubenswrapper[4790]: I0406 12:45:04.407231 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz"] Apr 06 12:45:04 crc kubenswrapper[4790]: I0406 12:45:04.418645 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591280-pqhhz"] Apr 06 12:45:05 crc kubenswrapper[4790]: I0406 12:45:05.692718 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d509c077-8950-44b6-99fb-7d4ebf81f4da" path="/var/lib/kubelet/pods/d509c077-8950-44b6-99fb-7d4ebf81f4da/volumes" Apr 06 12:45:07 crc kubenswrapper[4790]: I0406 12:45:07.940699 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Apr 06 12:45:07 crc kubenswrapper[4790]: I0406 12:45:07.950155 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Apr 06 12:45:08 crc kubenswrapper[4790]: I0406 12:45:08.379927 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Apr 06 12:45:09 crc kubenswrapper[4790]: I0406 12:45:09.753608 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:45:09 crc kubenswrapper[4790]: I0406 12:45:09.753672 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:45:16 crc kubenswrapper[4790]: I0406 12:45:16.131337 4790 scope.go:117] "RemoveContainer" containerID="8f3fec1b92ce16a9d4b3ddfa7a76e1315fd31ad32d1639449661a27f263cdaf5" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.314238 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Apr 06 12:45:28 crc kubenswrapper[4790]: E0406 12:45:28.316497 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="734777b7-31b8-4736-b3cc-d322f5c3a3dc" containerName="collect-profiles" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.316610 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="734777b7-31b8-4736-b3cc-d322f5c3a3dc" containerName="collect-profiles" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.317031 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="734777b7-31b8-4736-b3cc-d322f5c3a3dc" containerName="collect-profiles" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.318052 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.321133 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.321167 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.321302 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.321753 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-72kbb" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.329097 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.375469 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c58fe7b4-f5be-433f-8390-67dd8a62e81b-config-data\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.375523 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.375584 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/c58fe7b4-f5be-433f-8390-67dd8a62e81b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.477589 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.477686 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d75w\" (UniqueName: \"kubernetes.io/projected/c58fe7b4-f5be-433f-8390-67dd8a62e81b-kube-api-access-2d75w\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.477714 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c58fe7b4-f5be-433f-8390-67dd8a62e81b-config-data\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.477732 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.477764 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/c58fe7b4-f5be-433f-8390-67dd8a62e81b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.477789 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.477806 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c58fe7b4-f5be-433f-8390-67dd8a62e81b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.477822 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c58fe7b4-f5be-433f-8390-67dd8a62e81b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.478043 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.479293 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/c58fe7b4-f5be-433f-8390-67dd8a62e81b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.479615 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c58fe7b4-f5be-433f-8390-67dd8a62e81b-config-data\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.484237 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.580169 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d75w\" (UniqueName: \"kubernetes.io/projected/c58fe7b4-f5be-433f-8390-67dd8a62e81b-kube-api-access-2d75w\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.580271 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c58fe7b4-f5be-433f-8390-67dd8a62e81b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.580302 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-ssh-key\") pod 
\"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.580322 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c58fe7b4-f5be-433f-8390-67dd8a62e81b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.580375 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.580423 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.580745 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.580749 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c58fe7b4-f5be-433f-8390-67dd8a62e81b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " 
pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.580989 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c58fe7b4-f5be-433f-8390-67dd8a62e81b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.588604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.588844 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.598783 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d75w\" (UniqueName: \"kubernetes.io/projected/c58fe7b4-f5be-433f-8390-67dd8a62e81b-kube-api-access-2d75w\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.610727 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") " pod="openstack/tempest-tests-tempest" Apr 06 12:45:28 crc kubenswrapper[4790]: I0406 12:45:28.644071 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Apr 06 12:45:29 crc kubenswrapper[4790]: I0406 12:45:29.117710 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Apr 06 12:45:29 crc kubenswrapper[4790]: I0406 12:45:29.624349 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c58fe7b4-f5be-433f-8390-67dd8a62e81b","Type":"ContainerStarted","Data":"ba408e81d04031aeaf4386063a5af55e690500a4f14ce64eaa6ec7a3b34d9d26"} Apr 06 12:45:39 crc kubenswrapper[4790]: I0406 12:45:39.739652 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c58fe7b4-f5be-433f-8390-67dd8a62e81b","Type":"ContainerStarted","Data":"3b224b5eb6b48dd29ac414b0ba77ad2f55e7a13fff00ae7f3439bd27bdc0a18e"} Apr 06 12:45:39 crc kubenswrapper[4790]: I0406 12:45:39.753974 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:45:39 crc kubenswrapper[4790]: I0406 12:45:39.754029 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:45:39 crc kubenswrapper[4790]: I0406 12:45:39.761671 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.271753988 podStartE2EDuration="12.761650143s" podCreationTimestamp="2026-04-06 12:45:27 +0000 UTC" firstStartedPulling="2026-04-06 12:45:29.124495204 +0000 UTC m=+2908.112238070" lastFinishedPulling="2026-04-06 
12:45:38.614391359 +0000 UTC m=+2917.602134225" observedRunningTime="2026-04-06 12:45:39.760438761 +0000 UTC m=+2918.748181637" watchObservedRunningTime="2026-04-06 12:45:39.761650143 +0000 UTC m=+2918.749392999" Apr 06 12:46:00 crc kubenswrapper[4790]: I0406 12:46:00.165936 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591326-vrcnz"] Apr 06 12:46:00 crc kubenswrapper[4790]: I0406 12:46:00.169226 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591326-vrcnz" Apr 06 12:46:00 crc kubenswrapper[4790]: I0406 12:46:00.171757 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:46:00 crc kubenswrapper[4790]: I0406 12:46:00.172976 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:46:00 crc kubenswrapper[4790]: I0406 12:46:00.172995 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:46:00 crc kubenswrapper[4790]: I0406 12:46:00.197266 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591326-vrcnz"] Apr 06 12:46:00 crc kubenswrapper[4790]: I0406 12:46:00.280273 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pknsm\" (UniqueName: \"kubernetes.io/projected/4974ec07-5087-408a-9cfe-644f77cae8e1-kube-api-access-pknsm\") pod \"auto-csr-approver-29591326-vrcnz\" (UID: \"4974ec07-5087-408a-9cfe-644f77cae8e1\") " pod="openshift-infra/auto-csr-approver-29591326-vrcnz" Apr 06 12:46:00 crc kubenswrapper[4790]: I0406 12:46:00.382030 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pknsm\" (UniqueName: \"kubernetes.io/projected/4974ec07-5087-408a-9cfe-644f77cae8e1-kube-api-access-pknsm\") pod 
\"auto-csr-approver-29591326-vrcnz\" (UID: \"4974ec07-5087-408a-9cfe-644f77cae8e1\") " pod="openshift-infra/auto-csr-approver-29591326-vrcnz" Apr 06 12:46:00 crc kubenswrapper[4790]: I0406 12:46:00.427680 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pknsm\" (UniqueName: \"kubernetes.io/projected/4974ec07-5087-408a-9cfe-644f77cae8e1-kube-api-access-pknsm\") pod \"auto-csr-approver-29591326-vrcnz\" (UID: \"4974ec07-5087-408a-9cfe-644f77cae8e1\") " pod="openshift-infra/auto-csr-approver-29591326-vrcnz" Apr 06 12:46:00 crc kubenswrapper[4790]: I0406 12:46:00.490788 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591326-vrcnz" Apr 06 12:46:01 crc kubenswrapper[4790]: I0406 12:46:01.005330 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591326-vrcnz"] Apr 06 12:46:01 crc kubenswrapper[4790]: I0406 12:46:01.969866 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591326-vrcnz" event={"ID":"4974ec07-5087-408a-9cfe-644f77cae8e1","Type":"ContainerStarted","Data":"2921a77a3687e0418e73930d0406e699285c5b09cfbce23ef78064c79c6038a3"} Apr 06 12:46:02 crc kubenswrapper[4790]: I0406 12:46:02.982281 4790 generic.go:334] "Generic (PLEG): container finished" podID="4974ec07-5087-408a-9cfe-644f77cae8e1" containerID="3b6f0ad4a78833c522b406bd16acb5328f7e2ee4c04f8c63b5a0045d902d1e82" exitCode=0 Apr 06 12:46:02 crc kubenswrapper[4790]: I0406 12:46:02.982359 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591326-vrcnz" event={"ID":"4974ec07-5087-408a-9cfe-644f77cae8e1","Type":"ContainerDied","Data":"3b6f0ad4a78833c522b406bd16acb5328f7e2ee4c04f8c63b5a0045d902d1e82"} Apr 06 12:46:04 crc kubenswrapper[4790]: I0406 12:46:04.371095 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591326-vrcnz" Apr 06 12:46:04 crc kubenswrapper[4790]: I0406 12:46:04.473723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pknsm\" (UniqueName: \"kubernetes.io/projected/4974ec07-5087-408a-9cfe-644f77cae8e1-kube-api-access-pknsm\") pod \"4974ec07-5087-408a-9cfe-644f77cae8e1\" (UID: \"4974ec07-5087-408a-9cfe-644f77cae8e1\") " Apr 06 12:46:04 crc kubenswrapper[4790]: I0406 12:46:04.480099 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4974ec07-5087-408a-9cfe-644f77cae8e1-kube-api-access-pknsm" (OuterVolumeSpecName: "kube-api-access-pknsm") pod "4974ec07-5087-408a-9cfe-644f77cae8e1" (UID: "4974ec07-5087-408a-9cfe-644f77cae8e1"). InnerVolumeSpecName "kube-api-access-pknsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:46:04 crc kubenswrapper[4790]: I0406 12:46:04.577125 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pknsm\" (UniqueName: \"kubernetes.io/projected/4974ec07-5087-408a-9cfe-644f77cae8e1-kube-api-access-pknsm\") on node \"crc\" DevicePath \"\"" Apr 06 12:46:05 crc kubenswrapper[4790]: I0406 12:46:05.019548 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591326-vrcnz" event={"ID":"4974ec07-5087-408a-9cfe-644f77cae8e1","Type":"ContainerDied","Data":"2921a77a3687e0418e73930d0406e699285c5b09cfbce23ef78064c79c6038a3"} Apr 06 12:46:05 crc kubenswrapper[4790]: I0406 12:46:05.019586 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2921a77a3687e0418e73930d0406e699285c5b09cfbce23ef78064c79c6038a3" Apr 06 12:46:05 crc kubenswrapper[4790]: I0406 12:46:05.019596 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591326-vrcnz" Apr 06 12:46:05 crc kubenswrapper[4790]: I0406 12:46:05.451574 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591320-flf5j"] Apr 06 12:46:05 crc kubenswrapper[4790]: I0406 12:46:05.461262 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591320-flf5j"] Apr 06 12:46:05 crc kubenswrapper[4790]: I0406 12:46:05.688214 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6920e81-7fd5-418a-966e-6b81a157ffa1" path="/var/lib/kubelet/pods/a6920e81-7fd5-418a-966e-6b81a157ffa1/volumes" Apr 06 12:46:09 crc kubenswrapper[4790]: I0406 12:46:09.753991 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:46:09 crc kubenswrapper[4790]: I0406 12:46:09.754586 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:46:09 crc kubenswrapper[4790]: I0406 12:46:09.754641 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 12:46:09 crc kubenswrapper[4790]: I0406 12:46:09.755607 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a02feda4773bfe4b216d2a90bd93f174a216aa7afd6f0bbb85f52142698b47a8"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 12:46:09 crc kubenswrapper[4790]: I0406 12:46:09.755685 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://a02feda4773bfe4b216d2a90bd93f174a216aa7afd6f0bbb85f52142698b47a8" gracePeriod=600 Apr 06 12:46:10 crc kubenswrapper[4790]: I0406 12:46:10.072077 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="a02feda4773bfe4b216d2a90bd93f174a216aa7afd6f0bbb85f52142698b47a8" exitCode=0 Apr 06 12:46:10 crc kubenswrapper[4790]: I0406 12:46:10.072146 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"a02feda4773bfe4b216d2a90bd93f174a216aa7afd6f0bbb85f52142698b47a8"} Apr 06 12:46:10 crc kubenswrapper[4790]: I0406 12:46:10.072463 4790 scope.go:117] "RemoveContainer" containerID="b49d9d0b8ccfc42c61f7a569e6c6b5effde5ea59afe1788feff8e3933531b0ab" Apr 06 12:46:11 crc kubenswrapper[4790]: I0406 12:46:11.089318 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4"} Apr 06 12:46:16 crc kubenswrapper[4790]: I0406 12:46:16.237182 4790 scope.go:117] "RemoveContainer" containerID="84c256f14e93386c207552bf2d69080015e53340a4a7d8d893de6b41c0794f2a" Apr 06 12:48:00 crc kubenswrapper[4790]: I0406 12:48:00.152259 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591328-z57td"] Apr 06 12:48:00 crc kubenswrapper[4790]: E0406 
12:48:00.153355 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4974ec07-5087-408a-9cfe-644f77cae8e1" containerName="oc" Apr 06 12:48:00 crc kubenswrapper[4790]: I0406 12:48:00.153368 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4974ec07-5087-408a-9cfe-644f77cae8e1" containerName="oc" Apr 06 12:48:00 crc kubenswrapper[4790]: I0406 12:48:00.153566 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4974ec07-5087-408a-9cfe-644f77cae8e1" containerName="oc" Apr 06 12:48:00 crc kubenswrapper[4790]: I0406 12:48:00.154291 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591328-z57td" Apr 06 12:48:00 crc kubenswrapper[4790]: I0406 12:48:00.156432 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:48:00 crc kubenswrapper[4790]: I0406 12:48:00.157639 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:48:00 crc kubenswrapper[4790]: I0406 12:48:00.158713 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:48:00 crc kubenswrapper[4790]: I0406 12:48:00.162486 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591328-z57td"] Apr 06 12:48:00 crc kubenswrapper[4790]: I0406 12:48:00.244405 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57gjs\" (UniqueName: \"kubernetes.io/projected/e385359c-ece1-46b6-be5a-f9459a87b2b4-kube-api-access-57gjs\") pod \"auto-csr-approver-29591328-z57td\" (UID: \"e385359c-ece1-46b6-be5a-f9459a87b2b4\") " pod="openshift-infra/auto-csr-approver-29591328-z57td" Apr 06 12:48:00 crc kubenswrapper[4790]: I0406 12:48:00.348409 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-57gjs\" (UniqueName: \"kubernetes.io/projected/e385359c-ece1-46b6-be5a-f9459a87b2b4-kube-api-access-57gjs\") pod \"auto-csr-approver-29591328-z57td\" (UID: \"e385359c-ece1-46b6-be5a-f9459a87b2b4\") " pod="openshift-infra/auto-csr-approver-29591328-z57td" Apr 06 12:48:00 crc kubenswrapper[4790]: I0406 12:48:00.382639 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57gjs\" (UniqueName: \"kubernetes.io/projected/e385359c-ece1-46b6-be5a-f9459a87b2b4-kube-api-access-57gjs\") pod \"auto-csr-approver-29591328-z57td\" (UID: \"e385359c-ece1-46b6-be5a-f9459a87b2b4\") " pod="openshift-infra/auto-csr-approver-29591328-z57td" Apr 06 12:48:00 crc kubenswrapper[4790]: I0406 12:48:00.474997 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591328-z57td" Apr 06 12:48:00 crc kubenswrapper[4790]: I0406 12:48:00.997443 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591328-z57td"] Apr 06 12:48:00 crc kubenswrapper[4790]: W0406 12:48:00.998129 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode385359c_ece1_46b6_be5a_f9459a87b2b4.slice/crio-3699c82ff5591ac8dec24d929c7a5d01e415b24c28f840689ffa4d8ec0933d69 WatchSource:0}: Error finding container 3699c82ff5591ac8dec24d929c7a5d01e415b24c28f840689ffa4d8ec0933d69: Status 404 returned error can't find the container with id 3699c82ff5591ac8dec24d929c7a5d01e415b24c28f840689ffa4d8ec0933d69 Apr 06 12:48:01 crc kubenswrapper[4790]: I0406 12:48:01.213378 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591328-z57td" event={"ID":"e385359c-ece1-46b6-be5a-f9459a87b2b4","Type":"ContainerStarted","Data":"3699c82ff5591ac8dec24d929c7a5d01e415b24c28f840689ffa4d8ec0933d69"} Apr 06 12:48:02 crc kubenswrapper[4790]: I0406 12:48:02.222944 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591328-z57td" event={"ID":"e385359c-ece1-46b6-be5a-f9459a87b2b4","Type":"ContainerStarted","Data":"6a5d02d3318596a24aeec648b0cd53eb30372e9a35902f8dd1961824f545bbe7"} Apr 06 12:48:02 crc kubenswrapper[4790]: I0406 12:48:02.240747 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591328-z57td" podStartSLOduration=1.423736344 podStartE2EDuration="2.240724012s" podCreationTimestamp="2026-04-06 12:48:00 +0000 UTC" firstStartedPulling="2026-04-06 12:48:01.002820962 +0000 UTC m=+3059.990563838" lastFinishedPulling="2026-04-06 12:48:01.81980862 +0000 UTC m=+3060.807551506" observedRunningTime="2026-04-06 12:48:02.233304084 +0000 UTC m=+3061.221046950" watchObservedRunningTime="2026-04-06 12:48:02.240724012 +0000 UTC m=+3061.228466878" Apr 06 12:48:03 crc kubenswrapper[4790]: I0406 12:48:03.233363 4790 generic.go:334] "Generic (PLEG): container finished" podID="e385359c-ece1-46b6-be5a-f9459a87b2b4" containerID="6a5d02d3318596a24aeec648b0cd53eb30372e9a35902f8dd1961824f545bbe7" exitCode=0 Apr 06 12:48:03 crc kubenswrapper[4790]: I0406 12:48:03.233482 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591328-z57td" event={"ID":"e385359c-ece1-46b6-be5a-f9459a87b2b4","Type":"ContainerDied","Data":"6a5d02d3318596a24aeec648b0cd53eb30372e9a35902f8dd1961824f545bbe7"} Apr 06 12:48:04 crc kubenswrapper[4790]: I0406 12:48:04.683508 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591328-z57td" Apr 06 12:48:04 crc kubenswrapper[4790]: I0406 12:48:04.779967 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591322-jn5fr"] Apr 06 12:48:04 crc kubenswrapper[4790]: I0406 12:48:04.790925 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591322-jn5fr"] Apr 06 12:48:04 crc kubenswrapper[4790]: I0406 12:48:04.839358 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57gjs\" (UniqueName: \"kubernetes.io/projected/e385359c-ece1-46b6-be5a-f9459a87b2b4-kube-api-access-57gjs\") pod \"e385359c-ece1-46b6-be5a-f9459a87b2b4\" (UID: \"e385359c-ece1-46b6-be5a-f9459a87b2b4\") " Apr 06 12:48:04 crc kubenswrapper[4790]: I0406 12:48:04.844904 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e385359c-ece1-46b6-be5a-f9459a87b2b4-kube-api-access-57gjs" (OuterVolumeSpecName: "kube-api-access-57gjs") pod "e385359c-ece1-46b6-be5a-f9459a87b2b4" (UID: "e385359c-ece1-46b6-be5a-f9459a87b2b4"). InnerVolumeSpecName "kube-api-access-57gjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:48:04 crc kubenswrapper[4790]: I0406 12:48:04.942110 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57gjs\" (UniqueName: \"kubernetes.io/projected/e385359c-ece1-46b6-be5a-f9459a87b2b4-kube-api-access-57gjs\") on node \"crc\" DevicePath \"\"" Apr 06 12:48:05 crc kubenswrapper[4790]: I0406 12:48:05.260547 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591328-z57td" event={"ID":"e385359c-ece1-46b6-be5a-f9459a87b2b4","Type":"ContainerDied","Data":"3699c82ff5591ac8dec24d929c7a5d01e415b24c28f840689ffa4d8ec0933d69"} Apr 06 12:48:05 crc kubenswrapper[4790]: I0406 12:48:05.260604 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3699c82ff5591ac8dec24d929c7a5d01e415b24c28f840689ffa4d8ec0933d69" Apr 06 12:48:05 crc kubenswrapper[4790]: I0406 12:48:05.260682 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591328-z57td" Apr 06 12:48:05 crc kubenswrapper[4790]: I0406 12:48:05.686021 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceddc141-ac18-43f1-992b-c926d228d935" path="/var/lib/kubelet/pods/ceddc141-ac18-43f1-992b-c926d228d935/volumes" Apr 06 12:48:16 crc kubenswrapper[4790]: I0406 12:48:16.353444 4790 scope.go:117] "RemoveContainer" containerID="d0ac2cb5794a4855b19caf0710434fbf2f549022d5da222fbe496e7b7350adf5" Apr 06 12:48:39 crc kubenswrapper[4790]: I0406 12:48:39.754094 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:48:39 crc kubenswrapper[4790]: I0406 12:48:39.754710 4790 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:49:09 crc kubenswrapper[4790]: I0406 12:49:09.753849 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:49:09 crc kubenswrapper[4790]: I0406 12:49:09.754325 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:49:39 crc kubenswrapper[4790]: I0406 12:49:39.753549 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:49:39 crc kubenswrapper[4790]: I0406 12:49:39.755116 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:49:39 crc kubenswrapper[4790]: I0406 12:49:39.755172 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 12:49:39 crc 
kubenswrapper[4790]: I0406 12:49:39.755939 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 12:49:39 crc kubenswrapper[4790]: I0406 12:49:39.755998 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" gracePeriod=600 Apr 06 12:49:39 crc kubenswrapper[4790]: E0406 12:49:39.884239 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:49:40 crc kubenswrapper[4790]: I0406 12:49:40.268539 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" exitCode=0 Apr 06 12:49:40 crc kubenswrapper[4790]: I0406 12:49:40.268605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4"} Apr 06 12:49:40 crc kubenswrapper[4790]: I0406 12:49:40.268661 4790 scope.go:117] "RemoveContainer" 
containerID="a02feda4773bfe4b216d2a90bd93f174a216aa7afd6f0bbb85f52142698b47a8" Apr 06 12:49:40 crc kubenswrapper[4790]: I0406 12:49:40.269726 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:49:40 crc kubenswrapper[4790]: E0406 12:49:40.270364 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:49:55 crc kubenswrapper[4790]: I0406 12:49:55.676246 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:49:55 crc kubenswrapper[4790]: E0406 12:49:55.677489 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:50:00 crc kubenswrapper[4790]: I0406 12:50:00.165028 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591330-jj5gp"] Apr 06 12:50:00 crc kubenswrapper[4790]: E0406 12:50:00.169244 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e385359c-ece1-46b6-be5a-f9459a87b2b4" containerName="oc" Apr 06 12:50:00 crc kubenswrapper[4790]: I0406 12:50:00.169459 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e385359c-ece1-46b6-be5a-f9459a87b2b4" containerName="oc" Apr 06 12:50:00 crc 
kubenswrapper[4790]: I0406 12:50:00.169953 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e385359c-ece1-46b6-be5a-f9459a87b2b4" containerName="oc" Apr 06 12:50:00 crc kubenswrapper[4790]: I0406 12:50:00.171059 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591330-jj5gp" Apr 06 12:50:00 crc kubenswrapper[4790]: I0406 12:50:00.176021 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:50:00 crc kubenswrapper[4790]: I0406 12:50:00.176702 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:50:00 crc kubenswrapper[4790]: I0406 12:50:00.178415 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:50:00 crc kubenswrapper[4790]: I0406 12:50:00.197649 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591330-jj5gp"] Apr 06 12:50:00 crc kubenswrapper[4790]: I0406 12:50:00.284756 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgqvz\" (UniqueName: \"kubernetes.io/projected/589d79d7-98b7-4959-9918-7458631e8ef1-kube-api-access-qgqvz\") pod \"auto-csr-approver-29591330-jj5gp\" (UID: \"589d79d7-98b7-4959-9918-7458631e8ef1\") " pod="openshift-infra/auto-csr-approver-29591330-jj5gp" Apr 06 12:50:00 crc kubenswrapper[4790]: I0406 12:50:00.386659 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgqvz\" (UniqueName: \"kubernetes.io/projected/589d79d7-98b7-4959-9918-7458631e8ef1-kube-api-access-qgqvz\") pod \"auto-csr-approver-29591330-jj5gp\" (UID: \"589d79d7-98b7-4959-9918-7458631e8ef1\") " pod="openshift-infra/auto-csr-approver-29591330-jj5gp" Apr 06 12:50:00 crc kubenswrapper[4790]: I0406 12:50:00.408997 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgqvz\" (UniqueName: \"kubernetes.io/projected/589d79d7-98b7-4959-9918-7458631e8ef1-kube-api-access-qgqvz\") pod \"auto-csr-approver-29591330-jj5gp\" (UID: \"589d79d7-98b7-4959-9918-7458631e8ef1\") " pod="openshift-infra/auto-csr-approver-29591330-jj5gp" Apr 06 12:50:00 crc kubenswrapper[4790]: I0406 12:50:00.506456 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591330-jj5gp" Apr 06 12:50:00 crc kubenswrapper[4790]: I0406 12:50:00.980411 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591330-jj5gp"] Apr 06 12:50:00 crc kubenswrapper[4790]: I0406 12:50:00.981865 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 12:50:01 crc kubenswrapper[4790]: I0406 12:50:01.496231 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591330-jj5gp" event={"ID":"589d79d7-98b7-4959-9918-7458631e8ef1","Type":"ContainerStarted","Data":"9e4d3ca97de61253941f914cc1e86f3b0120c718b5b214cfbab9816582cf36f2"} Apr 06 12:50:03 crc kubenswrapper[4790]: I0406 12:50:03.517778 4790 generic.go:334] "Generic (PLEG): container finished" podID="589d79d7-98b7-4959-9918-7458631e8ef1" containerID="3f73127252cc52f1af39b0afd8cdf02281bd976f59ac49b6ced1accc31d4dede" exitCode=0 Apr 06 12:50:03 crc kubenswrapper[4790]: I0406 12:50:03.517881 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591330-jj5gp" event={"ID":"589d79d7-98b7-4959-9918-7458631e8ef1","Type":"ContainerDied","Data":"3f73127252cc52f1af39b0afd8cdf02281bd976f59ac49b6ced1accc31d4dede"} Apr 06 12:50:04 crc kubenswrapper[4790]: I0406 12:50:04.942983 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591330-jj5gp" Apr 06 12:50:05 crc kubenswrapper[4790]: I0406 12:50:05.083857 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgqvz\" (UniqueName: \"kubernetes.io/projected/589d79d7-98b7-4959-9918-7458631e8ef1-kube-api-access-qgqvz\") pod \"589d79d7-98b7-4959-9918-7458631e8ef1\" (UID: \"589d79d7-98b7-4959-9918-7458631e8ef1\") " Apr 06 12:50:05 crc kubenswrapper[4790]: I0406 12:50:05.088623 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/589d79d7-98b7-4959-9918-7458631e8ef1-kube-api-access-qgqvz" (OuterVolumeSpecName: "kube-api-access-qgqvz") pod "589d79d7-98b7-4959-9918-7458631e8ef1" (UID: "589d79d7-98b7-4959-9918-7458631e8ef1"). InnerVolumeSpecName "kube-api-access-qgqvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:50:05 crc kubenswrapper[4790]: I0406 12:50:05.186556 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgqvz\" (UniqueName: \"kubernetes.io/projected/589d79d7-98b7-4959-9918-7458631e8ef1-kube-api-access-qgqvz\") on node \"crc\" DevicePath \"\"" Apr 06 12:50:05 crc kubenswrapper[4790]: I0406 12:50:05.542446 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591330-jj5gp" event={"ID":"589d79d7-98b7-4959-9918-7458631e8ef1","Type":"ContainerDied","Data":"9e4d3ca97de61253941f914cc1e86f3b0120c718b5b214cfbab9816582cf36f2"} Apr 06 12:50:05 crc kubenswrapper[4790]: I0406 12:50:05.542507 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e4d3ca97de61253941f914cc1e86f3b0120c718b5b214cfbab9816582cf36f2" Apr 06 12:50:05 crc kubenswrapper[4790]: I0406 12:50:05.542582 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591330-jj5gp" Apr 06 12:50:06 crc kubenswrapper[4790]: I0406 12:50:06.017749 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591324-mbr26"] Apr 06 12:50:06 crc kubenswrapper[4790]: I0406 12:50:06.028753 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591324-mbr26"] Apr 06 12:50:07 crc kubenswrapper[4790]: I0406 12:50:07.692752 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb0cc9f5-3236-42b5-adc2-85cf616a156f" path="/var/lib/kubelet/pods/bb0cc9f5-3236-42b5-adc2-85cf616a156f/volumes" Apr 06 12:50:09 crc kubenswrapper[4790]: I0406 12:50:09.676757 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:50:09 crc kubenswrapper[4790]: E0406 12:50:09.677304 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:50:16 crc kubenswrapper[4790]: I0406 12:50:16.473163 4790 scope.go:117] "RemoveContainer" containerID="5feafff1976253487b3166ec4252b7c533c32366ca9bc0a11eec2c80a8ae8e2b" Apr 06 12:50:20 crc kubenswrapper[4790]: I0406 12:50:20.676182 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:50:20 crc kubenswrapper[4790]: E0406 12:50:20.677200 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:50:32 crc kubenswrapper[4790]: I0406 12:50:32.676134 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:50:32 crc kubenswrapper[4790]: E0406 12:50:32.678006 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:50:46 crc kubenswrapper[4790]: I0406 12:50:46.676096 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:50:46 crc kubenswrapper[4790]: E0406 12:50:46.676774 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:50:58 crc kubenswrapper[4790]: I0406 12:50:58.676069 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:50:58 crc kubenswrapper[4790]: E0406 12:50:58.677251 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:51:09 crc kubenswrapper[4790]: I0406 12:51:09.676906 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:51:09 crc kubenswrapper[4790]: E0406 12:51:09.678329 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:51:23 crc kubenswrapper[4790]: I0406 12:51:23.676069 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:51:23 crc kubenswrapper[4790]: E0406 12:51:23.677083 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:51:25 crc kubenswrapper[4790]: I0406 12:51:25.191680 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-djsqp"] Apr 06 12:51:25 crc kubenswrapper[4790]: E0406 12:51:25.194214 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="589d79d7-98b7-4959-9918-7458631e8ef1" containerName="oc" Apr 06 12:51:25 crc kubenswrapper[4790]: 
I0406 12:51:25.194233 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="589d79d7-98b7-4959-9918-7458631e8ef1" containerName="oc" Apr 06 12:51:25 crc kubenswrapper[4790]: I0406 12:51:25.194489 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="589d79d7-98b7-4959-9918-7458631e8ef1" containerName="oc" Apr 06 12:51:25 crc kubenswrapper[4790]: I0406 12:51:25.195901 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:25 crc kubenswrapper[4790]: I0406 12:51:25.209393 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-djsqp"] Apr 06 12:51:25 crc kubenswrapper[4790]: I0406 12:51:25.268846 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1a5d573-a090-4862-92e4-eb717720d897-catalog-content\") pod \"certified-operators-djsqp\" (UID: \"d1a5d573-a090-4862-92e4-eb717720d897\") " pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:25 crc kubenswrapper[4790]: I0406 12:51:25.269115 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd9vf\" (UniqueName: \"kubernetes.io/projected/d1a5d573-a090-4862-92e4-eb717720d897-kube-api-access-vd9vf\") pod \"certified-operators-djsqp\" (UID: \"d1a5d573-a090-4862-92e4-eb717720d897\") " pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:25 crc kubenswrapper[4790]: I0406 12:51:25.269597 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1a5d573-a090-4862-92e4-eb717720d897-utilities\") pod \"certified-operators-djsqp\" (UID: \"d1a5d573-a090-4862-92e4-eb717720d897\") " pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:25 crc kubenswrapper[4790]: I0406 
12:51:25.371768 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1a5d573-a090-4862-92e4-eb717720d897-utilities\") pod \"certified-operators-djsqp\" (UID: \"d1a5d573-a090-4862-92e4-eb717720d897\") " pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:25 crc kubenswrapper[4790]: I0406 12:51:25.371923 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1a5d573-a090-4862-92e4-eb717720d897-catalog-content\") pod \"certified-operators-djsqp\" (UID: \"d1a5d573-a090-4862-92e4-eb717720d897\") " pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:25 crc kubenswrapper[4790]: I0406 12:51:25.371975 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd9vf\" (UniqueName: \"kubernetes.io/projected/d1a5d573-a090-4862-92e4-eb717720d897-kube-api-access-vd9vf\") pod \"certified-operators-djsqp\" (UID: \"d1a5d573-a090-4862-92e4-eb717720d897\") " pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:25 crc kubenswrapper[4790]: I0406 12:51:25.372370 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1a5d573-a090-4862-92e4-eb717720d897-utilities\") pod \"certified-operators-djsqp\" (UID: \"d1a5d573-a090-4862-92e4-eb717720d897\") " pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:25 crc kubenswrapper[4790]: I0406 12:51:25.372460 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1a5d573-a090-4862-92e4-eb717720d897-catalog-content\") pod \"certified-operators-djsqp\" (UID: \"d1a5d573-a090-4862-92e4-eb717720d897\") " pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:25 crc kubenswrapper[4790]: I0406 12:51:25.402678 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd9vf\" (UniqueName: \"kubernetes.io/projected/d1a5d573-a090-4862-92e4-eb717720d897-kube-api-access-vd9vf\") pod \"certified-operators-djsqp\" (UID: \"d1a5d573-a090-4862-92e4-eb717720d897\") " pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:25 crc kubenswrapper[4790]: I0406 12:51:25.532275 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:26 crc kubenswrapper[4790]: I0406 12:51:26.106165 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-djsqp"] Apr 06 12:51:26 crc kubenswrapper[4790]: I0406 12:51:26.439269 4790 generic.go:334] "Generic (PLEG): container finished" podID="d1a5d573-a090-4862-92e4-eb717720d897" containerID="6ee4882fd152925b4873235638c8ed915920bc7510c44168733f18e06836bab5" exitCode=0 Apr 06 12:51:26 crc kubenswrapper[4790]: I0406 12:51:26.439315 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djsqp" event={"ID":"d1a5d573-a090-4862-92e4-eb717720d897","Type":"ContainerDied","Data":"6ee4882fd152925b4873235638c8ed915920bc7510c44168733f18e06836bab5"} Apr 06 12:51:26 crc kubenswrapper[4790]: I0406 12:51:26.439588 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djsqp" event={"ID":"d1a5d573-a090-4862-92e4-eb717720d897","Type":"ContainerStarted","Data":"5e82b60cf5989438b57368140726cef2bf84819e67477bc50ccf5bd365da91ad"} Apr 06 12:51:27 crc kubenswrapper[4790]: I0406 12:51:27.449921 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djsqp" event={"ID":"d1a5d573-a090-4862-92e4-eb717720d897","Type":"ContainerStarted","Data":"60e0747184f4ca7bcd78dd24debeb09a30080fc9f850df57fbda04e16c7ff557"} Apr 06 12:51:29 crc kubenswrapper[4790]: I0406 12:51:29.469522 4790 
generic.go:334] "Generic (PLEG): container finished" podID="d1a5d573-a090-4862-92e4-eb717720d897" containerID="60e0747184f4ca7bcd78dd24debeb09a30080fc9f850df57fbda04e16c7ff557" exitCode=0 Apr 06 12:51:29 crc kubenswrapper[4790]: I0406 12:51:29.469618 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djsqp" event={"ID":"d1a5d573-a090-4862-92e4-eb717720d897","Type":"ContainerDied","Data":"60e0747184f4ca7bcd78dd24debeb09a30080fc9f850df57fbda04e16c7ff557"} Apr 06 12:51:30 crc kubenswrapper[4790]: I0406 12:51:30.484854 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djsqp" event={"ID":"d1a5d573-a090-4862-92e4-eb717720d897","Type":"ContainerStarted","Data":"774daf87b8a27067ec60064487029a991487578c39d6b2a902ad6f436bd378ae"} Apr 06 12:51:30 crc kubenswrapper[4790]: I0406 12:51:30.510354 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-djsqp" podStartSLOduration=2.122767496 podStartE2EDuration="5.510332707s" podCreationTimestamp="2026-04-06 12:51:25 +0000 UTC" firstStartedPulling="2026-04-06 12:51:26.441126695 +0000 UTC m=+3265.428869561" lastFinishedPulling="2026-04-06 12:51:29.828691916 +0000 UTC m=+3268.816434772" observedRunningTime="2026-04-06 12:51:30.508294352 +0000 UTC m=+3269.496037218" watchObservedRunningTime="2026-04-06 12:51:30.510332707 +0000 UTC m=+3269.498075573" Apr 06 12:51:35 crc kubenswrapper[4790]: I0406 12:51:35.533084 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:35 crc kubenswrapper[4790]: I0406 12:51:35.533623 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:35 crc kubenswrapper[4790]: I0406 12:51:35.578546 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:36 crc kubenswrapper[4790]: I0406 12:51:36.608474 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:36 crc kubenswrapper[4790]: I0406 12:51:36.653675 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-djsqp"] Apr 06 12:51:37 crc kubenswrapper[4790]: I0406 12:51:37.675918 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:51:37 crc kubenswrapper[4790]: E0406 12:51:37.676388 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:51:38 crc kubenswrapper[4790]: I0406 12:51:38.574273 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-djsqp" podUID="d1a5d573-a090-4862-92e4-eb717720d897" containerName="registry-server" containerID="cri-o://774daf87b8a27067ec60064487029a991487578c39d6b2a902ad6f436bd378ae" gracePeriod=2 Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.117513 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.275172 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1a5d573-a090-4862-92e4-eb717720d897-catalog-content\") pod \"d1a5d573-a090-4862-92e4-eb717720d897\" (UID: \"d1a5d573-a090-4862-92e4-eb717720d897\") " Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.275244 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1a5d573-a090-4862-92e4-eb717720d897-utilities\") pod \"d1a5d573-a090-4862-92e4-eb717720d897\" (UID: \"d1a5d573-a090-4862-92e4-eb717720d897\") " Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.275267 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd9vf\" (UniqueName: \"kubernetes.io/projected/d1a5d573-a090-4862-92e4-eb717720d897-kube-api-access-vd9vf\") pod \"d1a5d573-a090-4862-92e4-eb717720d897\" (UID: \"d1a5d573-a090-4862-92e4-eb717720d897\") " Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.276924 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1a5d573-a090-4862-92e4-eb717720d897-utilities" (OuterVolumeSpecName: "utilities") pod "d1a5d573-a090-4862-92e4-eb717720d897" (UID: "d1a5d573-a090-4862-92e4-eb717720d897"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.285029 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a5d573-a090-4862-92e4-eb717720d897-kube-api-access-vd9vf" (OuterVolumeSpecName: "kube-api-access-vd9vf") pod "d1a5d573-a090-4862-92e4-eb717720d897" (UID: "d1a5d573-a090-4862-92e4-eb717720d897"). InnerVolumeSpecName "kube-api-access-vd9vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.344241 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1a5d573-a090-4862-92e4-eb717720d897-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1a5d573-a090-4862-92e4-eb717720d897" (UID: "d1a5d573-a090-4862-92e4-eb717720d897"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.378246 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1a5d573-a090-4862-92e4-eb717720d897-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.378293 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1a5d573-a090-4862-92e4-eb717720d897-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.378311 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd9vf\" (UniqueName: \"kubernetes.io/projected/d1a5d573-a090-4862-92e4-eb717720d897-kube-api-access-vd9vf\") on node \"crc\" DevicePath \"\"" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.588204 4790 generic.go:334] "Generic (PLEG): container finished" podID="d1a5d573-a090-4862-92e4-eb717720d897" containerID="774daf87b8a27067ec60064487029a991487578c39d6b2a902ad6f436bd378ae" exitCode=0 Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.588264 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-djsqp" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.588287 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djsqp" event={"ID":"d1a5d573-a090-4862-92e4-eb717720d897","Type":"ContainerDied","Data":"774daf87b8a27067ec60064487029a991487578c39d6b2a902ad6f436bd378ae"} Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.588726 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-djsqp" event={"ID":"d1a5d573-a090-4862-92e4-eb717720d897","Type":"ContainerDied","Data":"5e82b60cf5989438b57368140726cef2bf84819e67477bc50ccf5bd365da91ad"} Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.588753 4790 scope.go:117] "RemoveContainer" containerID="774daf87b8a27067ec60064487029a991487578c39d6b2a902ad6f436bd378ae" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.616995 4790 scope.go:117] "RemoveContainer" containerID="60e0747184f4ca7bcd78dd24debeb09a30080fc9f850df57fbda04e16c7ff557" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.646968 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-djsqp"] Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.653765 4790 scope.go:117] "RemoveContainer" containerID="6ee4882fd152925b4873235638c8ed915920bc7510c44168733f18e06836bab5" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.657368 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-djsqp"] Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.717916 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1a5d573-a090-4862-92e4-eb717720d897" path="/var/lib/kubelet/pods/d1a5d573-a090-4862-92e4-eb717720d897/volumes" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.741963 4790 scope.go:117] "RemoveContainer" 
containerID="774daf87b8a27067ec60064487029a991487578c39d6b2a902ad6f436bd378ae" Apr 06 12:51:39 crc kubenswrapper[4790]: E0406 12:51:39.745653 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"774daf87b8a27067ec60064487029a991487578c39d6b2a902ad6f436bd378ae\": container with ID starting with 774daf87b8a27067ec60064487029a991487578c39d6b2a902ad6f436bd378ae not found: ID does not exist" containerID="774daf87b8a27067ec60064487029a991487578c39d6b2a902ad6f436bd378ae" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.745761 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"774daf87b8a27067ec60064487029a991487578c39d6b2a902ad6f436bd378ae"} err="failed to get container status \"774daf87b8a27067ec60064487029a991487578c39d6b2a902ad6f436bd378ae\": rpc error: code = NotFound desc = could not find container \"774daf87b8a27067ec60064487029a991487578c39d6b2a902ad6f436bd378ae\": container with ID starting with 774daf87b8a27067ec60064487029a991487578c39d6b2a902ad6f436bd378ae not found: ID does not exist" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.745790 4790 scope.go:117] "RemoveContainer" containerID="60e0747184f4ca7bcd78dd24debeb09a30080fc9f850df57fbda04e16c7ff557" Apr 06 12:51:39 crc kubenswrapper[4790]: E0406 12:51:39.746275 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60e0747184f4ca7bcd78dd24debeb09a30080fc9f850df57fbda04e16c7ff557\": container with ID starting with 60e0747184f4ca7bcd78dd24debeb09a30080fc9f850df57fbda04e16c7ff557 not found: ID does not exist" containerID="60e0747184f4ca7bcd78dd24debeb09a30080fc9f850df57fbda04e16c7ff557" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.746330 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"60e0747184f4ca7bcd78dd24debeb09a30080fc9f850df57fbda04e16c7ff557"} err="failed to get container status \"60e0747184f4ca7bcd78dd24debeb09a30080fc9f850df57fbda04e16c7ff557\": rpc error: code = NotFound desc = could not find container \"60e0747184f4ca7bcd78dd24debeb09a30080fc9f850df57fbda04e16c7ff557\": container with ID starting with 60e0747184f4ca7bcd78dd24debeb09a30080fc9f850df57fbda04e16c7ff557 not found: ID does not exist" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.746359 4790 scope.go:117] "RemoveContainer" containerID="6ee4882fd152925b4873235638c8ed915920bc7510c44168733f18e06836bab5" Apr 06 12:51:39 crc kubenswrapper[4790]: E0406 12:51:39.746799 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee4882fd152925b4873235638c8ed915920bc7510c44168733f18e06836bab5\": container with ID starting with 6ee4882fd152925b4873235638c8ed915920bc7510c44168733f18e06836bab5 not found: ID does not exist" containerID="6ee4882fd152925b4873235638c8ed915920bc7510c44168733f18e06836bab5" Apr 06 12:51:39 crc kubenswrapper[4790]: I0406 12:51:39.746902 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee4882fd152925b4873235638c8ed915920bc7510c44168733f18e06836bab5"} err="failed to get container status \"6ee4882fd152925b4873235638c8ed915920bc7510c44168733f18e06836bab5\": rpc error: code = NotFound desc = could not find container \"6ee4882fd152925b4873235638c8ed915920bc7510c44168733f18e06836bab5\": container with ID starting with 6ee4882fd152925b4873235638c8ed915920bc7510c44168733f18e06836bab5 not found: ID does not exist" Apr 06 12:51:49 crc kubenswrapper[4790]: I0406 12:51:49.676088 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:51:49 crc kubenswrapper[4790]: E0406 12:51:49.677286 4790 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 12:52:00.144543 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591332-fbpvh"] Apr 06 12:52:00 crc kubenswrapper[4790]: E0406 12:52:00.145628 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a5d573-a090-4862-92e4-eb717720d897" containerName="extract-utilities" Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 12:52:00.145648 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a5d573-a090-4862-92e4-eb717720d897" containerName="extract-utilities" Apr 06 12:52:00 crc kubenswrapper[4790]: E0406 12:52:00.145664 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a5d573-a090-4862-92e4-eb717720d897" containerName="registry-server" Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 12:52:00.145673 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a5d573-a090-4862-92e4-eb717720d897" containerName="registry-server" Apr 06 12:52:00 crc kubenswrapper[4790]: E0406 12:52:00.145695 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a5d573-a090-4862-92e4-eb717720d897" containerName="extract-content" Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 12:52:00.145704 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a5d573-a090-4862-92e4-eb717720d897" containerName="extract-content" Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 12:52:00.146019 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a5d573-a090-4862-92e4-eb717720d897" containerName="registry-server" Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 
12:52:00.146825 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591332-fbpvh" Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 12:52:00.149281 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 12:52:00.149351 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 12:52:00.149786 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 12:52:00.163770 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591332-fbpvh"] Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 12:52:00.267295 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxqz4\" (UniqueName: \"kubernetes.io/projected/d1809bcc-7651-4043-a338-e0f0d5bcf115-kube-api-access-qxqz4\") pod \"auto-csr-approver-29591332-fbpvh\" (UID: \"d1809bcc-7651-4043-a338-e0f0d5bcf115\") " pod="openshift-infra/auto-csr-approver-29591332-fbpvh" Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 12:52:00.370346 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxqz4\" (UniqueName: \"kubernetes.io/projected/d1809bcc-7651-4043-a338-e0f0d5bcf115-kube-api-access-qxqz4\") pod \"auto-csr-approver-29591332-fbpvh\" (UID: \"d1809bcc-7651-4043-a338-e0f0d5bcf115\") " pod="openshift-infra/auto-csr-approver-29591332-fbpvh" Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 12:52:00.444084 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxqz4\" (UniqueName: \"kubernetes.io/projected/d1809bcc-7651-4043-a338-e0f0d5bcf115-kube-api-access-qxqz4\") pod 
\"auto-csr-approver-29591332-fbpvh\" (UID: \"d1809bcc-7651-4043-a338-e0f0d5bcf115\") " pod="openshift-infra/auto-csr-approver-29591332-fbpvh" Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 12:52:00.467149 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591332-fbpvh" Apr 06 12:52:00 crc kubenswrapper[4790]: I0406 12:52:00.959873 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591332-fbpvh"] Apr 06 12:52:01 crc kubenswrapper[4790]: I0406 12:52:01.819623 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591332-fbpvh" event={"ID":"d1809bcc-7651-4043-a338-e0f0d5bcf115","Type":"ContainerStarted","Data":"1dc7b6080144e378b9b559eb7d373f97c2457f22d5dbc2d94ef99c4ffda23b21"} Apr 06 12:52:02 crc kubenswrapper[4790]: I0406 12:52:02.833531 4790 generic.go:334] "Generic (PLEG): container finished" podID="d1809bcc-7651-4043-a338-e0f0d5bcf115" containerID="519b03e39ed8835d8031a2634b1c0ab325bec95d9f706006e4dca0fa3dd3ff80" exitCode=0 Apr 06 12:52:02 crc kubenswrapper[4790]: I0406 12:52:02.833587 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591332-fbpvh" event={"ID":"d1809bcc-7651-4043-a338-e0f0d5bcf115","Type":"ContainerDied","Data":"519b03e39ed8835d8031a2634b1c0ab325bec95d9f706006e4dca0fa3dd3ff80"} Apr 06 12:52:04 crc kubenswrapper[4790]: I0406 12:52:04.200858 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591332-fbpvh" Apr 06 12:52:04 crc kubenswrapper[4790]: I0406 12:52:04.354058 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxqz4\" (UniqueName: \"kubernetes.io/projected/d1809bcc-7651-4043-a338-e0f0d5bcf115-kube-api-access-qxqz4\") pod \"d1809bcc-7651-4043-a338-e0f0d5bcf115\" (UID: \"d1809bcc-7651-4043-a338-e0f0d5bcf115\") " Apr 06 12:52:04 crc kubenswrapper[4790]: I0406 12:52:04.360539 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1809bcc-7651-4043-a338-e0f0d5bcf115-kube-api-access-qxqz4" (OuterVolumeSpecName: "kube-api-access-qxqz4") pod "d1809bcc-7651-4043-a338-e0f0d5bcf115" (UID: "d1809bcc-7651-4043-a338-e0f0d5bcf115"). InnerVolumeSpecName "kube-api-access-qxqz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:52:04 crc kubenswrapper[4790]: I0406 12:52:04.456978 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxqz4\" (UniqueName: \"kubernetes.io/projected/d1809bcc-7651-4043-a338-e0f0d5bcf115-kube-api-access-qxqz4\") on node \"crc\" DevicePath \"\"" Apr 06 12:52:04 crc kubenswrapper[4790]: I0406 12:52:04.675706 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:52:04 crc kubenswrapper[4790]: E0406 12:52:04.676182 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:52:04 crc kubenswrapper[4790]: I0406 12:52:04.869100 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29591332-fbpvh" event={"ID":"d1809bcc-7651-4043-a338-e0f0d5bcf115","Type":"ContainerDied","Data":"1dc7b6080144e378b9b559eb7d373f97c2457f22d5dbc2d94ef99c4ffda23b21"} Apr 06 12:52:04 crc kubenswrapper[4790]: I0406 12:52:04.869145 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dc7b6080144e378b9b559eb7d373f97c2457f22d5dbc2d94ef99c4ffda23b21" Apr 06 12:52:04 crc kubenswrapper[4790]: I0406 12:52:04.869285 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591332-fbpvh" Apr 06 12:52:05 crc kubenswrapper[4790]: I0406 12:52:05.287020 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591326-vrcnz"] Apr 06 12:52:05 crc kubenswrapper[4790]: I0406 12:52:05.296407 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591326-vrcnz"] Apr 06 12:52:05 crc kubenswrapper[4790]: I0406 12:52:05.694590 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4974ec07-5087-408a-9cfe-644f77cae8e1" path="/var/lib/kubelet/pods/4974ec07-5087-408a-9cfe-644f77cae8e1/volumes" Apr 06 12:52:16 crc kubenswrapper[4790]: I0406 12:52:16.609419 4790 scope.go:117] "RemoveContainer" containerID="3b6f0ad4a78833c522b406bd16acb5328f7e2ee4c04f8c63b5a0045d902d1e82" Apr 06 12:52:17 crc kubenswrapper[4790]: I0406 12:52:17.676349 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:52:17 crc kubenswrapper[4790]: E0406 12:52:17.677020 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:52:31 crc kubenswrapper[4790]: I0406 12:52:31.693939 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:52:31 crc kubenswrapper[4790]: E0406 12:52:31.694974 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:52:42 crc kubenswrapper[4790]: I0406 12:52:42.675616 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:52:42 crc kubenswrapper[4790]: E0406 12:52:42.676416 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:52:53 crc kubenswrapper[4790]: I0406 12:52:53.675578 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:52:53 crc kubenswrapper[4790]: E0406 12:52:53.676335 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.376235 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2bxfc"] Apr 06 12:52:58 crc kubenswrapper[4790]: E0406 12:52:58.377413 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1809bcc-7651-4043-a338-e0f0d5bcf115" containerName="oc" Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.377431 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1809bcc-7651-4043-a338-e0f0d5bcf115" containerName="oc" Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.377684 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1809bcc-7651-4043-a338-e0f0d5bcf115" containerName="oc" Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.379712 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.398965 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2bxfc"] Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.557714 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-catalog-content\") pod \"redhat-operators-2bxfc\" (UID: \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\") " pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.557816 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-utilities\") pod \"redhat-operators-2bxfc\" (UID: \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\") " pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.558268 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lr58\" (UniqueName: \"kubernetes.io/projected/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-kube-api-access-7lr58\") pod \"redhat-operators-2bxfc\" (UID: \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\") " pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.659904 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-catalog-content\") pod \"redhat-operators-2bxfc\" (UID: \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\") " pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.659979 4790 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-utilities\") pod \"redhat-operators-2bxfc\" (UID: \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\") " pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.660068 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lr58\" (UniqueName: \"kubernetes.io/projected/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-kube-api-access-7lr58\") pod \"redhat-operators-2bxfc\" (UID: \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\") " pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.660614 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-catalog-content\") pod \"redhat-operators-2bxfc\" (UID: \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\") " pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.660680 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-utilities\") pod \"redhat-operators-2bxfc\" (UID: \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\") " pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.696454 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lr58\" (UniqueName: \"kubernetes.io/projected/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-kube-api-access-7lr58\") pod \"redhat-operators-2bxfc\" (UID: \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\") " pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:52:58 crc kubenswrapper[4790]: I0406 12:52:58.702401 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:52:59 crc kubenswrapper[4790]: I0406 12:52:59.268028 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2bxfc"] Apr 06 12:52:59 crc kubenswrapper[4790]: I0406 12:52:59.526553 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2bxfc" event={"ID":"182c63aa-5a5e-4163-be5a-3bd36ed45ed8","Type":"ContainerStarted","Data":"482d7b242069b6bee24c4c1a29e3361c32d450a8338cf738746055c100457813"} Apr 06 12:53:00 crc kubenswrapper[4790]: I0406 12:53:00.548059 4790 generic.go:334] "Generic (PLEG): container finished" podID="182c63aa-5a5e-4163-be5a-3bd36ed45ed8" containerID="0019d1b0f72ef602084ec679ef6d6da195178331a7206f0484bd5ef88fe51483" exitCode=0 Apr 06 12:53:00 crc kubenswrapper[4790]: I0406 12:53:00.548372 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2bxfc" event={"ID":"182c63aa-5a5e-4163-be5a-3bd36ed45ed8","Type":"ContainerDied","Data":"0019d1b0f72ef602084ec679ef6d6da195178331a7206f0484bd5ef88fe51483"} Apr 06 12:53:02 crc kubenswrapper[4790]: I0406 12:53:02.574597 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2bxfc" event={"ID":"182c63aa-5a5e-4163-be5a-3bd36ed45ed8","Type":"ContainerStarted","Data":"2abce567f09704b6a5cdc8c130effb5bb9249f8d0e8a18ca1d4f8674df3b087c"} Apr 06 12:53:03 crc kubenswrapper[4790]: I0406 12:53:03.759303 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-skkp4"] Apr 06 12:53:03 crc kubenswrapper[4790]: I0406 12:53:03.761980 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:03 crc kubenswrapper[4790]: I0406 12:53:03.775487 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-skkp4"] Apr 06 12:53:03 crc kubenswrapper[4790]: I0406 12:53:03.885587 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-utilities\") pod \"redhat-marketplace-skkp4\" (UID: \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\") " pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:03 crc kubenswrapper[4790]: I0406 12:53:03.885664 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-catalog-content\") pod \"redhat-marketplace-skkp4\" (UID: \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\") " pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:03 crc kubenswrapper[4790]: I0406 12:53:03.885709 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvdcd\" (UniqueName: \"kubernetes.io/projected/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-kube-api-access-mvdcd\") pod \"redhat-marketplace-skkp4\" (UID: \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\") " pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:03 crc kubenswrapper[4790]: I0406 12:53:03.987903 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-catalog-content\") pod \"redhat-marketplace-skkp4\" (UID: \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\") " pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:03 crc kubenswrapper[4790]: I0406 12:53:03.987980 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mvdcd\" (UniqueName: \"kubernetes.io/projected/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-kube-api-access-mvdcd\") pod \"redhat-marketplace-skkp4\" (UID: \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\") " pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:03 crc kubenswrapper[4790]: I0406 12:53:03.988143 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-utilities\") pod \"redhat-marketplace-skkp4\" (UID: \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\") " pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:03 crc kubenswrapper[4790]: I0406 12:53:03.988432 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-catalog-content\") pod \"redhat-marketplace-skkp4\" (UID: \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\") " pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:03 crc kubenswrapper[4790]: I0406 12:53:03.988701 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-utilities\") pod \"redhat-marketplace-skkp4\" (UID: \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\") " pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:04 crc kubenswrapper[4790]: I0406 12:53:04.009406 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvdcd\" (UniqueName: \"kubernetes.io/projected/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-kube-api-access-mvdcd\") pod \"redhat-marketplace-skkp4\" (UID: \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\") " pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:04 crc kubenswrapper[4790]: I0406 12:53:04.084694 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:04 crc kubenswrapper[4790]: I0406 12:53:04.854360 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-skkp4"] Apr 06 12:53:05 crc kubenswrapper[4790]: I0406 12:53:05.606719 4790 generic.go:334] "Generic (PLEG): container finished" podID="8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" containerID="c6300f4f87e30aa88e8bc7d03b7b6468106e9237123bc57415a5105298a7e7bc" exitCode=0 Apr 06 12:53:05 crc kubenswrapper[4790]: I0406 12:53:05.607059 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skkp4" event={"ID":"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6","Type":"ContainerDied","Data":"c6300f4f87e30aa88e8bc7d03b7b6468106e9237123bc57415a5105298a7e7bc"} Apr 06 12:53:05 crc kubenswrapper[4790]: I0406 12:53:05.607097 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skkp4" event={"ID":"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6","Type":"ContainerStarted","Data":"c8608b735ae5aad0164b2acdf127e6426f4312b26563622b7a9fff353635eea8"} Apr 06 12:53:07 crc kubenswrapper[4790]: I0406 12:53:07.814539 4790 generic.go:334] "Generic (PLEG): container finished" podID="182c63aa-5a5e-4163-be5a-3bd36ed45ed8" containerID="2abce567f09704b6a5cdc8c130effb5bb9249f8d0e8a18ca1d4f8674df3b087c" exitCode=0 Apr 06 12:53:07 crc kubenswrapper[4790]: I0406 12:53:07.816008 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2bxfc" event={"ID":"182c63aa-5a5e-4163-be5a-3bd36ed45ed8","Type":"ContainerDied","Data":"2abce567f09704b6a5cdc8c130effb5bb9249f8d0e8a18ca1d4f8674df3b087c"} Apr 06 12:53:07 crc kubenswrapper[4790]: I0406 12:53:07.822441 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skkp4" 
event={"ID":"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6","Type":"ContainerStarted","Data":"a436bde83d1486b1654ac2de645f8fa8075b7518b6eef3b12224d3f05fc80b01"} Apr 06 12:53:08 crc kubenswrapper[4790]: I0406 12:53:08.676059 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:53:08 crc kubenswrapper[4790]: E0406 12:53:08.676472 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:53:08 crc kubenswrapper[4790]: I0406 12:53:08.832757 4790 generic.go:334] "Generic (PLEG): container finished" podID="8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" containerID="a436bde83d1486b1654ac2de645f8fa8075b7518b6eef3b12224d3f05fc80b01" exitCode=0 Apr 06 12:53:08 crc kubenswrapper[4790]: I0406 12:53:08.832906 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skkp4" event={"ID":"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6","Type":"ContainerDied","Data":"a436bde83d1486b1654ac2de645f8fa8075b7518b6eef3b12224d3f05fc80b01"} Apr 06 12:53:08 crc kubenswrapper[4790]: I0406 12:53:08.837184 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2bxfc" event={"ID":"182c63aa-5a5e-4163-be5a-3bd36ed45ed8","Type":"ContainerStarted","Data":"3d9138b1fbc7b47c129dfc91a90bbf70bbf7ae71095d260ccc1685e99603dfe7"} Apr 06 12:53:08 crc kubenswrapper[4790]: I0406 12:53:08.882949 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2bxfc" podStartSLOduration=3.239821537 podStartE2EDuration="10.882924558s" 
podCreationTimestamp="2026-04-06 12:52:58 +0000 UTC" firstStartedPulling="2026-04-06 12:53:00.550812509 +0000 UTC m=+3359.538555375" lastFinishedPulling="2026-04-06 12:53:08.19391553 +0000 UTC m=+3367.181658396" observedRunningTime="2026-04-06 12:53:08.872270153 +0000 UTC m=+3367.860013019" watchObservedRunningTime="2026-04-06 12:53:08.882924558 +0000 UTC m=+3367.870667444" Apr 06 12:53:09 crc kubenswrapper[4790]: I0406 12:53:09.851621 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skkp4" event={"ID":"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6","Type":"ContainerStarted","Data":"76d329c357ae061c5999aca6def5d9a4ad0752f8b3634091425ddd1afd5084ba"} Apr 06 12:53:09 crc kubenswrapper[4790]: I0406 12:53:09.872018 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-skkp4" podStartSLOduration=3.199158381 podStartE2EDuration="6.872001725s" podCreationTimestamp="2026-04-06 12:53:03 +0000 UTC" firstStartedPulling="2026-04-06 12:53:05.611857724 +0000 UTC m=+3364.599600610" lastFinishedPulling="2026-04-06 12:53:09.284701088 +0000 UTC m=+3368.272443954" observedRunningTime="2026-04-06 12:53:09.867766771 +0000 UTC m=+3368.855509657" watchObservedRunningTime="2026-04-06 12:53:09.872001725 +0000 UTC m=+3368.859744581" Apr 06 12:53:14 crc kubenswrapper[4790]: I0406 12:53:14.085292 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:14 crc kubenswrapper[4790]: I0406 12:53:14.085895 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:14 crc kubenswrapper[4790]: I0406 12:53:14.142064 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:14 crc kubenswrapper[4790]: I0406 12:53:14.949477 4790 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:15 crc kubenswrapper[4790]: I0406 12:53:15.008481 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-skkp4"] Apr 06 12:53:16 crc kubenswrapper[4790]: I0406 12:53:16.923161 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-skkp4" podUID="8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" containerName="registry-server" containerID="cri-o://76d329c357ae061c5999aca6def5d9a4ad0752f8b3634091425ddd1afd5084ba" gracePeriod=2 Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.421209 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.587376 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-utilities\") pod \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\" (UID: \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\") " Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.587493 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvdcd\" (UniqueName: \"kubernetes.io/projected/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-kube-api-access-mvdcd\") pod \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\" (UID: \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\") " Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.587632 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-catalog-content\") pod \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\" (UID: \"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6\") " Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.588787 4790 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-utilities" (OuterVolumeSpecName: "utilities") pod "8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" (UID: "8f055c13-d730-4e8e-8e3c-a7be15a6c5f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.593057 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-kube-api-access-mvdcd" (OuterVolumeSpecName: "kube-api-access-mvdcd") pod "8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" (UID: "8f055c13-d730-4e8e-8e3c-a7be15a6c5f6"). InnerVolumeSpecName "kube-api-access-mvdcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.618653 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" (UID: "8f055c13-d730-4e8e-8e3c-a7be15a6c5f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.689665 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.689695 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.689705 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvdcd\" (UniqueName: \"kubernetes.io/projected/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6-kube-api-access-mvdcd\") on node \"crc\" DevicePath \"\"" Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.934119 4790 generic.go:334] "Generic (PLEG): container finished" podID="8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" containerID="76d329c357ae061c5999aca6def5d9a4ad0752f8b3634091425ddd1afd5084ba" exitCode=0 Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.934179 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skkp4" Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.934194 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skkp4" event={"ID":"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6","Type":"ContainerDied","Data":"76d329c357ae061c5999aca6def5d9a4ad0752f8b3634091425ddd1afd5084ba"} Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.934227 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skkp4" event={"ID":"8f055c13-d730-4e8e-8e3c-a7be15a6c5f6","Type":"ContainerDied","Data":"c8608b735ae5aad0164b2acdf127e6426f4312b26563622b7a9fff353635eea8"} Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.934267 4790 scope.go:117] "RemoveContainer" containerID="76d329c357ae061c5999aca6def5d9a4ad0752f8b3634091425ddd1afd5084ba" Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.957689 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-skkp4"] Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.965899 4790 scope.go:117] "RemoveContainer" containerID="a436bde83d1486b1654ac2de645f8fa8075b7518b6eef3b12224d3f05fc80b01" Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.967306 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-skkp4"] Apr 06 12:53:17 crc kubenswrapper[4790]: I0406 12:53:17.984383 4790 scope.go:117] "RemoveContainer" containerID="c6300f4f87e30aa88e8bc7d03b7b6468106e9237123bc57415a5105298a7e7bc" Apr 06 12:53:18 crc kubenswrapper[4790]: I0406 12:53:18.034112 4790 scope.go:117] "RemoveContainer" containerID="76d329c357ae061c5999aca6def5d9a4ad0752f8b3634091425ddd1afd5084ba" Apr 06 12:53:18 crc kubenswrapper[4790]: E0406 12:53:18.034771 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"76d329c357ae061c5999aca6def5d9a4ad0752f8b3634091425ddd1afd5084ba\": container with ID starting with 76d329c357ae061c5999aca6def5d9a4ad0752f8b3634091425ddd1afd5084ba not found: ID does not exist" containerID="76d329c357ae061c5999aca6def5d9a4ad0752f8b3634091425ddd1afd5084ba" Apr 06 12:53:18 crc kubenswrapper[4790]: I0406 12:53:18.034839 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76d329c357ae061c5999aca6def5d9a4ad0752f8b3634091425ddd1afd5084ba"} err="failed to get container status \"76d329c357ae061c5999aca6def5d9a4ad0752f8b3634091425ddd1afd5084ba\": rpc error: code = NotFound desc = could not find container \"76d329c357ae061c5999aca6def5d9a4ad0752f8b3634091425ddd1afd5084ba\": container with ID starting with 76d329c357ae061c5999aca6def5d9a4ad0752f8b3634091425ddd1afd5084ba not found: ID does not exist" Apr 06 12:53:18 crc kubenswrapper[4790]: I0406 12:53:18.034869 4790 scope.go:117] "RemoveContainer" containerID="a436bde83d1486b1654ac2de645f8fa8075b7518b6eef3b12224d3f05fc80b01" Apr 06 12:53:18 crc kubenswrapper[4790]: E0406 12:53:18.035176 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a436bde83d1486b1654ac2de645f8fa8075b7518b6eef3b12224d3f05fc80b01\": container with ID starting with a436bde83d1486b1654ac2de645f8fa8075b7518b6eef3b12224d3f05fc80b01 not found: ID does not exist" containerID="a436bde83d1486b1654ac2de645f8fa8075b7518b6eef3b12224d3f05fc80b01" Apr 06 12:53:18 crc kubenswrapper[4790]: I0406 12:53:18.035208 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a436bde83d1486b1654ac2de645f8fa8075b7518b6eef3b12224d3f05fc80b01"} err="failed to get container status \"a436bde83d1486b1654ac2de645f8fa8075b7518b6eef3b12224d3f05fc80b01\": rpc error: code = NotFound desc = could not find container \"a436bde83d1486b1654ac2de645f8fa8075b7518b6eef3b12224d3f05fc80b01\": container with ID 
starting with a436bde83d1486b1654ac2de645f8fa8075b7518b6eef3b12224d3f05fc80b01 not found: ID does not exist" Apr 06 12:53:18 crc kubenswrapper[4790]: I0406 12:53:18.035231 4790 scope.go:117] "RemoveContainer" containerID="c6300f4f87e30aa88e8bc7d03b7b6468106e9237123bc57415a5105298a7e7bc" Apr 06 12:53:18 crc kubenswrapper[4790]: E0406 12:53:18.035549 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6300f4f87e30aa88e8bc7d03b7b6468106e9237123bc57415a5105298a7e7bc\": container with ID starting with c6300f4f87e30aa88e8bc7d03b7b6468106e9237123bc57415a5105298a7e7bc not found: ID does not exist" containerID="c6300f4f87e30aa88e8bc7d03b7b6468106e9237123bc57415a5105298a7e7bc" Apr 06 12:53:18 crc kubenswrapper[4790]: I0406 12:53:18.035571 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6300f4f87e30aa88e8bc7d03b7b6468106e9237123bc57415a5105298a7e7bc"} err="failed to get container status \"c6300f4f87e30aa88e8bc7d03b7b6468106e9237123bc57415a5105298a7e7bc\": rpc error: code = NotFound desc = could not find container \"c6300f4f87e30aa88e8bc7d03b7b6468106e9237123bc57415a5105298a7e7bc\": container with ID starting with c6300f4f87e30aa88e8bc7d03b7b6468106e9237123bc57415a5105298a7e7bc not found: ID does not exist" Apr 06 12:53:18 crc kubenswrapper[4790]: I0406 12:53:18.703305 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:53:18 crc kubenswrapper[4790]: I0406 12:53:18.703605 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:53:19 crc kubenswrapper[4790]: I0406 12:53:19.689123 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" path="/var/lib/kubelet/pods/8f055c13-d730-4e8e-8e3c-a7be15a6c5f6/volumes" Apr 06 12:53:19 crc 
kubenswrapper[4790]: I0406 12:53:19.757262 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2bxfc" podUID="182c63aa-5a5e-4163-be5a-3bd36ed45ed8" containerName="registry-server" probeResult="failure" output=< Apr 06 12:53:19 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 12:53:19 crc kubenswrapper[4790]: > Apr 06 12:53:22 crc kubenswrapper[4790]: I0406 12:53:22.675703 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:53:22 crc kubenswrapper[4790]: E0406 12:53:22.676335 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:53:28 crc kubenswrapper[4790]: I0406 12:53:28.765377 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:53:28 crc kubenswrapper[4790]: I0406 12:53:28.841383 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:53:29 crc kubenswrapper[4790]: I0406 12:53:29.596428 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2bxfc"] Apr 06 12:53:30 crc kubenswrapper[4790]: I0406 12:53:30.079104 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2bxfc" podUID="182c63aa-5a5e-4163-be5a-3bd36ed45ed8" containerName="registry-server" containerID="cri-o://3d9138b1fbc7b47c129dfc91a90bbf70bbf7ae71095d260ccc1685e99603dfe7" gracePeriod=2 Apr 06 12:53:30 
crc kubenswrapper[4790]: I0406 12:53:30.608328 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:53:30 crc kubenswrapper[4790]: I0406 12:53:30.674100 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lr58\" (UniqueName: \"kubernetes.io/projected/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-kube-api-access-7lr58\") pod \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\" (UID: \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\") " Apr 06 12:53:30 crc kubenswrapper[4790]: I0406 12:53:30.674252 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-utilities\") pod \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\" (UID: \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\") " Apr 06 12:53:30 crc kubenswrapper[4790]: I0406 12:53:30.674409 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-catalog-content\") pod \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\" (UID: \"182c63aa-5a5e-4163-be5a-3bd36ed45ed8\") " Apr 06 12:53:30 crc kubenswrapper[4790]: I0406 12:53:30.674971 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-utilities" (OuterVolumeSpecName: "utilities") pod "182c63aa-5a5e-4163-be5a-3bd36ed45ed8" (UID: "182c63aa-5a5e-4163-be5a-3bd36ed45ed8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:53:30 crc kubenswrapper[4790]: I0406 12:53:30.681039 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-kube-api-access-7lr58" (OuterVolumeSpecName: "kube-api-access-7lr58") pod "182c63aa-5a5e-4163-be5a-3bd36ed45ed8" (UID: "182c63aa-5a5e-4163-be5a-3bd36ed45ed8"). InnerVolumeSpecName "kube-api-access-7lr58". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:53:30 crc kubenswrapper[4790]: I0406 12:53:30.776307 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lr58\" (UniqueName: \"kubernetes.io/projected/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-kube-api-access-7lr58\") on node \"crc\" DevicePath \"\"" Apr 06 12:53:30 crc kubenswrapper[4790]: I0406 12:53:30.776459 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:53:30 crc kubenswrapper[4790]: I0406 12:53:30.817518 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "182c63aa-5a5e-4163-be5a-3bd36ed45ed8" (UID: "182c63aa-5a5e-4163-be5a-3bd36ed45ed8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:53:30 crc kubenswrapper[4790]: I0406 12:53:30.878307 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/182c63aa-5a5e-4163-be5a-3bd36ed45ed8-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.094776 4790 generic.go:334] "Generic (PLEG): container finished" podID="182c63aa-5a5e-4163-be5a-3bd36ed45ed8" containerID="3d9138b1fbc7b47c129dfc91a90bbf70bbf7ae71095d260ccc1685e99603dfe7" exitCode=0 Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.094879 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2bxfc" Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.094893 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2bxfc" event={"ID":"182c63aa-5a5e-4163-be5a-3bd36ed45ed8","Type":"ContainerDied","Data":"3d9138b1fbc7b47c129dfc91a90bbf70bbf7ae71095d260ccc1685e99603dfe7"} Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.094927 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2bxfc" event={"ID":"182c63aa-5a5e-4163-be5a-3bd36ed45ed8","Type":"ContainerDied","Data":"482d7b242069b6bee24c4c1a29e3361c32d450a8338cf738746055c100457813"} Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.094945 4790 scope.go:117] "RemoveContainer" containerID="3d9138b1fbc7b47c129dfc91a90bbf70bbf7ae71095d260ccc1685e99603dfe7" Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.119376 4790 scope.go:117] "RemoveContainer" containerID="2abce567f09704b6a5cdc8c130effb5bb9249f8d0e8a18ca1d4f8674df3b087c" Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.138986 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2bxfc"] Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 
12:53:31.161081 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2bxfc"] Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.163414 4790 scope.go:117] "RemoveContainer" containerID="0019d1b0f72ef602084ec679ef6d6da195178331a7206f0484bd5ef88fe51483" Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.206808 4790 scope.go:117] "RemoveContainer" containerID="3d9138b1fbc7b47c129dfc91a90bbf70bbf7ae71095d260ccc1685e99603dfe7" Apr 06 12:53:31 crc kubenswrapper[4790]: E0406 12:53:31.207564 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d9138b1fbc7b47c129dfc91a90bbf70bbf7ae71095d260ccc1685e99603dfe7\": container with ID starting with 3d9138b1fbc7b47c129dfc91a90bbf70bbf7ae71095d260ccc1685e99603dfe7 not found: ID does not exist" containerID="3d9138b1fbc7b47c129dfc91a90bbf70bbf7ae71095d260ccc1685e99603dfe7" Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.207612 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9138b1fbc7b47c129dfc91a90bbf70bbf7ae71095d260ccc1685e99603dfe7"} err="failed to get container status \"3d9138b1fbc7b47c129dfc91a90bbf70bbf7ae71095d260ccc1685e99603dfe7\": rpc error: code = NotFound desc = could not find container \"3d9138b1fbc7b47c129dfc91a90bbf70bbf7ae71095d260ccc1685e99603dfe7\": container with ID starting with 3d9138b1fbc7b47c129dfc91a90bbf70bbf7ae71095d260ccc1685e99603dfe7 not found: ID does not exist" Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.207637 4790 scope.go:117] "RemoveContainer" containerID="2abce567f09704b6a5cdc8c130effb5bb9249f8d0e8a18ca1d4f8674df3b087c" Apr 06 12:53:31 crc kubenswrapper[4790]: E0406 12:53:31.208101 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2abce567f09704b6a5cdc8c130effb5bb9249f8d0e8a18ca1d4f8674df3b087c\": container with ID 
starting with 2abce567f09704b6a5cdc8c130effb5bb9249f8d0e8a18ca1d4f8674df3b087c not found: ID does not exist" containerID="2abce567f09704b6a5cdc8c130effb5bb9249f8d0e8a18ca1d4f8674df3b087c" Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.208121 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2abce567f09704b6a5cdc8c130effb5bb9249f8d0e8a18ca1d4f8674df3b087c"} err="failed to get container status \"2abce567f09704b6a5cdc8c130effb5bb9249f8d0e8a18ca1d4f8674df3b087c\": rpc error: code = NotFound desc = could not find container \"2abce567f09704b6a5cdc8c130effb5bb9249f8d0e8a18ca1d4f8674df3b087c\": container with ID starting with 2abce567f09704b6a5cdc8c130effb5bb9249f8d0e8a18ca1d4f8674df3b087c not found: ID does not exist" Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.208140 4790 scope.go:117] "RemoveContainer" containerID="0019d1b0f72ef602084ec679ef6d6da195178331a7206f0484bd5ef88fe51483" Apr 06 12:53:31 crc kubenswrapper[4790]: E0406 12:53:31.208359 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0019d1b0f72ef602084ec679ef6d6da195178331a7206f0484bd5ef88fe51483\": container with ID starting with 0019d1b0f72ef602084ec679ef6d6da195178331a7206f0484bd5ef88fe51483 not found: ID does not exist" containerID="0019d1b0f72ef602084ec679ef6d6da195178331a7206f0484bd5ef88fe51483" Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.208392 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0019d1b0f72ef602084ec679ef6d6da195178331a7206f0484bd5ef88fe51483"} err="failed to get container status \"0019d1b0f72ef602084ec679ef6d6da195178331a7206f0484bd5ef88fe51483\": rpc error: code = NotFound desc = could not find container \"0019d1b0f72ef602084ec679ef6d6da195178331a7206f0484bd5ef88fe51483\": container with ID starting with 0019d1b0f72ef602084ec679ef6d6da195178331a7206f0484bd5ef88fe51483 not found: 
ID does not exist" Apr 06 12:53:31 crc kubenswrapper[4790]: I0406 12:53:31.687288 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="182c63aa-5a5e-4163-be5a-3bd36ed45ed8" path="/var/lib/kubelet/pods/182c63aa-5a5e-4163-be5a-3bd36ed45ed8/volumes" Apr 06 12:53:35 crc kubenswrapper[4790]: I0406 12:53:35.675365 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:53:35 crc kubenswrapper[4790]: E0406 12:53:35.676073 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:53:50 crc kubenswrapper[4790]: I0406 12:53:50.676425 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:53:50 crc kubenswrapper[4790]: E0406 12:53:50.677282 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.152321 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591334-8whrg"] Apr 06 12:54:00 crc kubenswrapper[4790]: E0406 12:54:00.154582 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="182c63aa-5a5e-4163-be5a-3bd36ed45ed8" containerName="registry-server" Apr 06 
12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.154686 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="182c63aa-5a5e-4163-be5a-3bd36ed45ed8" containerName="registry-server" Apr 06 12:54:00 crc kubenswrapper[4790]: E0406 12:54:00.154756 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="182c63aa-5a5e-4163-be5a-3bd36ed45ed8" containerName="extract-content" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.154818 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="182c63aa-5a5e-4163-be5a-3bd36ed45ed8" containerName="extract-content" Apr 06 12:54:00 crc kubenswrapper[4790]: E0406 12:54:00.154939 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" containerName="extract-utilities" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.155001 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" containerName="extract-utilities" Apr 06 12:54:00 crc kubenswrapper[4790]: E0406 12:54:00.155083 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" containerName="registry-server" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.155146 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" containerName="registry-server" Apr 06 12:54:00 crc kubenswrapper[4790]: E0406 12:54:00.155238 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="182c63aa-5a5e-4163-be5a-3bd36ed45ed8" containerName="extract-utilities" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.155320 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="182c63aa-5a5e-4163-be5a-3bd36ed45ed8" containerName="extract-utilities" Apr 06 12:54:00 crc kubenswrapper[4790]: E0406 12:54:00.155403 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" containerName="extract-content" Apr 06 
12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.155470 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" containerName="extract-content" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.155717 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="182c63aa-5a5e-4163-be5a-3bd36ed45ed8" containerName="registry-server" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.155798 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f055c13-d730-4e8e-8e3c-a7be15a6c5f6" containerName="registry-server" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.156616 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591334-8whrg" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.160025 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.160037 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.160124 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.166876 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591334-8whrg"] Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.236701 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g4cr\" (UniqueName: \"kubernetes.io/projected/d2e7b2ae-dcb3-4889-b512-be52cf460f33-kube-api-access-8g4cr\") pod \"auto-csr-approver-29591334-8whrg\" (UID: \"d2e7b2ae-dcb3-4889-b512-be52cf460f33\") " pod="openshift-infra/auto-csr-approver-29591334-8whrg" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.339197 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g4cr\" (UniqueName: \"kubernetes.io/projected/d2e7b2ae-dcb3-4889-b512-be52cf460f33-kube-api-access-8g4cr\") pod \"auto-csr-approver-29591334-8whrg\" (UID: \"d2e7b2ae-dcb3-4889-b512-be52cf460f33\") " pod="openshift-infra/auto-csr-approver-29591334-8whrg" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.357557 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g4cr\" (UniqueName: \"kubernetes.io/projected/d2e7b2ae-dcb3-4889-b512-be52cf460f33-kube-api-access-8g4cr\") pod \"auto-csr-approver-29591334-8whrg\" (UID: \"d2e7b2ae-dcb3-4889-b512-be52cf460f33\") " pod="openshift-infra/auto-csr-approver-29591334-8whrg" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.481465 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591334-8whrg" Apr 06 12:54:00 crc kubenswrapper[4790]: I0406 12:54:00.963844 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591334-8whrg"] Apr 06 12:54:01 crc kubenswrapper[4790]: I0406 12:54:01.404252 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591334-8whrg" event={"ID":"d2e7b2ae-dcb3-4889-b512-be52cf460f33","Type":"ContainerStarted","Data":"2b168e7c9af0595e27f957e63265d8933cd61d2689be0d15f6a6398ccba9b3bf"} Apr 06 12:54:01 crc kubenswrapper[4790]: I0406 12:54:01.684325 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:54:01 crc kubenswrapper[4790]: E0406 12:54:01.685915 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:54:02 crc kubenswrapper[4790]: I0406 12:54:02.415351 4790 generic.go:334] "Generic (PLEG): container finished" podID="d2e7b2ae-dcb3-4889-b512-be52cf460f33" containerID="9e0ab845231065729c631421c49f11a6c13a33755a088e90b84e2fbefbd06c21" exitCode=0 Apr 06 12:54:02 crc kubenswrapper[4790]: I0406 12:54:02.415544 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591334-8whrg" event={"ID":"d2e7b2ae-dcb3-4889-b512-be52cf460f33","Type":"ContainerDied","Data":"9e0ab845231065729c631421c49f11a6c13a33755a088e90b84e2fbefbd06c21"} Apr 06 12:54:03 crc kubenswrapper[4790]: I0406 12:54:03.807162 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591334-8whrg" Apr 06 12:54:03 crc kubenswrapper[4790]: I0406 12:54:03.938031 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g4cr\" (UniqueName: \"kubernetes.io/projected/d2e7b2ae-dcb3-4889-b512-be52cf460f33-kube-api-access-8g4cr\") pod \"d2e7b2ae-dcb3-4889-b512-be52cf460f33\" (UID: \"d2e7b2ae-dcb3-4889-b512-be52cf460f33\") " Apr 06 12:54:03 crc kubenswrapper[4790]: I0406 12:54:03.944627 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2e7b2ae-dcb3-4889-b512-be52cf460f33-kube-api-access-8g4cr" (OuterVolumeSpecName: "kube-api-access-8g4cr") pod "d2e7b2ae-dcb3-4889-b512-be52cf460f33" (UID: "d2e7b2ae-dcb3-4889-b512-be52cf460f33"). InnerVolumeSpecName "kube-api-access-8g4cr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:54:04 crc kubenswrapper[4790]: I0406 12:54:04.040606 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g4cr\" (UniqueName: \"kubernetes.io/projected/d2e7b2ae-dcb3-4889-b512-be52cf460f33-kube-api-access-8g4cr\") on node \"crc\" DevicePath \"\"" Apr 06 12:54:04 crc kubenswrapper[4790]: I0406 12:54:04.438349 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591334-8whrg" event={"ID":"d2e7b2ae-dcb3-4889-b512-be52cf460f33","Type":"ContainerDied","Data":"2b168e7c9af0595e27f957e63265d8933cd61d2689be0d15f6a6398ccba9b3bf"} Apr 06 12:54:04 crc kubenswrapper[4790]: I0406 12:54:04.438395 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b168e7c9af0595e27f957e63265d8933cd61d2689be0d15f6a6398ccba9b3bf" Apr 06 12:54:04 crc kubenswrapper[4790]: I0406 12:54:04.438430 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591334-8whrg" Apr 06 12:54:04 crc kubenswrapper[4790]: I0406 12:54:04.879622 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591328-z57td"] Apr 06 12:54:04 crc kubenswrapper[4790]: I0406 12:54:04.893012 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591328-z57td"] Apr 06 12:54:05 crc kubenswrapper[4790]: I0406 12:54:05.688137 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e385359c-ece1-46b6-be5a-f9459a87b2b4" path="/var/lib/kubelet/pods/e385359c-ece1-46b6-be5a-f9459a87b2b4/volumes" Apr 06 12:54:12 crc kubenswrapper[4790]: I0406 12:54:12.675648 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:54:12 crc kubenswrapper[4790]: E0406 12:54:12.676590 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:54:16 crc kubenswrapper[4790]: I0406 12:54:16.737516 4790 scope.go:117] "RemoveContainer" containerID="6a5d02d3318596a24aeec648b0cd53eb30372e9a35902f8dd1961824f545bbe7" Apr 06 12:54:23 crc kubenswrapper[4790]: I0406 12:54:23.675925 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:54:23 crc kubenswrapper[4790]: E0406 12:54:23.676818 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:54:34 crc kubenswrapper[4790]: I0406 12:54:34.675851 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:54:34 crc kubenswrapper[4790]: E0406 12:54:34.676633 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 12:54:46 crc kubenswrapper[4790]: I0406 12:54:46.675586 4790 scope.go:117] "RemoveContainer" 
containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:54:46 crc kubenswrapper[4790]: I0406 12:54:46.952301 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"d9097e1e91624dbbe1a443b99532574be113bdc06f0804576373fd2fb2754d2a"} Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.034909 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9znrs"] Apr 06 12:55:04 crc kubenswrapper[4790]: E0406 12:55:04.035866 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2e7b2ae-dcb3-4889-b512-be52cf460f33" containerName="oc" Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.035881 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2e7b2ae-dcb3-4889-b512-be52cf460f33" containerName="oc" Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.036087 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2e7b2ae-dcb3-4889-b512-be52cf460f33" containerName="oc" Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.037586 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.049643 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9znrs"] Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.126769 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5764d1-800e-41f9-a050-1678f4ee7936-catalog-content\") pod \"community-operators-9znrs\" (UID: \"0c5764d1-800e-41f9-a050-1678f4ee7936\") " pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.126924 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5764d1-800e-41f9-a050-1678f4ee7936-utilities\") pod \"community-operators-9znrs\" (UID: \"0c5764d1-800e-41f9-a050-1678f4ee7936\") " pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.127078 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn9mq\" (UniqueName: \"kubernetes.io/projected/0c5764d1-800e-41f9-a050-1678f4ee7936-kube-api-access-nn9mq\") pod \"community-operators-9znrs\" (UID: \"0c5764d1-800e-41f9-a050-1678f4ee7936\") " pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.231223 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5764d1-800e-41f9-a050-1678f4ee7936-catalog-content\") pod \"community-operators-9znrs\" (UID: \"0c5764d1-800e-41f9-a050-1678f4ee7936\") " pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.231339 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5764d1-800e-41f9-a050-1678f4ee7936-utilities\") pod \"community-operators-9znrs\" (UID: \"0c5764d1-800e-41f9-a050-1678f4ee7936\") " pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.231498 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn9mq\" (UniqueName: \"kubernetes.io/projected/0c5764d1-800e-41f9-a050-1678f4ee7936-kube-api-access-nn9mq\") pod \"community-operators-9znrs\" (UID: \"0c5764d1-800e-41f9-a050-1678f4ee7936\") " pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.232628 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5764d1-800e-41f9-a050-1678f4ee7936-catalog-content\") pod \"community-operators-9znrs\" (UID: \"0c5764d1-800e-41f9-a050-1678f4ee7936\") " pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.232897 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5764d1-800e-41f9-a050-1678f4ee7936-utilities\") pod \"community-operators-9znrs\" (UID: \"0c5764d1-800e-41f9-a050-1678f4ee7936\") " pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.282055 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn9mq\" (UniqueName: \"kubernetes.io/projected/0c5764d1-800e-41f9-a050-1678f4ee7936-kube-api-access-nn9mq\") pod \"community-operators-9znrs\" (UID: \"0c5764d1-800e-41f9-a050-1678f4ee7936\") " pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.359411 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:04 crc kubenswrapper[4790]: I0406 12:55:04.946219 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9znrs"] Apr 06 12:55:05 crc kubenswrapper[4790]: I0406 12:55:05.156658 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9znrs" event={"ID":"0c5764d1-800e-41f9-a050-1678f4ee7936","Type":"ContainerStarted","Data":"47ede21d1e6d6558c9b16343530dfe125191890f6741655588648062e90e71a8"} Apr 06 12:55:06 crc kubenswrapper[4790]: I0406 12:55:06.167461 4790 generic.go:334] "Generic (PLEG): container finished" podID="0c5764d1-800e-41f9-a050-1678f4ee7936" containerID="8081fb28b05670fcf349fd07fb5cb5a29f089dd847b63b67753d18feaf614f90" exitCode=0 Apr 06 12:55:06 crc kubenswrapper[4790]: I0406 12:55:06.167517 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9znrs" event={"ID":"0c5764d1-800e-41f9-a050-1678f4ee7936","Type":"ContainerDied","Data":"8081fb28b05670fcf349fd07fb5cb5a29f089dd847b63b67753d18feaf614f90"} Apr 06 12:55:06 crc kubenswrapper[4790]: I0406 12:55:06.169794 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 12:55:07 crc kubenswrapper[4790]: I0406 12:55:07.188437 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9znrs" event={"ID":"0c5764d1-800e-41f9-a050-1678f4ee7936","Type":"ContainerStarted","Data":"37ed6f0208b8caf46d06108db322bda8049c6d45d9a0cd612f40a33ae3ade785"} Apr 06 12:55:12 crc kubenswrapper[4790]: I0406 12:55:12.239517 4790 generic.go:334] "Generic (PLEG): container finished" podID="0c5764d1-800e-41f9-a050-1678f4ee7936" containerID="37ed6f0208b8caf46d06108db322bda8049c6d45d9a0cd612f40a33ae3ade785" exitCode=0 Apr 06 12:55:12 crc kubenswrapper[4790]: I0406 12:55:12.239606 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-9znrs" event={"ID":"0c5764d1-800e-41f9-a050-1678f4ee7936","Type":"ContainerDied","Data":"37ed6f0208b8caf46d06108db322bda8049c6d45d9a0cd612f40a33ae3ade785"} Apr 06 12:55:13 crc kubenswrapper[4790]: I0406 12:55:13.253695 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9znrs" event={"ID":"0c5764d1-800e-41f9-a050-1678f4ee7936","Type":"ContainerStarted","Data":"75b679d7b69d3237d7607ddef3c65b9ea5219c75f15eb51fbe9961c752b74fac"} Apr 06 12:55:14 crc kubenswrapper[4790]: I0406 12:55:14.280964 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9znrs" podStartSLOduration=3.448971379 podStartE2EDuration="10.280945381s" podCreationTimestamp="2026-04-06 12:55:04 +0000 UTC" firstStartedPulling="2026-04-06 12:55:06.169562642 +0000 UTC m=+3485.157305508" lastFinishedPulling="2026-04-06 12:55:13.001536614 +0000 UTC m=+3491.989279510" observedRunningTime="2026-04-06 12:55:14.278983678 +0000 UTC m=+3493.266726544" watchObservedRunningTime="2026-04-06 12:55:14.280945381 +0000 UTC m=+3493.268688237" Apr 06 12:55:14 crc kubenswrapper[4790]: I0406 12:55:14.360573 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:14 crc kubenswrapper[4790]: I0406 12:55:14.360739 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:15 crc kubenswrapper[4790]: I0406 12:55:15.407057 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9znrs" podUID="0c5764d1-800e-41f9-a050-1678f4ee7936" containerName="registry-server" probeResult="failure" output=< Apr 06 12:55:15 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 12:55:15 crc kubenswrapper[4790]: > Apr 06 12:55:24 crc 
kubenswrapper[4790]: I0406 12:55:24.423496 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:24 crc kubenswrapper[4790]: I0406 12:55:24.471347 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:24 crc kubenswrapper[4790]: I0406 12:55:24.660710 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9znrs"] Apr 06 12:55:26 crc kubenswrapper[4790]: I0406 12:55:26.407050 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9znrs" podUID="0c5764d1-800e-41f9-a050-1678f4ee7936" containerName="registry-server" containerID="cri-o://75b679d7b69d3237d7607ddef3c65b9ea5219c75f15eb51fbe9961c752b74fac" gracePeriod=2 Apr 06 12:55:26 crc kubenswrapper[4790]: I0406 12:55:26.901172 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.012145 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5764d1-800e-41f9-a050-1678f4ee7936-catalog-content\") pod \"0c5764d1-800e-41f9-a050-1678f4ee7936\" (UID: \"0c5764d1-800e-41f9-a050-1678f4ee7936\") " Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.012353 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5764d1-800e-41f9-a050-1678f4ee7936-utilities\") pod \"0c5764d1-800e-41f9-a050-1678f4ee7936\" (UID: \"0c5764d1-800e-41f9-a050-1678f4ee7936\") " Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.012389 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn9mq\" (UniqueName: \"kubernetes.io/projected/0c5764d1-800e-41f9-a050-1678f4ee7936-kube-api-access-nn9mq\") pod \"0c5764d1-800e-41f9-a050-1678f4ee7936\" (UID: \"0c5764d1-800e-41f9-a050-1678f4ee7936\") " Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.013673 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c5764d1-800e-41f9-a050-1678f4ee7936-utilities" (OuterVolumeSpecName: "utilities") pod "0c5764d1-800e-41f9-a050-1678f4ee7936" (UID: "0c5764d1-800e-41f9-a050-1678f4ee7936"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.014154 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5764d1-800e-41f9-a050-1678f4ee7936-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.018858 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c5764d1-800e-41f9-a050-1678f4ee7936-kube-api-access-nn9mq" (OuterVolumeSpecName: "kube-api-access-nn9mq") pod "0c5764d1-800e-41f9-a050-1678f4ee7936" (UID: "0c5764d1-800e-41f9-a050-1678f4ee7936"). InnerVolumeSpecName "kube-api-access-nn9mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.071675 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c5764d1-800e-41f9-a050-1678f4ee7936-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c5764d1-800e-41f9-a050-1678f4ee7936" (UID: "0c5764d1-800e-41f9-a050-1678f4ee7936"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.116233 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5764d1-800e-41f9-a050-1678f4ee7936-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.116266 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn9mq\" (UniqueName: \"kubernetes.io/projected/0c5764d1-800e-41f9-a050-1678f4ee7936-kube-api-access-nn9mq\") on node \"crc\" DevicePath \"\"" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.417642 4790 generic.go:334] "Generic (PLEG): container finished" podID="0c5764d1-800e-41f9-a050-1678f4ee7936" containerID="75b679d7b69d3237d7607ddef3c65b9ea5219c75f15eb51fbe9961c752b74fac" exitCode=0 Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.417696 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9znrs" event={"ID":"0c5764d1-800e-41f9-a050-1678f4ee7936","Type":"ContainerDied","Data":"75b679d7b69d3237d7607ddef3c65b9ea5219c75f15eb51fbe9961c752b74fac"} Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.418904 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9znrs" event={"ID":"0c5764d1-800e-41f9-a050-1678f4ee7936","Type":"ContainerDied","Data":"47ede21d1e6d6558c9b16343530dfe125191890f6741655588648062e90e71a8"} Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.417720 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9znrs" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.418994 4790 scope.go:117] "RemoveContainer" containerID="75b679d7b69d3237d7607ddef3c65b9ea5219c75f15eb51fbe9961c752b74fac" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.443672 4790 scope.go:117] "RemoveContainer" containerID="37ed6f0208b8caf46d06108db322bda8049c6d45d9a0cd612f40a33ae3ade785" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.501475 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9znrs"] Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.505239 4790 scope.go:117] "RemoveContainer" containerID="8081fb28b05670fcf349fd07fb5cb5a29f089dd847b63b67753d18feaf614f90" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.517323 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9znrs"] Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.533709 4790 scope.go:117] "RemoveContainer" containerID="75b679d7b69d3237d7607ddef3c65b9ea5219c75f15eb51fbe9961c752b74fac" Apr 06 12:55:27 crc kubenswrapper[4790]: E0406 12:55:27.535768 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b679d7b69d3237d7607ddef3c65b9ea5219c75f15eb51fbe9961c752b74fac\": container with ID starting with 75b679d7b69d3237d7607ddef3c65b9ea5219c75f15eb51fbe9961c752b74fac not found: ID does not exist" containerID="75b679d7b69d3237d7607ddef3c65b9ea5219c75f15eb51fbe9961c752b74fac" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.535818 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b679d7b69d3237d7607ddef3c65b9ea5219c75f15eb51fbe9961c752b74fac"} err="failed to get container status \"75b679d7b69d3237d7607ddef3c65b9ea5219c75f15eb51fbe9961c752b74fac\": rpc error: code = NotFound desc = could not find 
container \"75b679d7b69d3237d7607ddef3c65b9ea5219c75f15eb51fbe9961c752b74fac\": container with ID starting with 75b679d7b69d3237d7607ddef3c65b9ea5219c75f15eb51fbe9961c752b74fac not found: ID does not exist" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.535865 4790 scope.go:117] "RemoveContainer" containerID="37ed6f0208b8caf46d06108db322bda8049c6d45d9a0cd612f40a33ae3ade785" Apr 06 12:55:27 crc kubenswrapper[4790]: E0406 12:55:27.536773 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37ed6f0208b8caf46d06108db322bda8049c6d45d9a0cd612f40a33ae3ade785\": container with ID starting with 37ed6f0208b8caf46d06108db322bda8049c6d45d9a0cd612f40a33ae3ade785 not found: ID does not exist" containerID="37ed6f0208b8caf46d06108db322bda8049c6d45d9a0cd612f40a33ae3ade785" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.536795 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ed6f0208b8caf46d06108db322bda8049c6d45d9a0cd612f40a33ae3ade785"} err="failed to get container status \"37ed6f0208b8caf46d06108db322bda8049c6d45d9a0cd612f40a33ae3ade785\": rpc error: code = NotFound desc = could not find container \"37ed6f0208b8caf46d06108db322bda8049c6d45d9a0cd612f40a33ae3ade785\": container with ID starting with 37ed6f0208b8caf46d06108db322bda8049c6d45d9a0cd612f40a33ae3ade785 not found: ID does not exist" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.536807 4790 scope.go:117] "RemoveContainer" containerID="8081fb28b05670fcf349fd07fb5cb5a29f089dd847b63b67753d18feaf614f90" Apr 06 12:55:27 crc kubenswrapper[4790]: E0406 12:55:27.537590 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8081fb28b05670fcf349fd07fb5cb5a29f089dd847b63b67753d18feaf614f90\": container with ID starting with 8081fb28b05670fcf349fd07fb5cb5a29f089dd847b63b67753d18feaf614f90 not found: ID does 
not exist" containerID="8081fb28b05670fcf349fd07fb5cb5a29f089dd847b63b67753d18feaf614f90" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.537639 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8081fb28b05670fcf349fd07fb5cb5a29f089dd847b63b67753d18feaf614f90"} err="failed to get container status \"8081fb28b05670fcf349fd07fb5cb5a29f089dd847b63b67753d18feaf614f90\": rpc error: code = NotFound desc = could not find container \"8081fb28b05670fcf349fd07fb5cb5a29f089dd847b63b67753d18feaf614f90\": container with ID starting with 8081fb28b05670fcf349fd07fb5cb5a29f089dd847b63b67753d18feaf614f90 not found: ID does not exist" Apr 06 12:55:27 crc kubenswrapper[4790]: I0406 12:55:27.687386 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c5764d1-800e-41f9-a050-1678f4ee7936" path="/var/lib/kubelet/pods/0c5764d1-800e-41f9-a050-1678f4ee7936/volumes" Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.154131 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591336-q227b"] Apr 06 12:56:00 crc kubenswrapper[4790]: E0406 12:56:00.155056 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c5764d1-800e-41f9-a050-1678f4ee7936" containerName="extract-content" Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.155068 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c5764d1-800e-41f9-a050-1678f4ee7936" containerName="extract-content" Apr 06 12:56:00 crc kubenswrapper[4790]: E0406 12:56:00.155089 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c5764d1-800e-41f9-a050-1678f4ee7936" containerName="registry-server" Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.155095 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c5764d1-800e-41f9-a050-1678f4ee7936" containerName="registry-server" Apr 06 12:56:00 crc kubenswrapper[4790]: E0406 12:56:00.155134 4790 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0c5764d1-800e-41f9-a050-1678f4ee7936" containerName="extract-utilities" Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.155141 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c5764d1-800e-41f9-a050-1678f4ee7936" containerName="extract-utilities" Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.155339 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c5764d1-800e-41f9-a050-1678f4ee7936" containerName="registry-server" Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.156129 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591336-q227b" Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.160451 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.160590 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.160588 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.164529 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591336-q227b"] Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.296554 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn4g2\" (UniqueName: \"kubernetes.io/projected/20198d9e-7649-42d5-89e9-9bd1a73f712f-kube-api-access-kn4g2\") pod \"auto-csr-approver-29591336-q227b\" (UID: \"20198d9e-7649-42d5-89e9-9bd1a73f712f\") " pod="openshift-infra/auto-csr-approver-29591336-q227b" Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.398890 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kn4g2\" (UniqueName: \"kubernetes.io/projected/20198d9e-7649-42d5-89e9-9bd1a73f712f-kube-api-access-kn4g2\") pod \"auto-csr-approver-29591336-q227b\" (UID: \"20198d9e-7649-42d5-89e9-9bd1a73f712f\") " pod="openshift-infra/auto-csr-approver-29591336-q227b" Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.419399 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn4g2\" (UniqueName: \"kubernetes.io/projected/20198d9e-7649-42d5-89e9-9bd1a73f712f-kube-api-access-kn4g2\") pod \"auto-csr-approver-29591336-q227b\" (UID: \"20198d9e-7649-42d5-89e9-9bd1a73f712f\") " pod="openshift-infra/auto-csr-approver-29591336-q227b" Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.478610 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591336-q227b" Apr 06 12:56:00 crc kubenswrapper[4790]: I0406 12:56:00.914766 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591336-q227b"] Apr 06 12:56:01 crc kubenswrapper[4790]: I0406 12:56:01.856584 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591336-q227b" event={"ID":"20198d9e-7649-42d5-89e9-9bd1a73f712f","Type":"ContainerStarted","Data":"b90b9d2bf344b4837d2a5444376cb495cc1a407d7174e34056da671e2d58ed57"} Apr 06 12:56:02 crc kubenswrapper[4790]: I0406 12:56:02.866399 4790 generic.go:334] "Generic (PLEG): container finished" podID="20198d9e-7649-42d5-89e9-9bd1a73f712f" containerID="3612c943182954b7ed86f0b570cab2e2ad5458abc9d6e783b7659e722e9b23ed" exitCode=0 Apr 06 12:56:02 crc kubenswrapper[4790]: I0406 12:56:02.866443 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591336-q227b" event={"ID":"20198d9e-7649-42d5-89e9-9bd1a73f712f","Type":"ContainerDied","Data":"3612c943182954b7ed86f0b570cab2e2ad5458abc9d6e783b7659e722e9b23ed"} Apr 06 12:56:04 crc kubenswrapper[4790]: I0406 
12:56:04.287047 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591336-q227b" Apr 06 12:56:04 crc kubenswrapper[4790]: I0406 12:56:04.402642 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn4g2\" (UniqueName: \"kubernetes.io/projected/20198d9e-7649-42d5-89e9-9bd1a73f712f-kube-api-access-kn4g2\") pod \"20198d9e-7649-42d5-89e9-9bd1a73f712f\" (UID: \"20198d9e-7649-42d5-89e9-9bd1a73f712f\") " Apr 06 12:56:04 crc kubenswrapper[4790]: I0406 12:56:04.408975 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20198d9e-7649-42d5-89e9-9bd1a73f712f-kube-api-access-kn4g2" (OuterVolumeSpecName: "kube-api-access-kn4g2") pod "20198d9e-7649-42d5-89e9-9bd1a73f712f" (UID: "20198d9e-7649-42d5-89e9-9bd1a73f712f"). InnerVolumeSpecName "kube-api-access-kn4g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:56:04 crc kubenswrapper[4790]: I0406 12:56:04.505078 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn4g2\" (UniqueName: \"kubernetes.io/projected/20198d9e-7649-42d5-89e9-9bd1a73f712f-kube-api-access-kn4g2\") on node \"crc\" DevicePath \"\"" Apr 06 12:56:04 crc kubenswrapper[4790]: I0406 12:56:04.897323 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591336-q227b" event={"ID":"20198d9e-7649-42d5-89e9-9bd1a73f712f","Type":"ContainerDied","Data":"b90b9d2bf344b4837d2a5444376cb495cc1a407d7174e34056da671e2d58ed57"} Apr 06 12:56:04 crc kubenswrapper[4790]: I0406 12:56:04.897389 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b90b9d2bf344b4837d2a5444376cb495cc1a407d7174e34056da671e2d58ed57" Apr 06 12:56:04 crc kubenswrapper[4790]: I0406 12:56:04.897471 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591336-q227b" Apr 06 12:56:05 crc kubenswrapper[4790]: I0406 12:56:05.380622 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591330-jj5gp"] Apr 06 12:56:05 crc kubenswrapper[4790]: I0406 12:56:05.391182 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591330-jj5gp"] Apr 06 12:56:05 crc kubenswrapper[4790]: I0406 12:56:05.692815 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="589d79d7-98b7-4959-9918-7458631e8ef1" path="/var/lib/kubelet/pods/589d79d7-98b7-4959-9918-7458631e8ef1/volumes" Apr 06 12:56:16 crc kubenswrapper[4790]: I0406 12:56:16.920403 4790 scope.go:117] "RemoveContainer" containerID="3f73127252cc52f1af39b0afd8cdf02281bd976f59ac49b6ced1accc31d4dede" Apr 06 12:57:09 crc kubenswrapper[4790]: I0406 12:57:09.753706 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:57:09 crc kubenswrapper[4790]: I0406 12:57:09.754301 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:57:39 crc kubenswrapper[4790]: I0406 12:57:39.753868 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:57:39 crc kubenswrapper[4790]: 
I0406 12:57:39.754444 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:58:00 crc kubenswrapper[4790]: I0406 12:58:00.221978 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591338-bn5m5"] Apr 06 12:58:00 crc kubenswrapper[4790]: E0406 12:58:00.223277 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20198d9e-7649-42d5-89e9-9bd1a73f712f" containerName="oc" Apr 06 12:58:00 crc kubenswrapper[4790]: I0406 12:58:00.223299 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20198d9e-7649-42d5-89e9-9bd1a73f712f" containerName="oc" Apr 06 12:58:00 crc kubenswrapper[4790]: I0406 12:58:00.223629 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="20198d9e-7649-42d5-89e9-9bd1a73f712f" containerName="oc" Apr 06 12:58:00 crc kubenswrapper[4790]: I0406 12:58:00.224677 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591338-bn5m5" Apr 06 12:58:00 crc kubenswrapper[4790]: I0406 12:58:00.229221 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 12:58:00 crc kubenswrapper[4790]: I0406 12:58:00.229240 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 12:58:00 crc kubenswrapper[4790]: I0406 12:58:00.229473 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 12:58:00 crc kubenswrapper[4790]: I0406 12:58:00.238324 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591338-bn5m5"] Apr 06 12:58:00 crc kubenswrapper[4790]: I0406 12:58:00.282925 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfp84\" (UniqueName: \"kubernetes.io/projected/cb2fac17-ee5c-432b-a3b4-91294ca1e304-kube-api-access-vfp84\") pod \"auto-csr-approver-29591338-bn5m5\" (UID: \"cb2fac17-ee5c-432b-a3b4-91294ca1e304\") " pod="openshift-infra/auto-csr-approver-29591338-bn5m5" Apr 06 12:58:00 crc kubenswrapper[4790]: I0406 12:58:00.385106 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfp84\" (UniqueName: \"kubernetes.io/projected/cb2fac17-ee5c-432b-a3b4-91294ca1e304-kube-api-access-vfp84\") pod \"auto-csr-approver-29591338-bn5m5\" (UID: \"cb2fac17-ee5c-432b-a3b4-91294ca1e304\") " pod="openshift-infra/auto-csr-approver-29591338-bn5m5" Apr 06 12:58:00 crc kubenswrapper[4790]: I0406 12:58:00.425351 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfp84\" (UniqueName: \"kubernetes.io/projected/cb2fac17-ee5c-432b-a3b4-91294ca1e304-kube-api-access-vfp84\") pod \"auto-csr-approver-29591338-bn5m5\" (UID: \"cb2fac17-ee5c-432b-a3b4-91294ca1e304\") " 
pod="openshift-infra/auto-csr-approver-29591338-bn5m5" Apr 06 12:58:00 crc kubenswrapper[4790]: I0406 12:58:00.560119 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591338-bn5m5" Apr 06 12:58:01 crc kubenswrapper[4790]: I0406 12:58:01.036526 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591338-bn5m5"] Apr 06 12:58:01 crc kubenswrapper[4790]: I0406 12:58:01.202144 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591338-bn5m5" event={"ID":"cb2fac17-ee5c-432b-a3b4-91294ca1e304","Type":"ContainerStarted","Data":"8031a282701e093a88806bbfd764ea68ae3fb78733dbc910bd6211cb7765d21a"} Apr 06 12:58:02 crc kubenswrapper[4790]: E0406 12:58:02.752266 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb2fac17_ee5c_432b_a3b4_91294ca1e304.slice/crio-conmon-f6937211dc08b5049fd88a5a7fddb00b5252181ff99d549bd58145627d1df259.scope\": RecentStats: unable to find data in memory cache]" Apr 06 12:58:03 crc kubenswrapper[4790]: I0406 12:58:03.225525 4790 generic.go:334] "Generic (PLEG): container finished" podID="cb2fac17-ee5c-432b-a3b4-91294ca1e304" containerID="f6937211dc08b5049fd88a5a7fddb00b5252181ff99d549bd58145627d1df259" exitCode=0 Apr 06 12:58:03 crc kubenswrapper[4790]: I0406 12:58:03.225573 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591338-bn5m5" event={"ID":"cb2fac17-ee5c-432b-a3b4-91294ca1e304","Type":"ContainerDied","Data":"f6937211dc08b5049fd88a5a7fddb00b5252181ff99d549bd58145627d1df259"} Apr 06 12:58:04 crc kubenswrapper[4790]: I0406 12:58:04.704319 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591338-bn5m5" Apr 06 12:58:04 crc kubenswrapper[4790]: I0406 12:58:04.791862 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfp84\" (UniqueName: \"kubernetes.io/projected/cb2fac17-ee5c-432b-a3b4-91294ca1e304-kube-api-access-vfp84\") pod \"cb2fac17-ee5c-432b-a3b4-91294ca1e304\" (UID: \"cb2fac17-ee5c-432b-a3b4-91294ca1e304\") " Apr 06 12:58:04 crc kubenswrapper[4790]: I0406 12:58:04.822062 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb2fac17-ee5c-432b-a3b4-91294ca1e304-kube-api-access-vfp84" (OuterVolumeSpecName: "kube-api-access-vfp84") pod "cb2fac17-ee5c-432b-a3b4-91294ca1e304" (UID: "cb2fac17-ee5c-432b-a3b4-91294ca1e304"). InnerVolumeSpecName "kube-api-access-vfp84". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 12:58:04 crc kubenswrapper[4790]: I0406 12:58:04.896642 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfp84\" (UniqueName: \"kubernetes.io/projected/cb2fac17-ee5c-432b-a3b4-91294ca1e304-kube-api-access-vfp84\") on node \"crc\" DevicePath \"\"" Apr 06 12:58:05 crc kubenswrapper[4790]: I0406 12:58:05.253805 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591338-bn5m5" event={"ID":"cb2fac17-ee5c-432b-a3b4-91294ca1e304","Type":"ContainerDied","Data":"8031a282701e093a88806bbfd764ea68ae3fb78733dbc910bd6211cb7765d21a"} Apr 06 12:58:05 crc kubenswrapper[4790]: I0406 12:58:05.253903 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8031a282701e093a88806bbfd764ea68ae3fb78733dbc910bd6211cb7765d21a" Apr 06 12:58:05 crc kubenswrapper[4790]: I0406 12:58:05.254059 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591338-bn5m5" Apr 06 12:58:05 crc kubenswrapper[4790]: I0406 12:58:05.798762 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591332-fbpvh"] Apr 06 12:58:05 crc kubenswrapper[4790]: I0406 12:58:05.808066 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591332-fbpvh"] Apr 06 12:58:07 crc kubenswrapper[4790]: I0406 12:58:07.689699 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1809bcc-7651-4043-a338-e0f0d5bcf115" path="/var/lib/kubelet/pods/d1809bcc-7651-4043-a338-e0f0d5bcf115/volumes" Apr 06 12:58:09 crc kubenswrapper[4790]: I0406 12:58:09.753982 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 12:58:09 crc kubenswrapper[4790]: I0406 12:58:09.754486 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 12:58:09 crc kubenswrapper[4790]: I0406 12:58:09.754580 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 12:58:09 crc kubenswrapper[4790]: I0406 12:58:09.755745 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9097e1e91624dbbe1a443b99532574be113bdc06f0804576373fd2fb2754d2a"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 12:58:09 crc kubenswrapper[4790]: I0406 12:58:09.755878 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://d9097e1e91624dbbe1a443b99532574be113bdc06f0804576373fd2fb2754d2a" gracePeriod=600 Apr 06 12:58:10 crc kubenswrapper[4790]: I0406 12:58:10.303396 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="d9097e1e91624dbbe1a443b99532574be113bdc06f0804576373fd2fb2754d2a" exitCode=0 Apr 06 12:58:10 crc kubenswrapper[4790]: I0406 12:58:10.303464 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"d9097e1e91624dbbe1a443b99532574be113bdc06f0804576373fd2fb2754d2a"} Apr 06 12:58:10 crc kubenswrapper[4790]: I0406 12:58:10.303910 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba"} Apr 06 12:58:10 crc kubenswrapper[4790]: I0406 12:58:10.303953 4790 scope.go:117] "RemoveContainer" containerID="4dce85cfa8e77b5a028d2fb43ff898b4ec459385449d978591ed38ece4e70ef4" Apr 06 12:58:17 crc kubenswrapper[4790]: I0406 12:58:17.052156 4790 scope.go:117] "RemoveContainer" containerID="519b03e39ed8835d8031a2634b1c0ab325bec95d9f706006e4dca0fa3dd3ff80" Apr 06 12:59:41 crc kubenswrapper[4790]: I0406 12:59:41.904076 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-backup-0" podUID="cd529dba-04e1-45bf-9a0a-69fd93502cd9" containerName="cinder-backup" 
probeResult="failure" output="Get \"http://10.217.1.35:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.177635 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591340-9zdcs"] Apr 06 13:00:00 crc kubenswrapper[4790]: E0406 13:00:00.178773 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb2fac17-ee5c-432b-a3b4-91294ca1e304" containerName="oc" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.178789 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb2fac17-ee5c-432b-a3b4-91294ca1e304" containerName="oc" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.179079 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb2fac17-ee5c-432b-a3b4-91294ca1e304" containerName="oc" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.179938 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591340-9zdcs" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.183223 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.183809 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.184005 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.191779 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp"] Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.193676 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.195721 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.196224 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.201683 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591340-9zdcs"] Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.216373 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp"] Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.376291 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f69b634-b80e-4537-87d4-c5b827de18ac-secret-volume\") pod \"collect-profiles-29591340-grtzp\" (UID: \"2f69b634-b80e-4537-87d4-c5b827de18ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.376601 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn444\" (UniqueName: \"kubernetes.io/projected/543fadab-d27e-43d4-accd-0d3259e3f783-kube-api-access-nn444\") pod \"auto-csr-approver-29591340-9zdcs\" (UID: \"543fadab-d27e-43d4-accd-0d3259e3f783\") " pod="openshift-infra/auto-csr-approver-29591340-9zdcs" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.376646 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/2f69b634-b80e-4537-87d4-c5b827de18ac-config-volume\") pod \"collect-profiles-29591340-grtzp\" (UID: \"2f69b634-b80e-4537-87d4-c5b827de18ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.376850 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzq5d\" (UniqueName: \"kubernetes.io/projected/2f69b634-b80e-4537-87d4-c5b827de18ac-kube-api-access-rzq5d\") pod \"collect-profiles-29591340-grtzp\" (UID: \"2f69b634-b80e-4537-87d4-c5b827de18ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.479240 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzq5d\" (UniqueName: \"kubernetes.io/projected/2f69b634-b80e-4537-87d4-c5b827de18ac-kube-api-access-rzq5d\") pod \"collect-profiles-29591340-grtzp\" (UID: \"2f69b634-b80e-4537-87d4-c5b827de18ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.479783 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f69b634-b80e-4537-87d4-c5b827de18ac-secret-volume\") pod \"collect-profiles-29591340-grtzp\" (UID: \"2f69b634-b80e-4537-87d4-c5b827de18ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.479941 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn444\" (UniqueName: \"kubernetes.io/projected/543fadab-d27e-43d4-accd-0d3259e3f783-kube-api-access-nn444\") pod \"auto-csr-approver-29591340-9zdcs\" (UID: \"543fadab-d27e-43d4-accd-0d3259e3f783\") " pod="openshift-infra/auto-csr-approver-29591340-9zdcs" Apr 06 13:00:00 crc 
kubenswrapper[4790]: I0406 13:00:00.480114 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f69b634-b80e-4537-87d4-c5b827de18ac-config-volume\") pod \"collect-profiles-29591340-grtzp\" (UID: \"2f69b634-b80e-4537-87d4-c5b827de18ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.481318 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f69b634-b80e-4537-87d4-c5b827de18ac-config-volume\") pod \"collect-profiles-29591340-grtzp\" (UID: \"2f69b634-b80e-4537-87d4-c5b827de18ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.487799 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f69b634-b80e-4537-87d4-c5b827de18ac-secret-volume\") pod \"collect-profiles-29591340-grtzp\" (UID: \"2f69b634-b80e-4537-87d4-c5b827de18ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.504486 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzq5d\" (UniqueName: \"kubernetes.io/projected/2f69b634-b80e-4537-87d4-c5b827de18ac-kube-api-access-rzq5d\") pod \"collect-profiles-29591340-grtzp\" (UID: \"2f69b634-b80e-4537-87d4-c5b827de18ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.511343 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.516479 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn444\" (UniqueName: \"kubernetes.io/projected/543fadab-d27e-43d4-accd-0d3259e3f783-kube-api-access-nn444\") pod \"auto-csr-approver-29591340-9zdcs\" (UID: \"543fadab-d27e-43d4-accd-0d3259e3f783\") " pod="openshift-infra/auto-csr-approver-29591340-9zdcs" Apr 06 13:00:00 crc kubenswrapper[4790]: I0406 13:00:00.796758 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591340-9zdcs" Apr 06 13:00:01 crc kubenswrapper[4790]: I0406 13:00:01.000972 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp"] Apr 06 13:00:01 crc kubenswrapper[4790]: I0406 13:00:01.353267 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591340-9zdcs"] Apr 06 13:00:01 crc kubenswrapper[4790]: I0406 13:00:01.510895 4790 generic.go:334] "Generic (PLEG): container finished" podID="2f69b634-b80e-4537-87d4-c5b827de18ac" containerID="2bac2d1c345e91a26f90680ea25b8265f8c958cad66911048b3ff183ecf648fe" exitCode=0 Apr 06 13:00:01 crc kubenswrapper[4790]: I0406 13:00:01.511049 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" event={"ID":"2f69b634-b80e-4537-87d4-c5b827de18ac","Type":"ContainerDied","Data":"2bac2d1c345e91a26f90680ea25b8265f8c958cad66911048b3ff183ecf648fe"} Apr 06 13:00:01 crc kubenswrapper[4790]: I0406 13:00:01.511107 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" 
event={"ID":"2f69b634-b80e-4537-87d4-c5b827de18ac","Type":"ContainerStarted","Data":"37fe7e688044fd26b3735516fa684ba3a3b42cfd434f6cbaba8b8896e29334d0"} Apr 06 13:00:01 crc kubenswrapper[4790]: I0406 13:00:01.513002 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591340-9zdcs" event={"ID":"543fadab-d27e-43d4-accd-0d3259e3f783","Type":"ContainerStarted","Data":"83b344acba090d5712088d1f07091f44ad7556dfbad1fa66f43559d209ccf74a"} Apr 06 13:00:02 crc kubenswrapper[4790]: I0406 13:00:02.929655 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" Apr 06 13:00:03 crc kubenswrapper[4790]: I0406 13:00:03.038765 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzq5d\" (UniqueName: \"kubernetes.io/projected/2f69b634-b80e-4537-87d4-c5b827de18ac-kube-api-access-rzq5d\") pod \"2f69b634-b80e-4537-87d4-c5b827de18ac\" (UID: \"2f69b634-b80e-4537-87d4-c5b827de18ac\") " Apr 06 13:00:03 crc kubenswrapper[4790]: I0406 13:00:03.038908 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f69b634-b80e-4537-87d4-c5b827de18ac-secret-volume\") pod \"2f69b634-b80e-4537-87d4-c5b827de18ac\" (UID: \"2f69b634-b80e-4537-87d4-c5b827de18ac\") " Apr 06 13:00:03 crc kubenswrapper[4790]: I0406 13:00:03.039020 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f69b634-b80e-4537-87d4-c5b827de18ac-config-volume\") pod \"2f69b634-b80e-4537-87d4-c5b827de18ac\" (UID: \"2f69b634-b80e-4537-87d4-c5b827de18ac\") " Apr 06 13:00:03 crc kubenswrapper[4790]: I0406 13:00:03.039427 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f69b634-b80e-4537-87d4-c5b827de18ac-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "2f69b634-b80e-4537-87d4-c5b827de18ac" (UID: "2f69b634-b80e-4537-87d4-c5b827de18ac"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 13:00:03 crc kubenswrapper[4790]: I0406 13:00:03.039574 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f69b634-b80e-4537-87d4-c5b827de18ac-config-volume\") on node \"crc\" DevicePath \"\"" Apr 06 13:00:03 crc kubenswrapper[4790]: I0406 13:00:03.044667 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f69b634-b80e-4537-87d4-c5b827de18ac-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2f69b634-b80e-4537-87d4-c5b827de18ac" (UID: "2f69b634-b80e-4537-87d4-c5b827de18ac"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 13:00:03 crc kubenswrapper[4790]: I0406 13:00:03.045365 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f69b634-b80e-4537-87d4-c5b827de18ac-kube-api-access-rzq5d" (OuterVolumeSpecName: "kube-api-access-rzq5d") pod "2f69b634-b80e-4537-87d4-c5b827de18ac" (UID: "2f69b634-b80e-4537-87d4-c5b827de18ac"). InnerVolumeSpecName "kube-api-access-rzq5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:00:03 crc kubenswrapper[4790]: I0406 13:00:03.142118 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzq5d\" (UniqueName: \"kubernetes.io/projected/2f69b634-b80e-4537-87d4-c5b827de18ac-kube-api-access-rzq5d\") on node \"crc\" DevicePath \"\"" Apr 06 13:00:03 crc kubenswrapper[4790]: I0406 13:00:03.142177 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f69b634-b80e-4537-87d4-c5b827de18ac-secret-volume\") on node \"crc\" DevicePath \"\"" Apr 06 13:00:03 crc kubenswrapper[4790]: I0406 13:00:03.532678 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" event={"ID":"2f69b634-b80e-4537-87d4-c5b827de18ac","Type":"ContainerDied","Data":"37fe7e688044fd26b3735516fa684ba3a3b42cfd434f6cbaba8b8896e29334d0"} Apr 06 13:00:03 crc kubenswrapper[4790]: I0406 13:00:03.532977 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37fe7e688044fd26b3735516fa684ba3a3b42cfd434f6cbaba8b8896e29334d0" Apr 06 13:00:03 crc kubenswrapper[4790]: I0406 13:00:03.532722 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp" Apr 06 13:00:04 crc kubenswrapper[4790]: I0406 13:00:04.007105 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"] Apr 06 13:00:04 crc kubenswrapper[4790]: I0406 13:00:04.026164 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591295-h9xmv"] Apr 06 13:00:05 crc kubenswrapper[4790]: I0406 13:00:05.702184 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8d86720-ff28-4773-966f-21968cb6d7f4" path="/var/lib/kubelet/pods/f8d86720-ff28-4773-966f-21968cb6d7f4/volumes" Apr 06 13:00:17 crc kubenswrapper[4790]: I0406 13:00:17.173954 4790 scope.go:117] "RemoveContainer" containerID="c7251a5006b8db817acb0c3310f4fdc9420ce0adab47122519f0bab267d1de83" Apr 06 13:00:39 crc kubenswrapper[4790]: I0406 13:00:39.753227 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:00:39 crc kubenswrapper[4790]: I0406 13:00:39.753724 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:00:47 crc kubenswrapper[4790]: I0406 13:00:47.998760 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591340-9zdcs" event={"ID":"543fadab-d27e-43d4-accd-0d3259e3f783","Type":"ContainerStarted","Data":"492a93173d79e42647804670e4f4a3015f760cf96a3e09b9765e438b993f5ece"} Apr 06 
13:00:48 crc kubenswrapper[4790]: I0406 13:00:48.018015 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591340-9zdcs" podStartSLOduration=1.72443998 podStartE2EDuration="48.017996398s" podCreationTimestamp="2026-04-06 13:00:00 +0000 UTC" firstStartedPulling="2026-04-06 13:00:01.367581178 +0000 UTC m=+3780.355324044" lastFinishedPulling="2026-04-06 13:00:47.661137606 +0000 UTC m=+3826.648880462" observedRunningTime="2026-04-06 13:00:48.010639161 +0000 UTC m=+3826.998382037" watchObservedRunningTime="2026-04-06 13:00:48.017996398 +0000 UTC m=+3827.005739254" Apr 06 13:00:49 crc kubenswrapper[4790]: I0406 13:00:49.010879 4790 generic.go:334] "Generic (PLEG): container finished" podID="543fadab-d27e-43d4-accd-0d3259e3f783" containerID="492a93173d79e42647804670e4f4a3015f760cf96a3e09b9765e438b993f5ece" exitCode=0 Apr 06 13:00:49 crc kubenswrapper[4790]: I0406 13:00:49.010964 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591340-9zdcs" event={"ID":"543fadab-d27e-43d4-accd-0d3259e3f783","Type":"ContainerDied","Data":"492a93173d79e42647804670e4f4a3015f760cf96a3e09b9765e438b993f5ece"} Apr 06 13:00:50 crc kubenswrapper[4790]: I0406 13:00:50.467878 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591340-9zdcs" Apr 06 13:00:50 crc kubenswrapper[4790]: I0406 13:00:50.543212 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn444\" (UniqueName: \"kubernetes.io/projected/543fadab-d27e-43d4-accd-0d3259e3f783-kube-api-access-nn444\") pod \"543fadab-d27e-43d4-accd-0d3259e3f783\" (UID: \"543fadab-d27e-43d4-accd-0d3259e3f783\") " Apr 06 13:00:50 crc kubenswrapper[4790]: I0406 13:00:50.549493 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543fadab-d27e-43d4-accd-0d3259e3f783-kube-api-access-nn444" (OuterVolumeSpecName: "kube-api-access-nn444") pod "543fadab-d27e-43d4-accd-0d3259e3f783" (UID: "543fadab-d27e-43d4-accd-0d3259e3f783"). InnerVolumeSpecName "kube-api-access-nn444". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:00:50 crc kubenswrapper[4790]: I0406 13:00:50.646390 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn444\" (UniqueName: \"kubernetes.io/projected/543fadab-d27e-43d4-accd-0d3259e3f783-kube-api-access-nn444\") on node \"crc\" DevicePath \"\"" Apr 06 13:00:51 crc kubenswrapper[4790]: I0406 13:00:51.039366 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591340-9zdcs" event={"ID":"543fadab-d27e-43d4-accd-0d3259e3f783","Type":"ContainerDied","Data":"83b344acba090d5712088d1f07091f44ad7556dfbad1fa66f43559d209ccf74a"} Apr 06 13:00:51 crc kubenswrapper[4790]: I0406 13:00:51.039645 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83b344acba090d5712088d1f07091f44ad7556dfbad1fa66f43559d209ccf74a" Apr 06 13:00:51 crc kubenswrapper[4790]: I0406 13:00:51.039499 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591340-9zdcs" Apr 06 13:00:51 crc kubenswrapper[4790]: I0406 13:00:51.093758 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591334-8whrg"] Apr 06 13:00:51 crc kubenswrapper[4790]: I0406 13:00:51.108485 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591334-8whrg"] Apr 06 13:00:51 crc kubenswrapper[4790]: I0406 13:00:51.687369 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2e7b2ae-dcb3-4889-b512-be52cf460f33" path="/var/lib/kubelet/pods/d2e7b2ae-dcb3-4889-b512-be52cf460f33/volumes" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.168785 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29591341-jdpjh"] Apr 06 13:01:00 crc kubenswrapper[4790]: E0406 13:01:00.169927 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f69b634-b80e-4537-87d4-c5b827de18ac" containerName="collect-profiles" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.169948 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f69b634-b80e-4537-87d4-c5b827de18ac" containerName="collect-profiles" Apr 06 13:01:00 crc kubenswrapper[4790]: E0406 13:01:00.169974 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543fadab-d27e-43d4-accd-0d3259e3f783" containerName="oc" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.169984 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="543fadab-d27e-43d4-accd-0d3259e3f783" containerName="oc" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.170310 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="543fadab-d27e-43d4-accd-0d3259e3f783" containerName="oc" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.170333 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f69b634-b80e-4537-87d4-c5b827de18ac" 
containerName="collect-profiles" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.171480 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.184286 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29591341-jdpjh"] Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.241661 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-combined-ca-bundle\") pod \"keystone-cron-29591341-jdpjh\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.241794 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-fernet-keys\") pod \"keystone-cron-29591341-jdpjh\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.241896 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdbmp\" (UniqueName: \"kubernetes.io/projected/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-kube-api-access-jdbmp\") pod \"keystone-cron-29591341-jdpjh\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.242064 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-config-data\") pod \"keystone-cron-29591341-jdpjh\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " 
pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.344056 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-combined-ca-bundle\") pod \"keystone-cron-29591341-jdpjh\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.344132 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-fernet-keys\") pod \"keystone-cron-29591341-jdpjh\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.344171 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbmp\" (UniqueName: \"kubernetes.io/projected/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-kube-api-access-jdbmp\") pod \"keystone-cron-29591341-jdpjh\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.344231 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-config-data\") pod \"keystone-cron-29591341-jdpjh\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.353279 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-combined-ca-bundle\") pod \"keystone-cron-29591341-jdpjh\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 
13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.355507 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-fernet-keys\") pod \"keystone-cron-29591341-jdpjh\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.356016 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-config-data\") pod \"keystone-cron-29591341-jdpjh\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.367184 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdbmp\" (UniqueName: \"kubernetes.io/projected/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-kube-api-access-jdbmp\") pod \"keystone-cron-29591341-jdpjh\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.504143 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:00 crc kubenswrapper[4790]: I0406 13:01:00.990279 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29591341-jdpjh"] Apr 06 13:01:01 crc kubenswrapper[4790]: I0406 13:01:01.148813 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29591341-jdpjh" event={"ID":"7657bf4f-84d6-4cc0-97da-ac70e2aa07de","Type":"ContainerStarted","Data":"6d90f56b0093b47ba80d13daace29d99baf6f0eb660fafcd9547d57978ea62ca"} Apr 06 13:01:02 crc kubenswrapper[4790]: I0406 13:01:02.161111 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29591341-jdpjh" event={"ID":"7657bf4f-84d6-4cc0-97da-ac70e2aa07de","Type":"ContainerStarted","Data":"95fdb0bfc69dddf62811f0c31b66c2a26f3cc40506db08f98a337d16b9555aad"} Apr 06 13:01:02 crc kubenswrapper[4790]: I0406 13:01:02.183512 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29591341-jdpjh" podStartSLOduration=2.183494364 podStartE2EDuration="2.183494364s" podCreationTimestamp="2026-04-06 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 13:01:02.178726697 +0000 UTC m=+3841.166469573" watchObservedRunningTime="2026-04-06 13:01:02.183494364 +0000 UTC m=+3841.171237230" Apr 06 13:01:05 crc kubenswrapper[4790]: I0406 13:01:05.189067 4790 generic.go:334] "Generic (PLEG): container finished" podID="7657bf4f-84d6-4cc0-97da-ac70e2aa07de" containerID="95fdb0bfc69dddf62811f0c31b66c2a26f3cc40506db08f98a337d16b9555aad" exitCode=0 Apr 06 13:01:05 crc kubenswrapper[4790]: I0406 13:01:05.189159 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29591341-jdpjh" 
event={"ID":"7657bf4f-84d6-4cc0-97da-ac70e2aa07de","Type":"ContainerDied","Data":"95fdb0bfc69dddf62811f0c31b66c2a26f3cc40506db08f98a337d16b9555aad"} Apr 06 13:01:06 crc kubenswrapper[4790]: I0406 13:01:06.635068 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:06 crc kubenswrapper[4790]: I0406 13:01:06.799407 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-fernet-keys\") pod \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " Apr 06 13:01:06 crc kubenswrapper[4790]: I0406 13:01:06.799518 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-combined-ca-bundle\") pod \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " Apr 06 13:01:06 crc kubenswrapper[4790]: I0406 13:01:06.799575 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-config-data\") pod \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " Apr 06 13:01:06 crc kubenswrapper[4790]: I0406 13:01:06.799776 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdbmp\" (UniqueName: \"kubernetes.io/projected/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-kube-api-access-jdbmp\") pod \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\" (UID: \"7657bf4f-84d6-4cc0-97da-ac70e2aa07de\") " Apr 06 13:01:06 crc kubenswrapper[4790]: I0406 13:01:06.807726 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-kube-api-access-jdbmp" 
(OuterVolumeSpecName: "kube-api-access-jdbmp") pod "7657bf4f-84d6-4cc0-97da-ac70e2aa07de" (UID: "7657bf4f-84d6-4cc0-97da-ac70e2aa07de"). InnerVolumeSpecName "kube-api-access-jdbmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:01:06 crc kubenswrapper[4790]: I0406 13:01:06.809961 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7657bf4f-84d6-4cc0-97da-ac70e2aa07de" (UID: "7657bf4f-84d6-4cc0-97da-ac70e2aa07de"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 13:01:06 crc kubenswrapper[4790]: I0406 13:01:06.845760 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7657bf4f-84d6-4cc0-97da-ac70e2aa07de" (UID: "7657bf4f-84d6-4cc0-97da-ac70e2aa07de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 13:01:06 crc kubenswrapper[4790]: I0406 13:01:06.870060 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-config-data" (OuterVolumeSpecName: "config-data") pod "7657bf4f-84d6-4cc0-97da-ac70e2aa07de" (UID: "7657bf4f-84d6-4cc0-97da-ac70e2aa07de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 13:01:06 crc kubenswrapper[4790]: I0406 13:01:06.902690 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdbmp\" (UniqueName: \"kubernetes.io/projected/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-kube-api-access-jdbmp\") on node \"crc\" DevicePath \"\"" Apr 06 13:01:06 crc kubenswrapper[4790]: I0406 13:01:06.902725 4790 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-fernet-keys\") on node \"crc\" DevicePath \"\"" Apr 06 13:01:06 crc kubenswrapper[4790]: I0406 13:01:06.902736 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Apr 06 13:01:06 crc kubenswrapper[4790]: I0406 13:01:06.902744 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7657bf4f-84d6-4cc0-97da-ac70e2aa07de-config-data\") on node \"crc\" DevicePath \"\"" Apr 06 13:01:07 crc kubenswrapper[4790]: I0406 13:01:07.212016 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29591341-jdpjh" event={"ID":"7657bf4f-84d6-4cc0-97da-ac70e2aa07de","Type":"ContainerDied","Data":"6d90f56b0093b47ba80d13daace29d99baf6f0eb660fafcd9547d57978ea62ca"} Apr 06 13:01:07 crc kubenswrapper[4790]: I0406 13:01:07.212062 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d90f56b0093b47ba80d13daace29d99baf6f0eb660fafcd9547d57978ea62ca" Apr 06 13:01:07 crc kubenswrapper[4790]: I0406 13:01:07.212081 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29591341-jdpjh" Apr 06 13:01:09 crc kubenswrapper[4790]: I0406 13:01:09.753652 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:01:09 crc kubenswrapper[4790]: I0406 13:01:09.754051 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:01:17 crc kubenswrapper[4790]: I0406 13:01:17.249872 4790 scope.go:117] "RemoveContainer" containerID="9e0ab845231065729c631421c49f11a6c13a33755a088e90b84e2fbefbd06c21" Apr 06 13:01:39 crc kubenswrapper[4790]: I0406 13:01:39.753543 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:01:39 crc kubenswrapper[4790]: I0406 13:01:39.754368 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:01:39 crc kubenswrapper[4790]: I0406 13:01:39.754421 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 13:01:39 crc kubenswrapper[4790]: I0406 
13:01:39.755359 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 13:01:39 crc kubenswrapper[4790]: I0406 13:01:39.755463 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" gracePeriod=600 Apr 06 13:01:39 crc kubenswrapper[4790]: E0406 13:01:39.881839 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:01:39 crc kubenswrapper[4790]: I0406 13:01:39.912057 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" exitCode=0 Apr 06 13:01:39 crc kubenswrapper[4790]: I0406 13:01:39.912106 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba"} Apr 06 13:01:39 crc kubenswrapper[4790]: I0406 13:01:39.912142 4790 scope.go:117] "RemoveContainer" 
containerID="d9097e1e91624dbbe1a443b99532574be113bdc06f0804576373fd2fb2754d2a" Apr 06 13:01:39 crc kubenswrapper[4790]: I0406 13:01:39.912817 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:01:39 crc kubenswrapper[4790]: E0406 13:01:39.913092 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:01:54 crc kubenswrapper[4790]: I0406 13:01:54.675675 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:01:54 crc kubenswrapper[4790]: E0406 13:01:54.676784 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:02:00 crc kubenswrapper[4790]: I0406 13:02:00.150045 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591342-hj2fd"] Apr 06 13:02:00 crc kubenswrapper[4790]: E0406 13:02:00.151890 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7657bf4f-84d6-4cc0-97da-ac70e2aa07de" containerName="keystone-cron" Apr 06 13:02:00 crc kubenswrapper[4790]: I0406 13:02:00.151981 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7657bf4f-84d6-4cc0-97da-ac70e2aa07de" containerName="keystone-cron" Apr 06 
13:02:00 crc kubenswrapper[4790]: I0406 13:02:00.152268 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7657bf4f-84d6-4cc0-97da-ac70e2aa07de" containerName="keystone-cron" Apr 06 13:02:00 crc kubenswrapper[4790]: I0406 13:02:00.153024 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591342-hj2fd" Apr 06 13:02:00 crc kubenswrapper[4790]: I0406 13:02:00.154970 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:02:00 crc kubenswrapper[4790]: I0406 13:02:00.155178 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:02:00 crc kubenswrapper[4790]: I0406 13:02:00.155220 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:02:00 crc kubenswrapper[4790]: I0406 13:02:00.159183 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591342-hj2fd"] Apr 06 13:02:00 crc kubenswrapper[4790]: I0406 13:02:00.295087 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvtqt\" (UniqueName: \"kubernetes.io/projected/de6fdc37-78cb-4947-8493-7633660b6c40-kube-api-access-qvtqt\") pod \"auto-csr-approver-29591342-hj2fd\" (UID: \"de6fdc37-78cb-4947-8493-7633660b6c40\") " pod="openshift-infra/auto-csr-approver-29591342-hj2fd" Apr 06 13:02:00 crc kubenswrapper[4790]: I0406 13:02:00.397198 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvtqt\" (UniqueName: \"kubernetes.io/projected/de6fdc37-78cb-4947-8493-7633660b6c40-kube-api-access-qvtqt\") pod \"auto-csr-approver-29591342-hj2fd\" (UID: \"de6fdc37-78cb-4947-8493-7633660b6c40\") " pod="openshift-infra/auto-csr-approver-29591342-hj2fd" Apr 06 13:02:00 crc kubenswrapper[4790]: I0406 
13:02:00.420574 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvtqt\" (UniqueName: \"kubernetes.io/projected/de6fdc37-78cb-4947-8493-7633660b6c40-kube-api-access-qvtqt\") pod \"auto-csr-approver-29591342-hj2fd\" (UID: \"de6fdc37-78cb-4947-8493-7633660b6c40\") " pod="openshift-infra/auto-csr-approver-29591342-hj2fd" Apr 06 13:02:00 crc kubenswrapper[4790]: I0406 13:02:00.473739 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591342-hj2fd" Apr 06 13:02:00 crc kubenswrapper[4790]: I0406 13:02:00.940181 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591342-hj2fd"] Apr 06 13:02:00 crc kubenswrapper[4790]: I0406 13:02:00.947485 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 13:02:01 crc kubenswrapper[4790]: I0406 13:02:01.138003 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591342-hj2fd" event={"ID":"de6fdc37-78cb-4947-8493-7633660b6c40","Type":"ContainerStarted","Data":"ab01c03d2bc1f51f5deb88ef30d1fcdb63ee25fd4c4ebe55420704fde10dba95"} Apr 06 13:02:02 crc kubenswrapper[4790]: I0406 13:02:02.148893 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591342-hj2fd" event={"ID":"de6fdc37-78cb-4947-8493-7633660b6c40","Type":"ContainerStarted","Data":"1d1869fe96a8272933b4653c63b22e5e586dfe1fe7942c4e9a249f727886d3fe"} Apr 06 13:02:02 crc kubenswrapper[4790]: I0406 13:02:02.166196 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591342-hj2fd" podStartSLOduration=1.364772498 podStartE2EDuration="2.16617187s" podCreationTimestamp="2026-04-06 13:02:00 +0000 UTC" firstStartedPulling="2026-04-06 13:02:00.946157778 +0000 UTC m=+3899.933900684" lastFinishedPulling="2026-04-06 13:02:01.74755719 +0000 UTC 
m=+3900.735300056" observedRunningTime="2026-04-06 13:02:02.161086424 +0000 UTC m=+3901.148829300" watchObservedRunningTime="2026-04-06 13:02:02.16617187 +0000 UTC m=+3901.153914736" Apr 06 13:02:03 crc kubenswrapper[4790]: I0406 13:02:03.162221 4790 generic.go:334] "Generic (PLEG): container finished" podID="de6fdc37-78cb-4947-8493-7633660b6c40" containerID="1d1869fe96a8272933b4653c63b22e5e586dfe1fe7942c4e9a249f727886d3fe" exitCode=0 Apr 06 13:02:03 crc kubenswrapper[4790]: I0406 13:02:03.162275 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591342-hj2fd" event={"ID":"de6fdc37-78cb-4947-8493-7633660b6c40","Type":"ContainerDied","Data":"1d1869fe96a8272933b4653c63b22e5e586dfe1fe7942c4e9a249f727886d3fe"} Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.107440 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5zf75"] Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.110183 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.122592 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5zf75"] Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.283455 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-utilities\") pod \"certified-operators-5zf75\" (UID: \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\") " pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.283788 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw7sv\" (UniqueName: \"kubernetes.io/projected/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-kube-api-access-tw7sv\") pod \"certified-operators-5zf75\" (UID: \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\") " pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.284043 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-catalog-content\") pod \"certified-operators-5zf75\" (UID: \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\") " pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.386521 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-catalog-content\") pod \"certified-operators-5zf75\" (UID: \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\") " pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.386642 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-utilities\") pod \"certified-operators-5zf75\" (UID: \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\") " pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.386669 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw7sv\" (UniqueName: \"kubernetes.io/projected/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-kube-api-access-tw7sv\") pod \"certified-operators-5zf75\" (UID: \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\") " pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.387238 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-catalog-content\") pod \"certified-operators-5zf75\" (UID: \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\") " pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.387343 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-utilities\") pod \"certified-operators-5zf75\" (UID: \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\") " pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.416861 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw7sv\" (UniqueName: \"kubernetes.io/projected/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-kube-api-access-tw7sv\") pod \"certified-operators-5zf75\" (UID: \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\") " pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.476247 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.753040 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591342-hj2fd" Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.906371 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvtqt\" (UniqueName: \"kubernetes.io/projected/de6fdc37-78cb-4947-8493-7633660b6c40-kube-api-access-qvtqt\") pod \"de6fdc37-78cb-4947-8493-7633660b6c40\" (UID: \"de6fdc37-78cb-4947-8493-7633660b6c40\") " Apr 06 13:02:04 crc kubenswrapper[4790]: I0406 13:02:04.914915 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6fdc37-78cb-4947-8493-7633660b6c40-kube-api-access-qvtqt" (OuterVolumeSpecName: "kube-api-access-qvtqt") pod "de6fdc37-78cb-4947-8493-7633660b6c40" (UID: "de6fdc37-78cb-4947-8493-7633660b6c40"). InnerVolumeSpecName "kube-api-access-qvtqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:02:05 crc kubenswrapper[4790]: I0406 13:02:05.010020 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvtqt\" (UniqueName: \"kubernetes.io/projected/de6fdc37-78cb-4947-8493-7633660b6c40-kube-api-access-qvtqt\") on node \"crc\" DevicePath \"\"" Apr 06 13:02:05 crc kubenswrapper[4790]: I0406 13:02:05.083902 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5zf75"] Apr 06 13:02:05 crc kubenswrapper[4790]: I0406 13:02:05.191076 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zf75" event={"ID":"7c3d555b-9cb3-4358-80dc-dbe8b80f409b","Type":"ContainerStarted","Data":"f019551b665608bade5c41bbf3f46ee18705381c2a42f80e4a6264bb5fcfed34"} Apr 06 13:02:05 crc kubenswrapper[4790]: I0406 13:02:05.192564 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591342-hj2fd" event={"ID":"de6fdc37-78cb-4947-8493-7633660b6c40","Type":"ContainerDied","Data":"ab01c03d2bc1f51f5deb88ef30d1fcdb63ee25fd4c4ebe55420704fde10dba95"} Apr 06 13:02:05 crc kubenswrapper[4790]: I0406 13:02:05.192585 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab01c03d2bc1f51f5deb88ef30d1fcdb63ee25fd4c4ebe55420704fde10dba95" Apr 06 13:02:05 crc kubenswrapper[4790]: I0406 13:02:05.192630 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591342-hj2fd" Apr 06 13:02:05 crc kubenswrapper[4790]: I0406 13:02:05.676650 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:02:05 crc kubenswrapper[4790]: E0406 13:02:05.677801 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:02:05 crc kubenswrapper[4790]: I0406 13:02:05.840806 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591336-q227b"] Apr 06 13:02:05 crc kubenswrapper[4790]: I0406 13:02:05.849687 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591336-q227b"] Apr 06 13:02:06 crc kubenswrapper[4790]: I0406 13:02:06.205221 4790 generic.go:334] "Generic (PLEG): container finished" podID="7c3d555b-9cb3-4358-80dc-dbe8b80f409b" containerID="b1977daa54555e45c510ad1ae0597f145432535ef87f05b2e1e4d8165a8bb41c" exitCode=0 Apr 06 13:02:06 crc kubenswrapper[4790]: I0406 13:02:06.205262 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zf75" event={"ID":"7c3d555b-9cb3-4358-80dc-dbe8b80f409b","Type":"ContainerDied","Data":"b1977daa54555e45c510ad1ae0597f145432535ef87f05b2e1e4d8165a8bb41c"} Apr 06 13:02:07 crc kubenswrapper[4790]: I0406 13:02:07.217867 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zf75" 
event={"ID":"7c3d555b-9cb3-4358-80dc-dbe8b80f409b","Type":"ContainerStarted","Data":"367b47f8aaefa53a11465760608c0316e5b5e2f873a913175d00507ba122c569"} Apr 06 13:02:07 crc kubenswrapper[4790]: I0406 13:02:07.692292 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20198d9e-7649-42d5-89e9-9bd1a73f712f" path="/var/lib/kubelet/pods/20198d9e-7649-42d5-89e9-9bd1a73f712f/volumes" Apr 06 13:02:09 crc kubenswrapper[4790]: I0406 13:02:09.250487 4790 generic.go:334] "Generic (PLEG): container finished" podID="7c3d555b-9cb3-4358-80dc-dbe8b80f409b" containerID="367b47f8aaefa53a11465760608c0316e5b5e2f873a913175d00507ba122c569" exitCode=0 Apr 06 13:02:09 crc kubenswrapper[4790]: I0406 13:02:09.250602 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zf75" event={"ID":"7c3d555b-9cb3-4358-80dc-dbe8b80f409b","Type":"ContainerDied","Data":"367b47f8aaefa53a11465760608c0316e5b5e2f873a913175d00507ba122c569"} Apr 06 13:02:10 crc kubenswrapper[4790]: I0406 13:02:10.268727 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zf75" event={"ID":"7c3d555b-9cb3-4358-80dc-dbe8b80f409b","Type":"ContainerStarted","Data":"66ef03a19ba62279f8aa6dd0f0d13784f219d3e997616022567050790404a394"} Apr 06 13:02:10 crc kubenswrapper[4790]: I0406 13:02:10.300417 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5zf75" podStartSLOduration=2.888078511 podStartE2EDuration="6.30039435s" podCreationTimestamp="2026-04-06 13:02:04 +0000 UTC" firstStartedPulling="2026-04-06 13:02:06.208082571 +0000 UTC m=+3905.195825437" lastFinishedPulling="2026-04-06 13:02:09.62039841 +0000 UTC m=+3908.608141276" observedRunningTime="2026-04-06 13:02:10.291573394 +0000 UTC m=+3909.279316260" watchObservedRunningTime="2026-04-06 13:02:10.30039435 +0000 UTC m=+3909.288137216" Apr 06 13:02:14 crc kubenswrapper[4790]: I0406 13:02:14.476735 
4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:14 crc kubenswrapper[4790]: I0406 13:02:14.477480 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:14 crc kubenswrapper[4790]: I0406 13:02:14.529520 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:15 crc kubenswrapper[4790]: I0406 13:02:15.410216 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:15 crc kubenswrapper[4790]: I0406 13:02:15.478107 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5zf75"] Apr 06 13:02:16 crc kubenswrapper[4790]: I0406 13:02:16.675882 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:02:16 crc kubenswrapper[4790]: E0406 13:02:16.677165 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:02:17 crc kubenswrapper[4790]: I0406 13:02:17.345707 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5zf75" podUID="7c3d555b-9cb3-4358-80dc-dbe8b80f409b" containerName="registry-server" containerID="cri-o://66ef03a19ba62279f8aa6dd0f0d13784f219d3e997616022567050790404a394" gracePeriod=2 Apr 06 13:02:17 crc kubenswrapper[4790]: I0406 13:02:17.366988 4790 scope.go:117] 
"RemoveContainer" containerID="3612c943182954b7ed86f0b570cab2e2ad5458abc9d6e783b7659e722e9b23ed" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.008127 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.193145 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-catalog-content\") pod \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\" (UID: \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\") " Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.193560 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-utilities\") pod \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\" (UID: \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\") " Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.193703 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw7sv\" (UniqueName: \"kubernetes.io/projected/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-kube-api-access-tw7sv\") pod \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\" (UID: \"7c3d555b-9cb3-4358-80dc-dbe8b80f409b\") " Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.194224 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-utilities" (OuterVolumeSpecName: "utilities") pod "7c3d555b-9cb3-4358-80dc-dbe8b80f409b" (UID: "7c3d555b-9cb3-4358-80dc-dbe8b80f409b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.194612 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.205024 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-kube-api-access-tw7sv" (OuterVolumeSpecName: "kube-api-access-tw7sv") pod "7c3d555b-9cb3-4358-80dc-dbe8b80f409b" (UID: "7c3d555b-9cb3-4358-80dc-dbe8b80f409b"). InnerVolumeSpecName "kube-api-access-tw7sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.244275 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c3d555b-9cb3-4358-80dc-dbe8b80f409b" (UID: "7c3d555b-9cb3-4358-80dc-dbe8b80f409b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.296436 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.296676 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw7sv\" (UniqueName: \"kubernetes.io/projected/7c3d555b-9cb3-4358-80dc-dbe8b80f409b-kube-api-access-tw7sv\") on node \"crc\" DevicePath \"\"" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.356231 4790 generic.go:334] "Generic (PLEG): container finished" podID="7c3d555b-9cb3-4358-80dc-dbe8b80f409b" containerID="66ef03a19ba62279f8aa6dd0f0d13784f219d3e997616022567050790404a394" exitCode=0 Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.356285 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zf75" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.356284 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zf75" event={"ID":"7c3d555b-9cb3-4358-80dc-dbe8b80f409b","Type":"ContainerDied","Data":"66ef03a19ba62279f8aa6dd0f0d13784f219d3e997616022567050790404a394"} Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.356434 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zf75" event={"ID":"7c3d555b-9cb3-4358-80dc-dbe8b80f409b","Type":"ContainerDied","Data":"f019551b665608bade5c41bbf3f46ee18705381c2a42f80e4a6264bb5fcfed34"} Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.356469 4790 scope.go:117] "RemoveContainer" containerID="66ef03a19ba62279f8aa6dd0f0d13784f219d3e997616022567050790404a394" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.389111 4790 scope.go:117] "RemoveContainer" 
containerID="367b47f8aaefa53a11465760608c0316e5b5e2f873a913175d00507ba122c569" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.398626 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5zf75"] Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.413468 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5zf75"] Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.425522 4790 scope.go:117] "RemoveContainer" containerID="b1977daa54555e45c510ad1ae0597f145432535ef87f05b2e1e4d8165a8bb41c" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.470033 4790 scope.go:117] "RemoveContainer" containerID="66ef03a19ba62279f8aa6dd0f0d13784f219d3e997616022567050790404a394" Apr 06 13:02:18 crc kubenswrapper[4790]: E0406 13:02:18.470493 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66ef03a19ba62279f8aa6dd0f0d13784f219d3e997616022567050790404a394\": container with ID starting with 66ef03a19ba62279f8aa6dd0f0d13784f219d3e997616022567050790404a394 not found: ID does not exist" containerID="66ef03a19ba62279f8aa6dd0f0d13784f219d3e997616022567050790404a394" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.470618 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66ef03a19ba62279f8aa6dd0f0d13784f219d3e997616022567050790404a394"} err="failed to get container status \"66ef03a19ba62279f8aa6dd0f0d13784f219d3e997616022567050790404a394\": rpc error: code = NotFound desc = could not find container \"66ef03a19ba62279f8aa6dd0f0d13784f219d3e997616022567050790404a394\": container with ID starting with 66ef03a19ba62279f8aa6dd0f0d13784f219d3e997616022567050790404a394 not found: ID does not exist" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.470715 4790 scope.go:117] "RemoveContainer" 
containerID="367b47f8aaefa53a11465760608c0316e5b5e2f873a913175d00507ba122c569" Apr 06 13:02:18 crc kubenswrapper[4790]: E0406 13:02:18.471068 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"367b47f8aaefa53a11465760608c0316e5b5e2f873a913175d00507ba122c569\": container with ID starting with 367b47f8aaefa53a11465760608c0316e5b5e2f873a913175d00507ba122c569 not found: ID does not exist" containerID="367b47f8aaefa53a11465760608c0316e5b5e2f873a913175d00507ba122c569" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.471090 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"367b47f8aaefa53a11465760608c0316e5b5e2f873a913175d00507ba122c569"} err="failed to get container status \"367b47f8aaefa53a11465760608c0316e5b5e2f873a913175d00507ba122c569\": rpc error: code = NotFound desc = could not find container \"367b47f8aaefa53a11465760608c0316e5b5e2f873a913175d00507ba122c569\": container with ID starting with 367b47f8aaefa53a11465760608c0316e5b5e2f873a913175d00507ba122c569 not found: ID does not exist" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.471105 4790 scope.go:117] "RemoveContainer" containerID="b1977daa54555e45c510ad1ae0597f145432535ef87f05b2e1e4d8165a8bb41c" Apr 06 13:02:18 crc kubenswrapper[4790]: E0406 13:02:18.471352 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1977daa54555e45c510ad1ae0597f145432535ef87f05b2e1e4d8165a8bb41c\": container with ID starting with b1977daa54555e45c510ad1ae0597f145432535ef87f05b2e1e4d8165a8bb41c not found: ID does not exist" containerID="b1977daa54555e45c510ad1ae0597f145432535ef87f05b2e1e4d8165a8bb41c" Apr 06 13:02:18 crc kubenswrapper[4790]: I0406 13:02:18.471380 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b1977daa54555e45c510ad1ae0597f145432535ef87f05b2e1e4d8165a8bb41c"} err="failed to get container status \"b1977daa54555e45c510ad1ae0597f145432535ef87f05b2e1e4d8165a8bb41c\": rpc error: code = NotFound desc = could not find container \"b1977daa54555e45c510ad1ae0597f145432535ef87f05b2e1e4d8165a8bb41c\": container with ID starting with b1977daa54555e45c510ad1ae0597f145432535ef87f05b2e1e4d8165a8bb41c not found: ID does not exist" Apr 06 13:02:19 crc kubenswrapper[4790]: I0406 13:02:19.692638 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c3d555b-9cb3-4358-80dc-dbe8b80f409b" path="/var/lib/kubelet/pods/7c3d555b-9cb3-4358-80dc-dbe8b80f409b/volumes" Apr 06 13:02:30 crc kubenswrapper[4790]: I0406 13:02:30.675937 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:02:30 crc kubenswrapper[4790]: E0406 13:02:30.676621 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:02:45 crc kubenswrapper[4790]: I0406 13:02:45.676921 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:02:45 crc kubenswrapper[4790]: E0406 13:02:45.677869 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:02:58 crc kubenswrapper[4790]: I0406 13:02:58.676297 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:02:58 crc kubenswrapper[4790]: E0406 13:02:58.677146 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.595755 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wjjg5"] Apr 06 13:03:03 crc kubenswrapper[4790]: E0406 13:03:03.597102 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3d555b-9cb3-4358-80dc-dbe8b80f409b" containerName="registry-server" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.597120 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3d555b-9cb3-4358-80dc-dbe8b80f409b" containerName="registry-server" Apr 06 13:03:03 crc kubenswrapper[4790]: E0406 13:03:03.597157 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6fdc37-78cb-4947-8493-7633660b6c40" containerName="oc" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.597165 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6fdc37-78cb-4947-8493-7633660b6c40" containerName="oc" Apr 06 13:03:03 crc kubenswrapper[4790]: E0406 13:03:03.597195 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3d555b-9cb3-4358-80dc-dbe8b80f409b" containerName="extract-content" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.597202 4790 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3d555b-9cb3-4358-80dc-dbe8b80f409b" containerName="extract-content" Apr 06 13:03:03 crc kubenswrapper[4790]: E0406 13:03:03.597213 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3d555b-9cb3-4358-80dc-dbe8b80f409b" containerName="extract-utilities" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.597223 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3d555b-9cb3-4358-80dc-dbe8b80f409b" containerName="extract-utilities" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.597482 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c3d555b-9cb3-4358-80dc-dbe8b80f409b" containerName="registry-server" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.597494 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6fdc37-78cb-4947-8493-7633660b6c40" containerName="oc" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.599456 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.607527 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wjjg5"] Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.690424 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3907de-aa66-4764-ba08-54e9c53c052d-utilities\") pod \"redhat-operators-wjjg5\" (UID: \"9e3907de-aa66-4764-ba08-54e9c53c052d\") " pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.690491 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs6l2\" (UniqueName: \"kubernetes.io/projected/9e3907de-aa66-4764-ba08-54e9c53c052d-kube-api-access-zs6l2\") pod \"redhat-operators-wjjg5\" (UID: \"9e3907de-aa66-4764-ba08-54e9c53c052d\") " pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.690549 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3907de-aa66-4764-ba08-54e9c53c052d-catalog-content\") pod \"redhat-operators-wjjg5\" (UID: \"9e3907de-aa66-4764-ba08-54e9c53c052d\") " pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.793301 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3907de-aa66-4764-ba08-54e9c53c052d-utilities\") pod \"redhat-operators-wjjg5\" (UID: \"9e3907de-aa66-4764-ba08-54e9c53c052d\") " pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.793481 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zs6l2\" (UniqueName: \"kubernetes.io/projected/9e3907de-aa66-4764-ba08-54e9c53c052d-kube-api-access-zs6l2\") pod \"redhat-operators-wjjg5\" (UID: \"9e3907de-aa66-4764-ba08-54e9c53c052d\") " pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.794056 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3907de-aa66-4764-ba08-54e9c53c052d-utilities\") pod \"redhat-operators-wjjg5\" (UID: \"9e3907de-aa66-4764-ba08-54e9c53c052d\") " pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.794318 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3907de-aa66-4764-ba08-54e9c53c052d-catalog-content\") pod \"redhat-operators-wjjg5\" (UID: \"9e3907de-aa66-4764-ba08-54e9c53c052d\") " pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.794894 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3907de-aa66-4764-ba08-54e9c53c052d-catalog-content\") pod \"redhat-operators-wjjg5\" (UID: \"9e3907de-aa66-4764-ba08-54e9c53c052d\") " pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.824252 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs6l2\" (UniqueName: \"kubernetes.io/projected/9e3907de-aa66-4764-ba08-54e9c53c052d-kube-api-access-zs6l2\") pod \"redhat-operators-wjjg5\" (UID: \"9e3907de-aa66-4764-ba08-54e9c53c052d\") " pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:03 crc kubenswrapper[4790]: I0406 13:03:03.922182 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:04 crc kubenswrapper[4790]: I0406 13:03:04.460643 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wjjg5"] Apr 06 13:03:04 crc kubenswrapper[4790]: I0406 13:03:04.870406 4790 generic.go:334] "Generic (PLEG): container finished" podID="9e3907de-aa66-4764-ba08-54e9c53c052d" containerID="f699b58c76052ad9279ef1a78dae83b2b5eb418187d8c2cb3e65a666d61a978f" exitCode=0 Apr 06 13:03:04 crc kubenswrapper[4790]: I0406 13:03:04.870708 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjjg5" event={"ID":"9e3907de-aa66-4764-ba08-54e9c53c052d","Type":"ContainerDied","Data":"f699b58c76052ad9279ef1a78dae83b2b5eb418187d8c2cb3e65a666d61a978f"} Apr 06 13:03:04 crc kubenswrapper[4790]: I0406 13:03:04.870738 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjjg5" event={"ID":"9e3907de-aa66-4764-ba08-54e9c53c052d","Type":"ContainerStarted","Data":"b56e3b68a302d68124a2b4f448aba1e3fa6e85570cc43291eb396d46f2072ac7"} Apr 06 13:03:06 crc kubenswrapper[4790]: I0406 13:03:06.897589 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjjg5" event={"ID":"9e3907de-aa66-4764-ba08-54e9c53c052d","Type":"ContainerStarted","Data":"a4b1e0dcbcc51cf4dc0715f0a17d07da64e99b871d66b2a449643c79bdf0a84e"} Apr 06 13:03:10 crc kubenswrapper[4790]: I0406 13:03:10.932516 4790 generic.go:334] "Generic (PLEG): container finished" podID="9e3907de-aa66-4764-ba08-54e9c53c052d" containerID="a4b1e0dcbcc51cf4dc0715f0a17d07da64e99b871d66b2a449643c79bdf0a84e" exitCode=0 Apr 06 13:03:10 crc kubenswrapper[4790]: I0406 13:03:10.932584 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjjg5" 
event={"ID":"9e3907de-aa66-4764-ba08-54e9c53c052d","Type":"ContainerDied","Data":"a4b1e0dcbcc51cf4dc0715f0a17d07da64e99b871d66b2a449643c79bdf0a84e"} Apr 06 13:03:11 crc kubenswrapper[4790]: I0406 13:03:11.949673 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjjg5" event={"ID":"9e3907de-aa66-4764-ba08-54e9c53c052d","Type":"ContainerStarted","Data":"740551f1eea10d7842b33c11011ccb6ee12aff1d017183c041f7394397d55d98"} Apr 06 13:03:11 crc kubenswrapper[4790]: I0406 13:03:11.978449 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wjjg5" podStartSLOduration=2.552739328 podStartE2EDuration="8.97843139s" podCreationTimestamp="2026-04-06 13:03:03 +0000 UTC" firstStartedPulling="2026-04-06 13:03:04.872496401 +0000 UTC m=+3963.860239267" lastFinishedPulling="2026-04-06 13:03:11.298188463 +0000 UTC m=+3970.285931329" observedRunningTime="2026-04-06 13:03:11.969357978 +0000 UTC m=+3970.957100864" watchObservedRunningTime="2026-04-06 13:03:11.97843139 +0000 UTC m=+3970.966174246" Apr 06 13:03:13 crc kubenswrapper[4790]: I0406 13:03:13.675916 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:03:13 crc kubenswrapper[4790]: E0406 13:03:13.676416 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:03:13 crc kubenswrapper[4790]: I0406 13:03:13.923271 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:13 crc 
kubenswrapper[4790]: I0406 13:03:13.923382 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:14 crc kubenswrapper[4790]: I0406 13:03:14.980090 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wjjg5" podUID="9e3907de-aa66-4764-ba08-54e9c53c052d" containerName="registry-server" probeResult="failure" output=< Apr 06 13:03:14 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 13:03:14 crc kubenswrapper[4790]: > Apr 06 13:03:19 crc kubenswrapper[4790]: I0406 13:03:19.436879 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xzgjh"] Apr 06 13:03:19 crc kubenswrapper[4790]: I0406 13:03:19.442324 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzgjh" Apr 06 13:03:19 crc kubenswrapper[4790]: I0406 13:03:19.459543 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzgjh"] Apr 06 13:03:19 crc kubenswrapper[4790]: I0406 13:03:19.574394 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27f8290d-1d23-4022-a48b-49972f3a1dcd-catalog-content\") pod \"redhat-marketplace-xzgjh\" (UID: \"27f8290d-1d23-4022-a48b-49972f3a1dcd\") " pod="openshift-marketplace/redhat-marketplace-xzgjh" Apr 06 13:03:19 crc kubenswrapper[4790]: I0406 13:03:19.574458 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27f8290d-1d23-4022-a48b-49972f3a1dcd-utilities\") pod \"redhat-marketplace-xzgjh\" (UID: \"27f8290d-1d23-4022-a48b-49972f3a1dcd\") " pod="openshift-marketplace/redhat-marketplace-xzgjh" Apr 06 13:03:19 crc kubenswrapper[4790]: I0406 13:03:19.574619 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c28l5\" (UniqueName: \"kubernetes.io/projected/27f8290d-1d23-4022-a48b-49972f3a1dcd-kube-api-access-c28l5\") pod \"redhat-marketplace-xzgjh\" (UID: \"27f8290d-1d23-4022-a48b-49972f3a1dcd\") " pod="openshift-marketplace/redhat-marketplace-xzgjh" Apr 06 13:03:19 crc kubenswrapper[4790]: I0406 13:03:19.676002 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c28l5\" (UniqueName: \"kubernetes.io/projected/27f8290d-1d23-4022-a48b-49972f3a1dcd-kube-api-access-c28l5\") pod \"redhat-marketplace-xzgjh\" (UID: \"27f8290d-1d23-4022-a48b-49972f3a1dcd\") " pod="openshift-marketplace/redhat-marketplace-xzgjh" Apr 06 13:03:19 crc kubenswrapper[4790]: I0406 13:03:19.676329 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27f8290d-1d23-4022-a48b-49972f3a1dcd-catalog-content\") pod \"redhat-marketplace-xzgjh\" (UID: \"27f8290d-1d23-4022-a48b-49972f3a1dcd\") " pod="openshift-marketplace/redhat-marketplace-xzgjh" Apr 06 13:03:19 crc kubenswrapper[4790]: I0406 13:03:19.676454 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27f8290d-1d23-4022-a48b-49972f3a1dcd-utilities\") pod \"redhat-marketplace-xzgjh\" (UID: \"27f8290d-1d23-4022-a48b-49972f3a1dcd\") " pod="openshift-marketplace/redhat-marketplace-xzgjh" Apr 06 13:03:19 crc kubenswrapper[4790]: I0406 13:03:19.676986 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27f8290d-1d23-4022-a48b-49972f3a1dcd-catalog-content\") pod \"redhat-marketplace-xzgjh\" (UID: \"27f8290d-1d23-4022-a48b-49972f3a1dcd\") " pod="openshift-marketplace/redhat-marketplace-xzgjh" Apr 06 13:03:19 crc kubenswrapper[4790]: I0406 13:03:19.677012 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27f8290d-1d23-4022-a48b-49972f3a1dcd-utilities\") pod \"redhat-marketplace-xzgjh\" (UID: \"27f8290d-1d23-4022-a48b-49972f3a1dcd\") " pod="openshift-marketplace/redhat-marketplace-xzgjh" Apr 06 13:03:19 crc kubenswrapper[4790]: I0406 13:03:19.706205 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c28l5\" (UniqueName: \"kubernetes.io/projected/27f8290d-1d23-4022-a48b-49972f3a1dcd-kube-api-access-c28l5\") pod \"redhat-marketplace-xzgjh\" (UID: \"27f8290d-1d23-4022-a48b-49972f3a1dcd\") " pod="openshift-marketplace/redhat-marketplace-xzgjh" Apr 06 13:03:19 crc kubenswrapper[4790]: I0406 13:03:19.768235 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzgjh" Apr 06 13:03:20 crc kubenswrapper[4790]: I0406 13:03:20.316273 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzgjh"] Apr 06 13:03:21 crc kubenswrapper[4790]: I0406 13:03:21.046207 4790 generic.go:334] "Generic (PLEG): container finished" podID="27f8290d-1d23-4022-a48b-49972f3a1dcd" containerID="2e82cc89590408d2e89c09b66a8a269eef58d161002356716109c0cdaa96a46e" exitCode=0 Apr 06 13:03:21 crc kubenswrapper[4790]: I0406 13:03:21.046510 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzgjh" event={"ID":"27f8290d-1d23-4022-a48b-49972f3a1dcd","Type":"ContainerDied","Data":"2e82cc89590408d2e89c09b66a8a269eef58d161002356716109c0cdaa96a46e"} Apr 06 13:03:21 crc kubenswrapper[4790]: I0406 13:03:21.046535 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzgjh" event={"ID":"27f8290d-1d23-4022-a48b-49972f3a1dcd","Type":"ContainerStarted","Data":"3d2dd218cafc7bf717a23eb8a5ffb4267d55706a5c972dbcfa0fa53becc9d1dd"} Apr 06 13:03:22 crc 
kubenswrapper[4790]: I0406 13:03:22.068625 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzgjh" event={"ID":"27f8290d-1d23-4022-a48b-49972f3a1dcd","Type":"ContainerStarted","Data":"01530abe7de8c26dac571fd54fdbcdf6a34bc7fe2fadea5bd2e0ef46e268e336"} Apr 06 13:03:23 crc kubenswrapper[4790]: I0406 13:03:23.092151 4790 generic.go:334] "Generic (PLEG): container finished" podID="27f8290d-1d23-4022-a48b-49972f3a1dcd" containerID="01530abe7de8c26dac571fd54fdbcdf6a34bc7fe2fadea5bd2e0ef46e268e336" exitCode=0 Apr 06 13:03:23 crc kubenswrapper[4790]: I0406 13:03:23.092220 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzgjh" event={"ID":"27f8290d-1d23-4022-a48b-49972f3a1dcd","Type":"ContainerDied","Data":"01530abe7de8c26dac571fd54fdbcdf6a34bc7fe2fadea5bd2e0ef46e268e336"} Apr 06 13:03:23 crc kubenswrapper[4790]: I0406 13:03:23.975261 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:24 crc kubenswrapper[4790]: I0406 13:03:24.027544 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:24 crc kubenswrapper[4790]: I0406 13:03:24.106558 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzgjh" event={"ID":"27f8290d-1d23-4022-a48b-49972f3a1dcd","Type":"ContainerStarted","Data":"f1e3e784cfbd631fcf196ac62f58eb1fece2cb11f9c7210c585f0f6376e1d16c"} Apr 06 13:03:24 crc kubenswrapper[4790]: I0406 13:03:24.130578 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xzgjh" podStartSLOduration=2.695261966 podStartE2EDuration="5.130527472s" podCreationTimestamp="2026-04-06 13:03:19 +0000 UTC" firstStartedPulling="2026-04-06 13:03:21.048182405 +0000 UTC m=+3980.035925271" lastFinishedPulling="2026-04-06 
13:03:23.483447911 +0000 UTC m=+3982.471190777" observedRunningTime="2026-04-06 13:03:24.123463053 +0000 UTC m=+3983.111205919" watchObservedRunningTime="2026-04-06 13:03:24.130527472 +0000 UTC m=+3983.118270338" Apr 06 13:03:26 crc kubenswrapper[4790]: I0406 13:03:26.224190 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wjjg5"] Apr 06 13:03:26 crc kubenswrapper[4790]: I0406 13:03:26.224687 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wjjg5" podUID="9e3907de-aa66-4764-ba08-54e9c53c052d" containerName="registry-server" containerID="cri-o://740551f1eea10d7842b33c11011ccb6ee12aff1d017183c041f7394397d55d98" gracePeriod=2 Apr 06 13:03:26 crc kubenswrapper[4790]: I0406 13:03:26.675600 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:03:26 crc kubenswrapper[4790]: E0406 13:03:26.676065 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:03:26 crc kubenswrapper[4790]: I0406 13:03:26.722385 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wjjg5" Apr 06 13:03:26 crc kubenswrapper[4790]: I0406 13:03:26.762625 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3907de-aa66-4764-ba08-54e9c53c052d-utilities\") pod \"9e3907de-aa66-4764-ba08-54e9c53c052d\" (UID: \"9e3907de-aa66-4764-ba08-54e9c53c052d\") " Apr 06 13:03:26 crc kubenswrapper[4790]: I0406 13:03:26.762675 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3907de-aa66-4764-ba08-54e9c53c052d-catalog-content\") pod \"9e3907de-aa66-4764-ba08-54e9c53c052d\" (UID: \"9e3907de-aa66-4764-ba08-54e9c53c052d\") " Apr 06 13:03:26 crc kubenswrapper[4790]: I0406 13:03:26.762715 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs6l2\" (UniqueName: \"kubernetes.io/projected/9e3907de-aa66-4764-ba08-54e9c53c052d-kube-api-access-zs6l2\") pod \"9e3907de-aa66-4764-ba08-54e9c53c052d\" (UID: \"9e3907de-aa66-4764-ba08-54e9c53c052d\") " Apr 06 13:03:26 crc kubenswrapper[4790]: I0406 13:03:26.764739 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e3907de-aa66-4764-ba08-54e9c53c052d-utilities" (OuterVolumeSpecName: "utilities") pod "9e3907de-aa66-4764-ba08-54e9c53c052d" (UID: "9e3907de-aa66-4764-ba08-54e9c53c052d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:03:26 crc kubenswrapper[4790]: I0406 13:03:26.796394 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e3907de-aa66-4764-ba08-54e9c53c052d-kube-api-access-zs6l2" (OuterVolumeSpecName: "kube-api-access-zs6l2") pod "9e3907de-aa66-4764-ba08-54e9c53c052d" (UID: "9e3907de-aa66-4764-ba08-54e9c53c052d"). InnerVolumeSpecName "kube-api-access-zs6l2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:03:26 crc kubenswrapper[4790]: I0406 13:03:26.865260 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3907de-aa66-4764-ba08-54e9c53c052d-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 13:03:26 crc kubenswrapper[4790]: I0406 13:03:26.865287 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs6l2\" (UniqueName: \"kubernetes.io/projected/9e3907de-aa66-4764-ba08-54e9c53c052d-kube-api-access-zs6l2\") on node \"crc\" DevicePath \"\"" Apr 06 13:03:26 crc kubenswrapper[4790]: I0406 13:03:26.941563 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e3907de-aa66-4764-ba08-54e9c53c052d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e3907de-aa66-4764-ba08-54e9c53c052d" (UID: "9e3907de-aa66-4764-ba08-54e9c53c052d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:03:26 crc kubenswrapper[4790]: I0406 13:03:26.967513 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3907de-aa66-4764-ba08-54e9c53c052d-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.139586 4790 generic.go:334] "Generic (PLEG): container finished" podID="9e3907de-aa66-4764-ba08-54e9c53c052d" containerID="740551f1eea10d7842b33c11011ccb6ee12aff1d017183c041f7394397d55d98" exitCode=0 Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.139632 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjjg5" event={"ID":"9e3907de-aa66-4764-ba08-54e9c53c052d","Type":"ContainerDied","Data":"740551f1eea10d7842b33c11011ccb6ee12aff1d017183c041f7394397d55d98"} Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.139643 4790 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wjjg5"
Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.139659 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjjg5" event={"ID":"9e3907de-aa66-4764-ba08-54e9c53c052d","Type":"ContainerDied","Data":"b56e3b68a302d68124a2b4f448aba1e3fa6e85570cc43291eb396d46f2072ac7"}
Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.139677 4790 scope.go:117] "RemoveContainer" containerID="740551f1eea10d7842b33c11011ccb6ee12aff1d017183c041f7394397d55d98"
Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.166107 4790 scope.go:117] "RemoveContainer" containerID="a4b1e0dcbcc51cf4dc0715f0a17d07da64e99b871d66b2a449643c79bdf0a84e"
Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.182046 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wjjg5"]
Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.193983 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wjjg5"]
Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.210557 4790 scope.go:117] "RemoveContainer" containerID="f699b58c76052ad9279ef1a78dae83b2b5eb418187d8c2cb3e65a666d61a978f"
Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.249239 4790 scope.go:117] "RemoveContainer" containerID="740551f1eea10d7842b33c11011ccb6ee12aff1d017183c041f7394397d55d98"
Apr 06 13:03:27 crc kubenswrapper[4790]: E0406 13:03:27.249667 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"740551f1eea10d7842b33c11011ccb6ee12aff1d017183c041f7394397d55d98\": container with ID starting with 740551f1eea10d7842b33c11011ccb6ee12aff1d017183c041f7394397d55d98 not found: ID does not exist" containerID="740551f1eea10d7842b33c11011ccb6ee12aff1d017183c041f7394397d55d98"
Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.249691 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"740551f1eea10d7842b33c11011ccb6ee12aff1d017183c041f7394397d55d98"} err="failed to get container status \"740551f1eea10d7842b33c11011ccb6ee12aff1d017183c041f7394397d55d98\": rpc error: code = NotFound desc = could not find container \"740551f1eea10d7842b33c11011ccb6ee12aff1d017183c041f7394397d55d98\": container with ID starting with 740551f1eea10d7842b33c11011ccb6ee12aff1d017183c041f7394397d55d98 not found: ID does not exist"
Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.249711 4790 scope.go:117] "RemoveContainer" containerID="a4b1e0dcbcc51cf4dc0715f0a17d07da64e99b871d66b2a449643c79bdf0a84e"
Apr 06 13:03:27 crc kubenswrapper[4790]: E0406 13:03:27.250122 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4b1e0dcbcc51cf4dc0715f0a17d07da64e99b871d66b2a449643c79bdf0a84e\": container with ID starting with a4b1e0dcbcc51cf4dc0715f0a17d07da64e99b871d66b2a449643c79bdf0a84e not found: ID does not exist" containerID="a4b1e0dcbcc51cf4dc0715f0a17d07da64e99b871d66b2a449643c79bdf0a84e"
Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.250143 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4b1e0dcbcc51cf4dc0715f0a17d07da64e99b871d66b2a449643c79bdf0a84e"} err="failed to get container status \"a4b1e0dcbcc51cf4dc0715f0a17d07da64e99b871d66b2a449643c79bdf0a84e\": rpc error: code = NotFound desc = could not find container \"a4b1e0dcbcc51cf4dc0715f0a17d07da64e99b871d66b2a449643c79bdf0a84e\": container with ID starting with a4b1e0dcbcc51cf4dc0715f0a17d07da64e99b871d66b2a449643c79bdf0a84e not found: ID does not exist"
Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.250156 4790 scope.go:117] "RemoveContainer" containerID="f699b58c76052ad9279ef1a78dae83b2b5eb418187d8c2cb3e65a666d61a978f"
Apr 06 13:03:27 crc kubenswrapper[4790]: E0406 13:03:27.250600 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f699b58c76052ad9279ef1a78dae83b2b5eb418187d8c2cb3e65a666d61a978f\": container with ID starting with f699b58c76052ad9279ef1a78dae83b2b5eb418187d8c2cb3e65a666d61a978f not found: ID does not exist" containerID="f699b58c76052ad9279ef1a78dae83b2b5eb418187d8c2cb3e65a666d61a978f"
Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.250650 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f699b58c76052ad9279ef1a78dae83b2b5eb418187d8c2cb3e65a666d61a978f"} err="failed to get container status \"f699b58c76052ad9279ef1a78dae83b2b5eb418187d8c2cb3e65a666d61a978f\": rpc error: code = NotFound desc = could not find container \"f699b58c76052ad9279ef1a78dae83b2b5eb418187d8c2cb3e65a666d61a978f\": container with ID starting with f699b58c76052ad9279ef1a78dae83b2b5eb418187d8c2cb3e65a666d61a978f not found: ID does not exist"
Apr 06 13:03:27 crc kubenswrapper[4790]: I0406 13:03:27.686163 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e3907de-aa66-4764-ba08-54e9c53c052d" path="/var/lib/kubelet/pods/9e3907de-aa66-4764-ba08-54e9c53c052d/volumes"
Apr 06 13:03:29 crc kubenswrapper[4790]: I0406 13:03:29.769283 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xzgjh"
Apr 06 13:03:29 crc kubenswrapper[4790]: I0406 13:03:29.769784 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xzgjh"
Apr 06 13:03:29 crc kubenswrapper[4790]: I0406 13:03:29.818766 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xzgjh"
Apr 06 13:03:30 crc kubenswrapper[4790]: I0406 13:03:30.256439 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xzgjh"
Apr 06 13:03:31 crc kubenswrapper[4790]: I0406 13:03:31.225354 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzgjh"]
Apr 06 13:03:32 crc kubenswrapper[4790]: I0406 13:03:32.202886 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xzgjh" podUID="27f8290d-1d23-4022-a48b-49972f3a1dcd" containerName="registry-server" containerID="cri-o://f1e3e784cfbd631fcf196ac62f58eb1fece2cb11f9c7210c585f0f6376e1d16c" gracePeriod=2
Apr 06 13:03:32 crc kubenswrapper[4790]: I0406 13:03:32.724266 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzgjh"
Apr 06 13:03:32 crc kubenswrapper[4790]: I0406 13:03:32.786206 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27f8290d-1d23-4022-a48b-49972f3a1dcd-catalog-content\") pod \"27f8290d-1d23-4022-a48b-49972f3a1dcd\" (UID: \"27f8290d-1d23-4022-a48b-49972f3a1dcd\") "
Apr 06 13:03:32 crc kubenswrapper[4790]: I0406 13:03:32.786545 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c28l5\" (UniqueName: \"kubernetes.io/projected/27f8290d-1d23-4022-a48b-49972f3a1dcd-kube-api-access-c28l5\") pod \"27f8290d-1d23-4022-a48b-49972f3a1dcd\" (UID: \"27f8290d-1d23-4022-a48b-49972f3a1dcd\") "
Apr 06 13:03:32 crc kubenswrapper[4790]: I0406 13:03:32.786586 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27f8290d-1d23-4022-a48b-49972f3a1dcd-utilities\") pod \"27f8290d-1d23-4022-a48b-49972f3a1dcd\" (UID: \"27f8290d-1d23-4022-a48b-49972f3a1dcd\") "
Apr 06 13:03:32 crc kubenswrapper[4790]: I0406 13:03:32.787351 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27f8290d-1d23-4022-a48b-49972f3a1dcd-utilities" (OuterVolumeSpecName: "utilities") pod "27f8290d-1d23-4022-a48b-49972f3a1dcd" (UID: "27f8290d-1d23-4022-a48b-49972f3a1dcd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:03:32 crc kubenswrapper[4790]: I0406 13:03:32.793433 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f8290d-1d23-4022-a48b-49972f3a1dcd-kube-api-access-c28l5" (OuterVolumeSpecName: "kube-api-access-c28l5") pod "27f8290d-1d23-4022-a48b-49972f3a1dcd" (UID: "27f8290d-1d23-4022-a48b-49972f3a1dcd"). InnerVolumeSpecName "kube-api-access-c28l5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:03:32 crc kubenswrapper[4790]: I0406 13:03:32.822688 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27f8290d-1d23-4022-a48b-49972f3a1dcd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27f8290d-1d23-4022-a48b-49972f3a1dcd" (UID: "27f8290d-1d23-4022-a48b-49972f3a1dcd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:03:32 crc kubenswrapper[4790]: I0406 13:03:32.889135 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27f8290d-1d23-4022-a48b-49972f3a1dcd-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 06 13:03:32 crc kubenswrapper[4790]: I0406 13:03:32.889176 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c28l5\" (UniqueName: \"kubernetes.io/projected/27f8290d-1d23-4022-a48b-49972f3a1dcd-kube-api-access-c28l5\") on node \"crc\" DevicePath \"\""
Apr 06 13:03:32 crc kubenswrapper[4790]: I0406 13:03:32.889185 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27f8290d-1d23-4022-a48b-49972f3a1dcd-utilities\") on node \"crc\" DevicePath \"\""
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.216253 4790 generic.go:334] "Generic (PLEG): container finished" podID="27f8290d-1d23-4022-a48b-49972f3a1dcd" containerID="f1e3e784cfbd631fcf196ac62f58eb1fece2cb11f9c7210c585f0f6376e1d16c" exitCode=0
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.216332 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzgjh" event={"ID":"27f8290d-1d23-4022-a48b-49972f3a1dcd","Type":"ContainerDied","Data":"f1e3e784cfbd631fcf196ac62f58eb1fece2cb11f9c7210c585f0f6376e1d16c"}
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.216682 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzgjh" event={"ID":"27f8290d-1d23-4022-a48b-49972f3a1dcd","Type":"ContainerDied","Data":"3d2dd218cafc7bf717a23eb8a5ffb4267d55706a5c972dbcfa0fa53becc9d1dd"}
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.216362 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzgjh"
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.216723 4790 scope.go:117] "RemoveContainer" containerID="f1e3e784cfbd631fcf196ac62f58eb1fece2cb11f9c7210c585f0f6376e1d16c"
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.255592 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzgjh"]
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.257589 4790 scope.go:117] "RemoveContainer" containerID="01530abe7de8c26dac571fd54fdbcdf6a34bc7fe2fadea5bd2e0ef46e268e336"
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.273119 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzgjh"]
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.293262 4790 scope.go:117] "RemoveContainer" containerID="2e82cc89590408d2e89c09b66a8a269eef58d161002356716109c0cdaa96a46e"
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.338083 4790 scope.go:117] "RemoveContainer" containerID="f1e3e784cfbd631fcf196ac62f58eb1fece2cb11f9c7210c585f0f6376e1d16c"
Apr 06 13:03:33 crc kubenswrapper[4790]: E0406 13:03:33.338590 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1e3e784cfbd631fcf196ac62f58eb1fece2cb11f9c7210c585f0f6376e1d16c\": container with ID starting with f1e3e784cfbd631fcf196ac62f58eb1fece2cb11f9c7210c585f0f6376e1d16c not found: ID does not exist" containerID="f1e3e784cfbd631fcf196ac62f58eb1fece2cb11f9c7210c585f0f6376e1d16c"
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.338663 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1e3e784cfbd631fcf196ac62f58eb1fece2cb11f9c7210c585f0f6376e1d16c"} err="failed to get container status \"f1e3e784cfbd631fcf196ac62f58eb1fece2cb11f9c7210c585f0f6376e1d16c\": rpc error: code = NotFound desc = could not find container \"f1e3e784cfbd631fcf196ac62f58eb1fece2cb11f9c7210c585f0f6376e1d16c\": container with ID starting with f1e3e784cfbd631fcf196ac62f58eb1fece2cb11f9c7210c585f0f6376e1d16c not found: ID does not exist"
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.338691 4790 scope.go:117] "RemoveContainer" containerID="01530abe7de8c26dac571fd54fdbcdf6a34bc7fe2fadea5bd2e0ef46e268e336"
Apr 06 13:03:33 crc kubenswrapper[4790]: E0406 13:03:33.339125 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01530abe7de8c26dac571fd54fdbcdf6a34bc7fe2fadea5bd2e0ef46e268e336\": container with ID starting with 01530abe7de8c26dac571fd54fdbcdf6a34bc7fe2fadea5bd2e0ef46e268e336 not found: ID does not exist" containerID="01530abe7de8c26dac571fd54fdbcdf6a34bc7fe2fadea5bd2e0ef46e268e336"
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.339173 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01530abe7de8c26dac571fd54fdbcdf6a34bc7fe2fadea5bd2e0ef46e268e336"} err="failed to get container status \"01530abe7de8c26dac571fd54fdbcdf6a34bc7fe2fadea5bd2e0ef46e268e336\": rpc error: code = NotFound desc = could not find container \"01530abe7de8c26dac571fd54fdbcdf6a34bc7fe2fadea5bd2e0ef46e268e336\": container with ID starting with 01530abe7de8c26dac571fd54fdbcdf6a34bc7fe2fadea5bd2e0ef46e268e336 not found: ID does not exist"
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.339208 4790 scope.go:117] "RemoveContainer" containerID="2e82cc89590408d2e89c09b66a8a269eef58d161002356716109c0cdaa96a46e"
Apr 06 13:03:33 crc kubenswrapper[4790]: E0406 13:03:33.339537 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e82cc89590408d2e89c09b66a8a269eef58d161002356716109c0cdaa96a46e\": container with ID starting with 2e82cc89590408d2e89c09b66a8a269eef58d161002356716109c0cdaa96a46e not found: ID does not exist" containerID="2e82cc89590408d2e89c09b66a8a269eef58d161002356716109c0cdaa96a46e"
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.339571 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e82cc89590408d2e89c09b66a8a269eef58d161002356716109c0cdaa96a46e"} err="failed to get container status \"2e82cc89590408d2e89c09b66a8a269eef58d161002356716109c0cdaa96a46e\": rpc error: code = NotFound desc = could not find container \"2e82cc89590408d2e89c09b66a8a269eef58d161002356716109c0cdaa96a46e\": container with ID starting with 2e82cc89590408d2e89c09b66a8a269eef58d161002356716109c0cdaa96a46e not found: ID does not exist"
Apr 06 13:03:33 crc kubenswrapper[4790]: I0406 13:03:33.686872 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f8290d-1d23-4022-a48b-49972f3a1dcd" path="/var/lib/kubelet/pods/27f8290d-1d23-4022-a48b-49972f3a1dcd/volumes"
Apr 06 13:03:41 crc kubenswrapper[4790]: I0406 13:03:41.684767 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba"
Apr 06 13:03:41 crc kubenswrapper[4790]: E0406 13:03:41.685790 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:03:52 crc kubenswrapper[4790]: I0406 13:03:52.675818 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba"
Apr 06 13:03:52 crc kubenswrapper[4790]: E0406 13:03:52.676797 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.147627 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591344-q62r6"]
Apr 06 13:04:00 crc kubenswrapper[4790]: E0406 13:04:00.148614 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f8290d-1d23-4022-a48b-49972f3a1dcd" containerName="extract-utilities"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.148634 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f8290d-1d23-4022-a48b-49972f3a1dcd" containerName="extract-utilities"
Apr 06 13:04:00 crc kubenswrapper[4790]: E0406 13:04:00.148652 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f8290d-1d23-4022-a48b-49972f3a1dcd" containerName="extract-content"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.148660 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f8290d-1d23-4022-a48b-49972f3a1dcd" containerName="extract-content"
Apr 06 13:04:00 crc kubenswrapper[4790]: E0406 13:04:00.148684 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3907de-aa66-4764-ba08-54e9c53c052d" containerName="extract-utilities"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.148748 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3907de-aa66-4764-ba08-54e9c53c052d" containerName="extract-utilities"
Apr 06 13:04:00 crc kubenswrapper[4790]: E0406 13:04:00.148772 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3907de-aa66-4764-ba08-54e9c53c052d" containerName="extract-content"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.148780 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3907de-aa66-4764-ba08-54e9c53c052d" containerName="extract-content"
Apr 06 13:04:00 crc kubenswrapper[4790]: E0406 13:04:00.148806 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f8290d-1d23-4022-a48b-49972f3a1dcd" containerName="registry-server"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.148814 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f8290d-1d23-4022-a48b-49972f3a1dcd" containerName="registry-server"
Apr 06 13:04:00 crc kubenswrapper[4790]: E0406 13:04:00.149032 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3907de-aa66-4764-ba08-54e9c53c052d" containerName="registry-server"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.149045 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3907de-aa66-4764-ba08-54e9c53c052d" containerName="registry-server"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.149334 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e3907de-aa66-4764-ba08-54e9c53c052d" containerName="registry-server"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.149357 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f8290d-1d23-4022-a48b-49972f3a1dcd" containerName="registry-server"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.150244 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591344-q62r6"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.152264 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.152304 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.158213 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591344-q62r6"]
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.159511 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.262902 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxwrq\" (UniqueName: \"kubernetes.io/projected/6ee59c3f-ca35-4883-8273-afab95609f61-kube-api-access-zxwrq\") pod \"auto-csr-approver-29591344-q62r6\" (UID: \"6ee59c3f-ca35-4883-8273-afab95609f61\") " pod="openshift-infra/auto-csr-approver-29591344-q62r6"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.370657 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxwrq\" (UniqueName: \"kubernetes.io/projected/6ee59c3f-ca35-4883-8273-afab95609f61-kube-api-access-zxwrq\") pod \"auto-csr-approver-29591344-q62r6\" (UID: \"6ee59c3f-ca35-4883-8273-afab95609f61\") " pod="openshift-infra/auto-csr-approver-29591344-q62r6"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.392765 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxwrq\" (UniqueName: \"kubernetes.io/projected/6ee59c3f-ca35-4883-8273-afab95609f61-kube-api-access-zxwrq\") pod \"auto-csr-approver-29591344-q62r6\" (UID: \"6ee59c3f-ca35-4883-8273-afab95609f61\") " pod="openshift-infra/auto-csr-approver-29591344-q62r6"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.478407 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591344-q62r6"
Apr 06 13:04:00 crc kubenswrapper[4790]: I0406 13:04:00.989316 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591344-q62r6"]
Apr 06 13:04:01 crc kubenswrapper[4790]: I0406 13:04:01.529212 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591344-q62r6" event={"ID":"6ee59c3f-ca35-4883-8273-afab95609f61","Type":"ContainerStarted","Data":"361fabba23dce8780413d0f403fa084e05728ba7d97c4c900e20f97657737e1c"}
Apr 06 13:04:02 crc kubenswrapper[4790]: E0406 13:04:02.388535 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ee59c3f_ca35_4883_8273_afab95609f61.slice/crio-conmon-a1223c199f923ee00122d622aeeb4c235b17bd58ae2f520b2a521a7e7d0627aa.scope\": RecentStats: unable to find data in memory cache]"
Apr 06 13:04:02 crc kubenswrapper[4790]: I0406 13:04:02.539663 4790 generic.go:334] "Generic (PLEG): container finished" podID="6ee59c3f-ca35-4883-8273-afab95609f61" containerID="a1223c199f923ee00122d622aeeb4c235b17bd58ae2f520b2a521a7e7d0627aa" exitCode=0
Apr 06 13:04:02 crc kubenswrapper[4790]: I0406 13:04:02.539739 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591344-q62r6" event={"ID":"6ee59c3f-ca35-4883-8273-afab95609f61","Type":"ContainerDied","Data":"a1223c199f923ee00122d622aeeb4c235b17bd58ae2f520b2a521a7e7d0627aa"}
Apr 06 13:04:03 crc kubenswrapper[4790]: I0406 13:04:03.911687 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591344-q62r6"
Apr 06 13:04:04 crc kubenswrapper[4790]: I0406 13:04:04.052299 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxwrq\" (UniqueName: \"kubernetes.io/projected/6ee59c3f-ca35-4883-8273-afab95609f61-kube-api-access-zxwrq\") pod \"6ee59c3f-ca35-4883-8273-afab95609f61\" (UID: \"6ee59c3f-ca35-4883-8273-afab95609f61\") "
Apr 06 13:04:04 crc kubenswrapper[4790]: I0406 13:04:04.057457 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee59c3f-ca35-4883-8273-afab95609f61-kube-api-access-zxwrq" (OuterVolumeSpecName: "kube-api-access-zxwrq") pod "6ee59c3f-ca35-4883-8273-afab95609f61" (UID: "6ee59c3f-ca35-4883-8273-afab95609f61"). InnerVolumeSpecName "kube-api-access-zxwrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:04:04 crc kubenswrapper[4790]: I0406 13:04:04.155060 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxwrq\" (UniqueName: \"kubernetes.io/projected/6ee59c3f-ca35-4883-8273-afab95609f61-kube-api-access-zxwrq\") on node \"crc\" DevicePath \"\""
Apr 06 13:04:04 crc kubenswrapper[4790]: I0406 13:04:04.559144 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591344-q62r6" event={"ID":"6ee59c3f-ca35-4883-8273-afab95609f61","Type":"ContainerDied","Data":"361fabba23dce8780413d0f403fa084e05728ba7d97c4c900e20f97657737e1c"}
Apr 06 13:04:04 crc kubenswrapper[4790]: I0406 13:04:04.559427 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="361fabba23dce8780413d0f403fa084e05728ba7d97c4c900e20f97657737e1c"
Apr 06 13:04:04 crc kubenswrapper[4790]: I0406 13:04:04.559195 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591344-q62r6"
Apr 06 13:04:04 crc kubenswrapper[4790]: I0406 13:04:04.676012 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba"
Apr 06 13:04:04 crc kubenswrapper[4790]: E0406 13:04:04.676224 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:04:04 crc kubenswrapper[4790]: I0406 13:04:04.976599 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591338-bn5m5"]
Apr 06 13:04:04 crc kubenswrapper[4790]: I0406 13:04:04.988817 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591338-bn5m5"]
Apr 06 13:04:05 crc kubenswrapper[4790]: I0406 13:04:05.690328 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb2fac17-ee5c-432b-a3b4-91294ca1e304" path="/var/lib/kubelet/pods/cb2fac17-ee5c-432b-a3b4-91294ca1e304/volumes"
Apr 06 13:04:16 crc kubenswrapper[4790]: I0406 13:04:16.675864 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba"
Apr 06 13:04:16 crc kubenswrapper[4790]: E0406 13:04:16.676623 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:04:17 crc kubenswrapper[4790]: I0406 13:04:17.528511 4790 scope.go:117] "RemoveContainer" containerID="f6937211dc08b5049fd88a5a7fddb00b5252181ff99d549bd58145627d1df259"
Apr 06 13:04:28 crc kubenswrapper[4790]: I0406 13:04:28.676425 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba"
Apr 06 13:04:28 crc kubenswrapper[4790]: E0406 13:04:28.677688 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:04:41 crc kubenswrapper[4790]: I0406 13:04:41.683675 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba"
Apr 06 13:04:41 crc kubenswrapper[4790]: E0406 13:04:41.684518 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:04:52 crc kubenswrapper[4790]: I0406 13:04:52.675860 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba"
Apr 06 13:04:52 crc kubenswrapper[4790]: E0406 13:04:52.676764 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:05:05 crc kubenswrapper[4790]: I0406 13:05:05.676300 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba"
Apr 06 13:05:05 crc kubenswrapper[4790]: E0406 13:05:05.676968 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.441847 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d77gt"]
Apr 06 13:05:12 crc kubenswrapper[4790]: E0406 13:05:12.443334 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee59c3f-ca35-4883-8273-afab95609f61" containerName="oc"
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.443358 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee59c3f-ca35-4883-8273-afab95609f61" containerName="oc"
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.443833 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee59c3f-ca35-4883-8273-afab95609f61" containerName="oc"
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.448385 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.461092 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d77gt"]
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.533331 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-catalog-content\") pod \"community-operators-d77gt\" (UID: \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\") " pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.533684 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sb24\" (UniqueName: \"kubernetes.io/projected/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-kube-api-access-7sb24\") pod \"community-operators-d77gt\" (UID: \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\") " pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.533742 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-utilities\") pod \"community-operators-d77gt\" (UID: \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\") " pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.635268 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sb24\" (UniqueName: \"kubernetes.io/projected/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-kube-api-access-7sb24\") pod \"community-operators-d77gt\" (UID: \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\") " pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.635361 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-utilities\") pod \"community-operators-d77gt\" (UID: \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\") " pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.635489 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-catalog-content\") pod \"community-operators-d77gt\" (UID: \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\") " pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.636044 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-catalog-content\") pod \"community-operators-d77gt\" (UID: \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\") " pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.636613 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-utilities\") pod \"community-operators-d77gt\" (UID: \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\") " pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.668067 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sb24\" (UniqueName: \"kubernetes.io/projected/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-kube-api-access-7sb24\") pod \"community-operators-d77gt\" (UID: \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\") " pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:12 crc kubenswrapper[4790]: I0406 13:05:12.791349 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:13 crc kubenswrapper[4790]: I0406 13:05:13.385668 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d77gt"]
Apr 06 13:05:14 crc kubenswrapper[4790]: I0406 13:05:14.308100 4790 generic.go:334] "Generic (PLEG): container finished" podID="a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" containerID="50019a4c96dc4ba613686e34b79063ccdbeda9eb0ac157280500e7765a05521a" exitCode=0
Apr 06 13:05:14 crc kubenswrapper[4790]: I0406 13:05:14.308145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d77gt" event={"ID":"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe","Type":"ContainerDied","Data":"50019a4c96dc4ba613686e34b79063ccdbeda9eb0ac157280500e7765a05521a"}
Apr 06 13:05:14 crc kubenswrapper[4790]: I0406 13:05:14.308781 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d77gt" event={"ID":"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe","Type":"ContainerStarted","Data":"d5587e13e84d3131b895fc7fe168a1ff666815b4024b1a6f542d6a0ec988fb36"}
Apr 06 13:05:16 crc kubenswrapper[4790]: I0406 13:05:16.330281 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d77gt" event={"ID":"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe","Type":"ContainerStarted","Data":"b96c8f7ba264853ad4658d67cc9be04c39e99f6c86388d0e4b3579467629ee90"}
Apr 06 13:05:17 crc kubenswrapper[4790]: I0406 13:05:17.340849 4790 generic.go:334] "Generic (PLEG): container finished" podID="a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" containerID="b96c8f7ba264853ad4658d67cc9be04c39e99f6c86388d0e4b3579467629ee90" exitCode=0
Apr 06 13:05:17 crc kubenswrapper[4790]: I0406 13:05:17.340903 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d77gt" event={"ID":"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe","Type":"ContainerDied","Data":"b96c8f7ba264853ad4658d67cc9be04c39e99f6c86388d0e4b3579467629ee90"}
Apr 06 13:05:18 crc kubenswrapper[4790]: I0406 13:05:18.363195 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d77gt" event={"ID":"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe","Type":"ContainerStarted","Data":"c9e850778bc66061bb6ee072f848a8301b813d1bf8d18260bfa69e1768ffd300"}
Apr 06 13:05:18 crc kubenswrapper[4790]: I0406 13:05:18.389603 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d77gt" podStartSLOduration=2.908322798 podStartE2EDuration="6.389575538s" podCreationTimestamp="2026-04-06 13:05:12 +0000 UTC" firstStartedPulling="2026-04-06 13:05:14.310884853 +0000 UTC m=+4093.298627719" lastFinishedPulling="2026-04-06 13:05:17.792137583 +0000 UTC m=+4096.779880459" observedRunningTime="2026-04-06 13:05:18.385605152 +0000 UTC m=+4097.373348018" watchObservedRunningTime="2026-04-06 13:05:18.389575538 +0000 UTC m=+4097.377318414"
Apr 06 13:05:18 crc kubenswrapper[4790]: I0406 13:05:18.674910 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba"
Apr 06 13:05:18 crc kubenswrapper[4790]: E0406 13:05:18.675348 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:05:22 crc kubenswrapper[4790]: I0406 13:05:22.791817 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:22 crc kubenswrapper[4790]: I0406 13:05:22.792398 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:22 crc kubenswrapper[4790]: I0406 13:05:22.854902 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:23 crc kubenswrapper[4790]: I0406 13:05:23.453989 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d77gt"
Apr 06 13:05:23 crc kubenswrapper[4790]: I0406 13:05:23.503197 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d77gt"]
Apr 06 13:05:25 crc kubenswrapper[4790]: I0406 13:05:25.453222 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d77gt" podUID="a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" containerName="registry-server" containerID="cri-o://c9e850778bc66061bb6ee072f848a8301b813d1bf8d18260bfa69e1768ffd300" gracePeriod=2
Apr 06 13:05:25 crc kubenswrapper[4790]: I0406 13:05:25.961904 4790 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-d77gt" Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.068901 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-utilities\") pod \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\" (UID: \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\") " Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.068967 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-catalog-content\") pod \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\" (UID: \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\") " Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.068997 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sb24\" (UniqueName: \"kubernetes.io/projected/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-kube-api-access-7sb24\") pod \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\" (UID: \"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe\") " Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.070067 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-utilities" (OuterVolumeSpecName: "utilities") pod "a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" (UID: "a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.076156 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-kube-api-access-7sb24" (OuterVolumeSpecName: "kube-api-access-7sb24") pod "a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" (UID: "a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe"). InnerVolumeSpecName "kube-api-access-7sb24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.130547 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" (UID: "a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.172355 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.172394 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.172411 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sb24\" (UniqueName: \"kubernetes.io/projected/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe-kube-api-access-7sb24\") on node \"crc\" DevicePath \"\"" Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.467129 4790 generic.go:334] "Generic (PLEG): container finished" podID="a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" containerID="c9e850778bc66061bb6ee072f848a8301b813d1bf8d18260bfa69e1768ffd300" exitCode=0 Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.467198 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d77gt" Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.467221 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d77gt" event={"ID":"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe","Type":"ContainerDied","Data":"c9e850778bc66061bb6ee072f848a8301b813d1bf8d18260bfa69e1768ffd300"} Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.467324 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d77gt" event={"ID":"a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe","Type":"ContainerDied","Data":"d5587e13e84d3131b895fc7fe168a1ff666815b4024b1a6f542d6a0ec988fb36"} Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.467366 4790 scope.go:117] "RemoveContainer" containerID="c9e850778bc66061bb6ee072f848a8301b813d1bf8d18260bfa69e1768ffd300" Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.495362 4790 scope.go:117] "RemoveContainer" containerID="b96c8f7ba264853ad4658d67cc9be04c39e99f6c86388d0e4b3579467629ee90" Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.511870 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d77gt"] Apr 06 13:05:26 crc kubenswrapper[4790]: I0406 13:05:26.524602 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d77gt"] Apr 06 13:05:27 crc kubenswrapper[4790]: I0406 13:05:27.154273 4790 scope.go:117] "RemoveContainer" containerID="50019a4c96dc4ba613686e34b79063ccdbeda9eb0ac157280500e7765a05521a" Apr 06 13:05:27 crc kubenswrapper[4790]: I0406 13:05:27.460223 4790 scope.go:117] "RemoveContainer" containerID="c9e850778bc66061bb6ee072f848a8301b813d1bf8d18260bfa69e1768ffd300" Apr 06 13:05:27 crc kubenswrapper[4790]: E0406 13:05:27.461314 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c9e850778bc66061bb6ee072f848a8301b813d1bf8d18260bfa69e1768ffd300\": container with ID starting with c9e850778bc66061bb6ee072f848a8301b813d1bf8d18260bfa69e1768ffd300 not found: ID does not exist" containerID="c9e850778bc66061bb6ee072f848a8301b813d1bf8d18260bfa69e1768ffd300" Apr 06 13:05:27 crc kubenswrapper[4790]: I0406 13:05:27.461367 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e850778bc66061bb6ee072f848a8301b813d1bf8d18260bfa69e1768ffd300"} err="failed to get container status \"c9e850778bc66061bb6ee072f848a8301b813d1bf8d18260bfa69e1768ffd300\": rpc error: code = NotFound desc = could not find container \"c9e850778bc66061bb6ee072f848a8301b813d1bf8d18260bfa69e1768ffd300\": container with ID starting with c9e850778bc66061bb6ee072f848a8301b813d1bf8d18260bfa69e1768ffd300 not found: ID does not exist" Apr 06 13:05:27 crc kubenswrapper[4790]: I0406 13:05:27.461405 4790 scope.go:117] "RemoveContainer" containerID="b96c8f7ba264853ad4658d67cc9be04c39e99f6c86388d0e4b3579467629ee90" Apr 06 13:05:27 crc kubenswrapper[4790]: E0406 13:05:27.462098 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b96c8f7ba264853ad4658d67cc9be04c39e99f6c86388d0e4b3579467629ee90\": container with ID starting with b96c8f7ba264853ad4658d67cc9be04c39e99f6c86388d0e4b3579467629ee90 not found: ID does not exist" containerID="b96c8f7ba264853ad4658d67cc9be04c39e99f6c86388d0e4b3579467629ee90" Apr 06 13:05:27 crc kubenswrapper[4790]: I0406 13:05:27.462124 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b96c8f7ba264853ad4658d67cc9be04c39e99f6c86388d0e4b3579467629ee90"} err="failed to get container status \"b96c8f7ba264853ad4658d67cc9be04c39e99f6c86388d0e4b3579467629ee90\": rpc error: code = NotFound desc = could not find container \"b96c8f7ba264853ad4658d67cc9be04c39e99f6c86388d0e4b3579467629ee90\": container with ID 
starting with b96c8f7ba264853ad4658d67cc9be04c39e99f6c86388d0e4b3579467629ee90 not found: ID does not exist" Apr 06 13:05:27 crc kubenswrapper[4790]: I0406 13:05:27.462140 4790 scope.go:117] "RemoveContainer" containerID="50019a4c96dc4ba613686e34b79063ccdbeda9eb0ac157280500e7765a05521a" Apr 06 13:05:27 crc kubenswrapper[4790]: E0406 13:05:27.462735 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50019a4c96dc4ba613686e34b79063ccdbeda9eb0ac157280500e7765a05521a\": container with ID starting with 50019a4c96dc4ba613686e34b79063ccdbeda9eb0ac157280500e7765a05521a not found: ID does not exist" containerID="50019a4c96dc4ba613686e34b79063ccdbeda9eb0ac157280500e7765a05521a" Apr 06 13:05:27 crc kubenswrapper[4790]: I0406 13:05:27.462780 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50019a4c96dc4ba613686e34b79063ccdbeda9eb0ac157280500e7765a05521a"} err="failed to get container status \"50019a4c96dc4ba613686e34b79063ccdbeda9eb0ac157280500e7765a05521a\": rpc error: code = NotFound desc = could not find container \"50019a4c96dc4ba613686e34b79063ccdbeda9eb0ac157280500e7765a05521a\": container with ID starting with 50019a4c96dc4ba613686e34b79063ccdbeda9eb0ac157280500e7765a05521a not found: ID does not exist" Apr 06 13:05:27 crc kubenswrapper[4790]: I0406 13:05:27.686600 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" path="/var/lib/kubelet/pods/a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe/volumes" Apr 06 13:05:30 crc kubenswrapper[4790]: I0406 13:05:30.675569 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:05:30 crc kubenswrapper[4790]: E0406 13:05:30.676256 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:05:44 crc kubenswrapper[4790]: I0406 13:05:44.676186 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:05:44 crc kubenswrapper[4790]: E0406 13:05:44.677074 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:05:58 crc kubenswrapper[4790]: I0406 13:05:58.675929 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:05:58 crc kubenswrapper[4790]: E0406 13:05:58.676683 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.155067 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591346-7rk9h"] Apr 06 13:06:00 crc kubenswrapper[4790]: E0406 13:06:00.155922 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" containerName="extract-content" Apr 06 
13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.155941 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" containerName="extract-content" Apr 06 13:06:00 crc kubenswrapper[4790]: E0406 13:06:00.155991 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" containerName="extract-utilities" Apr 06 13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.156001 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" containerName="extract-utilities" Apr 06 13:06:00 crc kubenswrapper[4790]: E0406 13:06:00.156021 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" containerName="registry-server" Apr 06 13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.156032 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" containerName="registry-server" Apr 06 13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.156339 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a36bc9e6-c7df-4c4a-b7d3-a35d102b90fe" containerName="registry-server" Apr 06 13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.157210 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591346-7rk9h" Apr 06 13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.159853 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.160236 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.160248 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.165517 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591346-7rk9h"] Apr 06 13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.283332 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8s6l\" (UniqueName: \"kubernetes.io/projected/7d64309a-814a-4fbe-99c6-bf52984869ff-kube-api-access-z8s6l\") pod \"auto-csr-approver-29591346-7rk9h\" (UID: \"7d64309a-814a-4fbe-99c6-bf52984869ff\") " pod="openshift-infra/auto-csr-approver-29591346-7rk9h" Apr 06 13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.385464 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8s6l\" (UniqueName: \"kubernetes.io/projected/7d64309a-814a-4fbe-99c6-bf52984869ff-kube-api-access-z8s6l\") pod \"auto-csr-approver-29591346-7rk9h\" (UID: \"7d64309a-814a-4fbe-99c6-bf52984869ff\") " pod="openshift-infra/auto-csr-approver-29591346-7rk9h" Apr 06 13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.411721 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8s6l\" (UniqueName: \"kubernetes.io/projected/7d64309a-814a-4fbe-99c6-bf52984869ff-kube-api-access-z8s6l\") pod \"auto-csr-approver-29591346-7rk9h\" (UID: \"7d64309a-814a-4fbe-99c6-bf52984869ff\") " 
pod="openshift-infra/auto-csr-approver-29591346-7rk9h" Apr 06 13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.480719 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591346-7rk9h" Apr 06 13:06:00 crc kubenswrapper[4790]: I0406 13:06:00.969864 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591346-7rk9h"] Apr 06 13:06:01 crc kubenswrapper[4790]: I0406 13:06:01.837974 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591346-7rk9h" event={"ID":"7d64309a-814a-4fbe-99c6-bf52984869ff","Type":"ContainerStarted","Data":"325a8c7421c4f3cecfd07eb827b42fc09222297cb16580e78ef67ae3c51c0fbd"} Apr 06 13:06:02 crc kubenswrapper[4790]: I0406 13:06:02.848325 4790 generic.go:334] "Generic (PLEG): container finished" podID="7d64309a-814a-4fbe-99c6-bf52984869ff" containerID="b5a03476c1a88f43fb0f087dc7bb8f1fb438c17941cfd2149f5f58b070d518af" exitCode=0 Apr 06 13:06:02 crc kubenswrapper[4790]: I0406 13:06:02.848602 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591346-7rk9h" event={"ID":"7d64309a-814a-4fbe-99c6-bf52984869ff","Type":"ContainerDied","Data":"b5a03476c1a88f43fb0f087dc7bb8f1fb438c17941cfd2149f5f58b070d518af"} Apr 06 13:06:04 crc kubenswrapper[4790]: I0406 13:06:04.340302 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591346-7rk9h" Apr 06 13:06:04 crc kubenswrapper[4790]: I0406 13:06:04.443927 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8s6l\" (UniqueName: \"kubernetes.io/projected/7d64309a-814a-4fbe-99c6-bf52984869ff-kube-api-access-z8s6l\") pod \"7d64309a-814a-4fbe-99c6-bf52984869ff\" (UID: \"7d64309a-814a-4fbe-99c6-bf52984869ff\") " Apr 06 13:06:04 crc kubenswrapper[4790]: I0406 13:06:04.464779 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d64309a-814a-4fbe-99c6-bf52984869ff-kube-api-access-z8s6l" (OuterVolumeSpecName: "kube-api-access-z8s6l") pod "7d64309a-814a-4fbe-99c6-bf52984869ff" (UID: "7d64309a-814a-4fbe-99c6-bf52984869ff"). InnerVolumeSpecName "kube-api-access-z8s6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:06:04 crc kubenswrapper[4790]: I0406 13:06:04.548299 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8s6l\" (UniqueName: \"kubernetes.io/projected/7d64309a-814a-4fbe-99c6-bf52984869ff-kube-api-access-z8s6l\") on node \"crc\" DevicePath \"\"" Apr 06 13:06:04 crc kubenswrapper[4790]: I0406 13:06:04.872566 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591346-7rk9h" event={"ID":"7d64309a-814a-4fbe-99c6-bf52984869ff","Type":"ContainerDied","Data":"325a8c7421c4f3cecfd07eb827b42fc09222297cb16580e78ef67ae3c51c0fbd"} Apr 06 13:06:04 crc kubenswrapper[4790]: I0406 13:06:04.872623 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="325a8c7421c4f3cecfd07eb827b42fc09222297cb16580e78ef67ae3c51c0fbd" Apr 06 13:06:04 crc kubenswrapper[4790]: I0406 13:06:04.872724 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591346-7rk9h" Apr 06 13:06:05 crc kubenswrapper[4790]: I0406 13:06:05.421802 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591340-9zdcs"] Apr 06 13:06:05 crc kubenswrapper[4790]: I0406 13:06:05.433564 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591340-9zdcs"] Apr 06 13:06:05 crc kubenswrapper[4790]: I0406 13:06:05.685331 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="543fadab-d27e-43d4-accd-0d3259e3f783" path="/var/lib/kubelet/pods/543fadab-d27e-43d4-accd-0d3259e3f783/volumes" Apr 06 13:06:13 crc kubenswrapper[4790]: I0406 13:06:13.676277 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:06:13 crc kubenswrapper[4790]: E0406 13:06:13.677045 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:06:26 crc kubenswrapper[4790]: I0406 13:06:26.676411 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:06:26 crc kubenswrapper[4790]: E0406 13:06:26.677851 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" 
podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:06:37 crc kubenswrapper[4790]: I0406 13:06:37.677619 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:06:37 crc kubenswrapper[4790]: E0406 13:06:37.679197 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:06:49 crc kubenswrapper[4790]: I0406 13:06:49.675426 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba" Apr 06 13:06:50 crc kubenswrapper[4790]: I0406 13:06:50.365433 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"3cf8615d931e80e8cb6c3ab9e8c979cb735e71e0536e5ac0d22323ffb18dac42"} Apr 06 13:07:17 crc kubenswrapper[4790]: I0406 13:07:17.731554 4790 scope.go:117] "RemoveContainer" containerID="492a93173d79e42647804670e4f4a3015f760cf96a3e09b9765e438b993f5ece" Apr 06 13:08:00 crc kubenswrapper[4790]: I0406 13:08:00.147981 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591348-c8v2l"] Apr 06 13:08:00 crc kubenswrapper[4790]: E0406 13:08:00.149909 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d64309a-814a-4fbe-99c6-bf52984869ff" containerName="oc" Apr 06 13:08:00 crc kubenswrapper[4790]: I0406 13:08:00.150006 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d64309a-814a-4fbe-99c6-bf52984869ff" containerName="oc" Apr 06 13:08:00 crc kubenswrapper[4790]: 
I0406 13:08:00.150288 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d64309a-814a-4fbe-99c6-bf52984869ff" containerName="oc" Apr 06 13:08:00 crc kubenswrapper[4790]: I0406 13:08:00.150987 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591348-c8v2l" Apr 06 13:08:00 crc kubenswrapper[4790]: I0406 13:08:00.152754 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:08:00 crc kubenswrapper[4790]: I0406 13:08:00.154983 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:08:00 crc kubenswrapper[4790]: I0406 13:08:00.156291 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:08:00 crc kubenswrapper[4790]: I0406 13:08:00.165263 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591348-c8v2l"] Apr 06 13:08:00 crc kubenswrapper[4790]: I0406 13:08:00.282852 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7rjw\" (UniqueName: \"kubernetes.io/projected/9aa241a8-092d-4bcb-8378-4a08991ffc2b-kube-api-access-x7rjw\") pod \"auto-csr-approver-29591348-c8v2l\" (UID: \"9aa241a8-092d-4bcb-8378-4a08991ffc2b\") " pod="openshift-infra/auto-csr-approver-29591348-c8v2l" Apr 06 13:08:00 crc kubenswrapper[4790]: I0406 13:08:00.385723 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7rjw\" (UniqueName: \"kubernetes.io/projected/9aa241a8-092d-4bcb-8378-4a08991ffc2b-kube-api-access-x7rjw\") pod \"auto-csr-approver-29591348-c8v2l\" (UID: \"9aa241a8-092d-4bcb-8378-4a08991ffc2b\") " pod="openshift-infra/auto-csr-approver-29591348-c8v2l" Apr 06 13:08:00 crc kubenswrapper[4790]: I0406 13:08:00.440881 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x7rjw\" (UniqueName: \"kubernetes.io/projected/9aa241a8-092d-4bcb-8378-4a08991ffc2b-kube-api-access-x7rjw\") pod \"auto-csr-approver-29591348-c8v2l\" (UID: \"9aa241a8-092d-4bcb-8378-4a08991ffc2b\") " pod="openshift-infra/auto-csr-approver-29591348-c8v2l" Apr 06 13:08:00 crc kubenswrapper[4790]: I0406 13:08:00.483326 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591348-c8v2l" Apr 06 13:08:01 crc kubenswrapper[4790]: I0406 13:08:01.083233 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591348-c8v2l"] Apr 06 13:08:01 crc kubenswrapper[4790]: I0406 13:08:01.098208 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 13:08:02 crc kubenswrapper[4790]: I0406 13:08:02.114614 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591348-c8v2l" event={"ID":"9aa241a8-092d-4bcb-8378-4a08991ffc2b","Type":"ContainerStarted","Data":"a435bb5fb633ba6e8afdc6c95a3c7e35d47ecb4e084d79aaf51baa58b352303b"} Apr 06 13:08:03 crc kubenswrapper[4790]: I0406 13:08:03.125696 4790 generic.go:334] "Generic (PLEG): container finished" podID="9aa241a8-092d-4bcb-8378-4a08991ffc2b" containerID="de5c26cd4bd3d9f60ce82b6a7021f4f0de31aa66f7aab31e1ba9c91d99b39d2f" exitCode=0 Apr 06 13:08:03 crc kubenswrapper[4790]: I0406 13:08:03.125754 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591348-c8v2l" event={"ID":"9aa241a8-092d-4bcb-8378-4a08991ffc2b","Type":"ContainerDied","Data":"de5c26cd4bd3d9f60ce82b6a7021f4f0de31aa66f7aab31e1ba9c91d99b39d2f"} Apr 06 13:08:04 crc kubenswrapper[4790]: I0406 13:08:04.505124 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591348-c8v2l" Apr 06 13:08:04 crc kubenswrapper[4790]: I0406 13:08:04.680331 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7rjw\" (UniqueName: \"kubernetes.io/projected/9aa241a8-092d-4bcb-8378-4a08991ffc2b-kube-api-access-x7rjw\") pod \"9aa241a8-092d-4bcb-8378-4a08991ffc2b\" (UID: \"9aa241a8-092d-4bcb-8378-4a08991ffc2b\") " Apr 06 13:08:04 crc kubenswrapper[4790]: I0406 13:08:04.685930 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa241a8-092d-4bcb-8378-4a08991ffc2b-kube-api-access-x7rjw" (OuterVolumeSpecName: "kube-api-access-x7rjw") pod "9aa241a8-092d-4bcb-8378-4a08991ffc2b" (UID: "9aa241a8-092d-4bcb-8378-4a08991ffc2b"). InnerVolumeSpecName "kube-api-access-x7rjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:08:04 crc kubenswrapper[4790]: I0406 13:08:04.783425 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7rjw\" (UniqueName: \"kubernetes.io/projected/9aa241a8-092d-4bcb-8378-4a08991ffc2b-kube-api-access-x7rjw\") on node \"crc\" DevicePath \"\"" Apr 06 13:08:05 crc kubenswrapper[4790]: I0406 13:08:05.146942 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591348-c8v2l" event={"ID":"9aa241a8-092d-4bcb-8378-4a08991ffc2b","Type":"ContainerDied","Data":"a435bb5fb633ba6e8afdc6c95a3c7e35d47ecb4e084d79aaf51baa58b352303b"} Apr 06 13:08:05 crc kubenswrapper[4790]: I0406 13:08:05.147005 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a435bb5fb633ba6e8afdc6c95a3c7e35d47ecb4e084d79aaf51baa58b352303b" Apr 06 13:08:05 crc kubenswrapper[4790]: I0406 13:08:05.147018 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591348-c8v2l"
Apr 06 13:08:05 crc kubenswrapper[4790]: I0406 13:08:05.570395 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591342-hj2fd"]
Apr 06 13:08:05 crc kubenswrapper[4790]: I0406 13:08:05.580621 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591342-hj2fd"]
Apr 06 13:08:05 crc kubenswrapper[4790]: I0406 13:08:05.688873 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6fdc37-78cb-4947-8493-7633660b6c40" path="/var/lib/kubelet/pods/de6fdc37-78cb-4947-8493-7633660b6c40/volumes"
Apr 06 13:08:17 crc kubenswrapper[4790]: I0406 13:08:17.821558 4790 scope.go:117] "RemoveContainer" containerID="1d1869fe96a8272933b4653c63b22e5e586dfe1fe7942c4e9a249f727886d3fe"
Apr 06 13:09:09 crc kubenswrapper[4790]: I0406 13:09:09.753393 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 06 13:09:09 crc kubenswrapper[4790]: I0406 13:09:09.753978 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 06 13:09:39 crc kubenswrapper[4790]: I0406 13:09:39.753945 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 06 13:09:39 crc kubenswrapper[4790]: I0406 13:09:39.754513 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 06 13:10:00 crc kubenswrapper[4790]: I0406 13:10:00.146804 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591350-mfz2m"]
Apr 06 13:10:00 crc kubenswrapper[4790]: E0406 13:10:00.148035 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa241a8-092d-4bcb-8378-4a08991ffc2b" containerName="oc"
Apr 06 13:10:00 crc kubenswrapper[4790]: I0406 13:10:00.148051 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa241a8-092d-4bcb-8378-4a08991ffc2b" containerName="oc"
Apr 06 13:10:00 crc kubenswrapper[4790]: I0406 13:10:00.148293 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa241a8-092d-4bcb-8378-4a08991ffc2b" containerName="oc"
Apr 06 13:10:00 crc kubenswrapper[4790]: I0406 13:10:00.149180 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591350-mfz2m"
Apr 06 13:10:00 crc kubenswrapper[4790]: I0406 13:10:00.151084 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 06 13:10:00 crc kubenswrapper[4790]: I0406 13:10:00.151394 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6"
Apr 06 13:10:00 crc kubenswrapper[4790]: I0406 13:10:00.156011 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 06 13:10:00 crc kubenswrapper[4790]: I0406 13:10:00.157000 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591350-mfz2m"]
Apr 06 13:10:00 crc kubenswrapper[4790]: I0406 13:10:00.279849 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f9gb\" (UniqueName: \"kubernetes.io/projected/482b36aa-c70d-4583-94a3-8cceeac309e5-kube-api-access-2f9gb\") pod \"auto-csr-approver-29591350-mfz2m\" (UID: \"482b36aa-c70d-4583-94a3-8cceeac309e5\") " pod="openshift-infra/auto-csr-approver-29591350-mfz2m"
Apr 06 13:10:00 crc kubenswrapper[4790]: I0406 13:10:00.382159 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f9gb\" (UniqueName: \"kubernetes.io/projected/482b36aa-c70d-4583-94a3-8cceeac309e5-kube-api-access-2f9gb\") pod \"auto-csr-approver-29591350-mfz2m\" (UID: \"482b36aa-c70d-4583-94a3-8cceeac309e5\") " pod="openshift-infra/auto-csr-approver-29591350-mfz2m"
Apr 06 13:10:00 crc kubenswrapper[4790]: I0406 13:10:00.422861 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f9gb\" (UniqueName: \"kubernetes.io/projected/482b36aa-c70d-4583-94a3-8cceeac309e5-kube-api-access-2f9gb\") pod \"auto-csr-approver-29591350-mfz2m\" (UID: \"482b36aa-c70d-4583-94a3-8cceeac309e5\") " pod="openshift-infra/auto-csr-approver-29591350-mfz2m"
Apr 06 13:10:00 crc kubenswrapper[4790]: I0406 13:10:00.470946 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591350-mfz2m"
Apr 06 13:10:00 crc kubenswrapper[4790]: I0406 13:10:00.926778 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591350-mfz2m"]
Apr 06 13:10:00 crc kubenswrapper[4790]: W0406 13:10:00.926955 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod482b36aa_c70d_4583_94a3_8cceeac309e5.slice/crio-eca8403bf0996927f069aa23c3ff069cdb13a69619aee2d6baaaa8564560aaf9 WatchSource:0}: Error finding container eca8403bf0996927f069aa23c3ff069cdb13a69619aee2d6baaaa8564560aaf9: Status 404 returned error can't find the container with id eca8403bf0996927f069aa23c3ff069cdb13a69619aee2d6baaaa8564560aaf9
Apr 06 13:10:01 crc kubenswrapper[4790]: I0406 13:10:01.318163 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591350-mfz2m" event={"ID":"482b36aa-c70d-4583-94a3-8cceeac309e5","Type":"ContainerStarted","Data":"eca8403bf0996927f069aa23c3ff069cdb13a69619aee2d6baaaa8564560aaf9"}
Apr 06 13:10:05 crc kubenswrapper[4790]: I0406 13:10:05.359665 4790 generic.go:334] "Generic (PLEG): container finished" podID="482b36aa-c70d-4583-94a3-8cceeac309e5" containerID="15a8cda7cd80cb4e48e1d3ca04cee5da5c682e7fec28dd4b004e1238dc5fe772" exitCode=0
Apr 06 13:10:05 crc kubenswrapper[4790]: I0406 13:10:05.359796 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591350-mfz2m" event={"ID":"482b36aa-c70d-4583-94a3-8cceeac309e5","Type":"ContainerDied","Data":"15a8cda7cd80cb4e48e1d3ca04cee5da5c682e7fec28dd4b004e1238dc5fe772"}
Apr 06 13:10:06 crc kubenswrapper[4790]: I0406 13:10:06.795705 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591350-mfz2m"
Apr 06 13:10:06 crc kubenswrapper[4790]: I0406 13:10:06.913039 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f9gb\" (UniqueName: \"kubernetes.io/projected/482b36aa-c70d-4583-94a3-8cceeac309e5-kube-api-access-2f9gb\") pod \"482b36aa-c70d-4583-94a3-8cceeac309e5\" (UID: \"482b36aa-c70d-4583-94a3-8cceeac309e5\") "
Apr 06 13:10:06 crc kubenswrapper[4790]: I0406 13:10:06.922094 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/482b36aa-c70d-4583-94a3-8cceeac309e5-kube-api-access-2f9gb" (OuterVolumeSpecName: "kube-api-access-2f9gb") pod "482b36aa-c70d-4583-94a3-8cceeac309e5" (UID: "482b36aa-c70d-4583-94a3-8cceeac309e5"). InnerVolumeSpecName "kube-api-access-2f9gb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:10:07 crc kubenswrapper[4790]: I0406 13:10:07.016784 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f9gb\" (UniqueName: \"kubernetes.io/projected/482b36aa-c70d-4583-94a3-8cceeac309e5-kube-api-access-2f9gb\") on node \"crc\" DevicePath \"\""
Apr 06 13:10:07 crc kubenswrapper[4790]: I0406 13:10:07.380956 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591350-mfz2m" event={"ID":"482b36aa-c70d-4583-94a3-8cceeac309e5","Type":"ContainerDied","Data":"eca8403bf0996927f069aa23c3ff069cdb13a69619aee2d6baaaa8564560aaf9"}
Apr 06 13:10:07 crc kubenswrapper[4790]: I0406 13:10:07.381018 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591350-mfz2m"
Apr 06 13:10:07 crc kubenswrapper[4790]: I0406 13:10:07.381027 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eca8403bf0996927f069aa23c3ff069cdb13a69619aee2d6baaaa8564560aaf9"
Apr 06 13:10:07 crc kubenswrapper[4790]: I0406 13:10:07.868382 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591344-q62r6"]
Apr 06 13:10:07 crc kubenswrapper[4790]: I0406 13:10:07.879769 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591344-q62r6"]
Apr 06 13:10:09 crc kubenswrapper[4790]: I0406 13:10:09.688905 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee59c3f-ca35-4883-8273-afab95609f61" path="/var/lib/kubelet/pods/6ee59c3f-ca35-4883-8273-afab95609f61/volumes"
Apr 06 13:10:09 crc kubenswrapper[4790]: I0406 13:10:09.753416 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 06 13:10:09 crc kubenswrapper[4790]: I0406 13:10:09.753508 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 06 13:10:09 crc kubenswrapper[4790]: I0406 13:10:09.753573 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t"
Apr 06 13:10:09 crc kubenswrapper[4790]: I0406 13:10:09.754702 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3cf8615d931e80e8cb6c3ab9e8c979cb735e71e0536e5ac0d22323ffb18dac42"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Apr 06 13:10:09 crc kubenswrapper[4790]: I0406 13:10:09.754778 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://3cf8615d931e80e8cb6c3ab9e8c979cb735e71e0536e5ac0d22323ffb18dac42" gracePeriod=600
Apr 06 13:10:10 crc kubenswrapper[4790]: I0406 13:10:10.417675 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="3cf8615d931e80e8cb6c3ab9e8c979cb735e71e0536e5ac0d22323ffb18dac42" exitCode=0
Apr 06 13:10:10 crc kubenswrapper[4790]: I0406 13:10:10.417734 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"3cf8615d931e80e8cb6c3ab9e8c979cb735e71e0536e5ac0d22323ffb18dac42"}
Apr 06 13:10:10 crc kubenswrapper[4790]: I0406 13:10:10.418638 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"}
Apr 06 13:10:10 crc kubenswrapper[4790]: I0406 13:10:10.418676 4790 scope.go:117] "RemoveContainer" containerID="997aeb263d962fedbbf5f298b3740d3aac9b9c34264e2fb8f90d9199e29d94ba"
Apr 06 13:10:17 crc kubenswrapper[4790]: I0406 13:10:17.924227 4790 scope.go:117] "RemoveContainer" containerID="a1223c199f923ee00122d622aeeb4c235b17bd58ae2f520b2a521a7e7d0627aa"
Apr 06 13:12:00 crc kubenswrapper[4790]: I0406 13:12:00.147068 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591352-phnkb"]
Apr 06 13:12:00 crc kubenswrapper[4790]: E0406 13:12:00.148108 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482b36aa-c70d-4583-94a3-8cceeac309e5" containerName="oc"
Apr 06 13:12:00 crc kubenswrapper[4790]: I0406 13:12:00.148127 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="482b36aa-c70d-4583-94a3-8cceeac309e5" containerName="oc"
Apr 06 13:12:00 crc kubenswrapper[4790]: I0406 13:12:00.148387 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="482b36aa-c70d-4583-94a3-8cceeac309e5" containerName="oc"
Apr 06 13:12:00 crc kubenswrapper[4790]: I0406 13:12:00.149274 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591352-phnkb"
Apr 06 13:12:00 crc kubenswrapper[4790]: I0406 13:12:00.151513 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 06 13:12:00 crc kubenswrapper[4790]: I0406 13:12:00.151783 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6"
Apr 06 13:12:00 crc kubenswrapper[4790]: I0406 13:12:00.156561 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 06 13:12:00 crc kubenswrapper[4790]: I0406 13:12:00.156768 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591352-phnkb"]
Apr 06 13:12:00 crc kubenswrapper[4790]: I0406 13:12:00.269002 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmn8p\" (UniqueName: \"kubernetes.io/projected/db2b0c38-6596-44b6-8955-75b3aa8455d5-kube-api-access-cmn8p\") pod \"auto-csr-approver-29591352-phnkb\" (UID: \"db2b0c38-6596-44b6-8955-75b3aa8455d5\") " pod="openshift-infra/auto-csr-approver-29591352-phnkb"
Apr 06 13:12:00 crc kubenswrapper[4790]: I0406 13:12:00.370880 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmn8p\" (UniqueName: \"kubernetes.io/projected/db2b0c38-6596-44b6-8955-75b3aa8455d5-kube-api-access-cmn8p\") pod \"auto-csr-approver-29591352-phnkb\" (UID: \"db2b0c38-6596-44b6-8955-75b3aa8455d5\") " pod="openshift-infra/auto-csr-approver-29591352-phnkb"
Apr 06 13:12:00 crc kubenswrapper[4790]: I0406 13:12:00.389501 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmn8p\" (UniqueName: \"kubernetes.io/projected/db2b0c38-6596-44b6-8955-75b3aa8455d5-kube-api-access-cmn8p\") pod \"auto-csr-approver-29591352-phnkb\" (UID: \"db2b0c38-6596-44b6-8955-75b3aa8455d5\") " pod="openshift-infra/auto-csr-approver-29591352-phnkb"
Apr 06 13:12:00 crc kubenswrapper[4790]: I0406 13:12:00.468586 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591352-phnkb"
Apr 06 13:12:00 crc kubenswrapper[4790]: I0406 13:12:00.922584 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591352-phnkb"]
Apr 06 13:12:01 crc kubenswrapper[4790]: I0406 13:12:01.520643 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591352-phnkb" event={"ID":"db2b0c38-6596-44b6-8955-75b3aa8455d5","Type":"ContainerStarted","Data":"19949f46de7770da0598dad26677121c8feb14668f64d505063648fae16ebf21"}
Apr 06 13:12:02 crc kubenswrapper[4790]: I0406 13:12:02.533167 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591352-phnkb" event={"ID":"db2b0c38-6596-44b6-8955-75b3aa8455d5","Type":"ContainerStarted","Data":"9a07aca31f69a0b055140458e368808d34688b32ee86886b877d7a4c13c308e5"}
Apr 06 13:12:02 crc kubenswrapper[4790]: I0406 13:12:02.555450 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591352-phnkb" podStartSLOduration=1.401114951 podStartE2EDuration="2.555432545s" podCreationTimestamp="2026-04-06 13:12:00 +0000 UTC" firstStartedPulling="2026-04-06 13:12:00.933986759 +0000 UTC m=+4499.921729625" lastFinishedPulling="2026-04-06 13:12:02.088304353 +0000 UTC m=+4501.076047219" observedRunningTime="2026-04-06 13:12:02.548112409 +0000 UTC m=+4501.535855295" watchObservedRunningTime="2026-04-06 13:12:02.555432545 +0000 UTC m=+4501.543175411"
Apr 06 13:12:03 crc kubenswrapper[4790]: I0406 13:12:03.543801 4790 generic.go:334] "Generic (PLEG): container finished" podID="db2b0c38-6596-44b6-8955-75b3aa8455d5" containerID="9a07aca31f69a0b055140458e368808d34688b32ee86886b877d7a4c13c308e5" exitCode=0
Apr 06 13:12:03 crc kubenswrapper[4790]: I0406 13:12:03.543875 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591352-phnkb" event={"ID":"db2b0c38-6596-44b6-8955-75b3aa8455d5","Type":"ContainerDied","Data":"9a07aca31f69a0b055140458e368808d34688b32ee86886b877d7a4c13c308e5"}
Apr 06 13:12:04 crc kubenswrapper[4790]: I0406 13:12:04.926964 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591352-phnkb"
Apr 06 13:12:05 crc kubenswrapper[4790]: I0406 13:12:05.080142 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmn8p\" (UniqueName: \"kubernetes.io/projected/db2b0c38-6596-44b6-8955-75b3aa8455d5-kube-api-access-cmn8p\") pod \"db2b0c38-6596-44b6-8955-75b3aa8455d5\" (UID: \"db2b0c38-6596-44b6-8955-75b3aa8455d5\") "
Apr 06 13:12:05 crc kubenswrapper[4790]: I0406 13:12:05.096659 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db2b0c38-6596-44b6-8955-75b3aa8455d5-kube-api-access-cmn8p" (OuterVolumeSpecName: "kube-api-access-cmn8p") pod "db2b0c38-6596-44b6-8955-75b3aa8455d5" (UID: "db2b0c38-6596-44b6-8955-75b3aa8455d5"). InnerVolumeSpecName "kube-api-access-cmn8p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:12:05 crc kubenswrapper[4790]: I0406 13:12:05.183867 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmn8p\" (UniqueName: \"kubernetes.io/projected/db2b0c38-6596-44b6-8955-75b3aa8455d5-kube-api-access-cmn8p\") on node \"crc\" DevicePath \"\""
Apr 06 13:12:05 crc kubenswrapper[4790]: I0406 13:12:05.573345 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591352-phnkb" event={"ID":"db2b0c38-6596-44b6-8955-75b3aa8455d5","Type":"ContainerDied","Data":"19949f46de7770da0598dad26677121c8feb14668f64d505063648fae16ebf21"}
Apr 06 13:12:05 crc kubenswrapper[4790]: I0406 13:12:05.573629 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19949f46de7770da0598dad26677121c8feb14668f64d505063648fae16ebf21"
Apr 06 13:12:05 crc kubenswrapper[4790]: I0406 13:12:05.573684 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591352-phnkb"
Apr 06 13:12:06 crc kubenswrapper[4790]: I0406 13:12:06.005100 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591346-7rk9h"]
Apr 06 13:12:06 crc kubenswrapper[4790]: I0406 13:12:06.021104 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591346-7rk9h"]
Apr 06 13:12:07 crc kubenswrapper[4790]: I0406 13:12:07.686302 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d64309a-814a-4fbe-99c6-bf52984869ff" path="/var/lib/kubelet/pods/7d64309a-814a-4fbe-99c6-bf52984869ff/volumes"
Apr 06 13:12:18 crc kubenswrapper[4790]: I0406 13:12:18.039827 4790 scope.go:117] "RemoveContainer" containerID="b5a03476c1a88f43fb0f087dc7bb8f1fb438c17941cfd2149f5f58b070d518af"
Apr 06 13:12:39 crc kubenswrapper[4790]: I0406 13:12:39.753298 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Apr 06 13:12:39 crc kubenswrapper[4790]: I0406 13:12:39.753758 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.201973 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lrss7"]
Apr 06 13:12:51 crc kubenswrapper[4790]: E0406 13:12:51.203275 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db2b0c38-6596-44b6-8955-75b3aa8455d5" containerName="oc"
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.203293 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="db2b0c38-6596-44b6-8955-75b3aa8455d5" containerName="oc"
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.203575 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="db2b0c38-6596-44b6-8955-75b3aa8455d5" containerName="oc"
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.205271 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.214188 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lrss7"]
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.312035 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c828dfb-8e31-4837-9091-65ced2f26bd4-catalog-content\") pod \"certified-operators-lrss7\" (UID: \"8c828dfb-8e31-4837-9091-65ced2f26bd4\") " pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.312134 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c828dfb-8e31-4837-9091-65ced2f26bd4-utilities\") pod \"certified-operators-lrss7\" (UID: \"8c828dfb-8e31-4837-9091-65ced2f26bd4\") " pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.312521 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8xm7\" (UniqueName: \"kubernetes.io/projected/8c828dfb-8e31-4837-9091-65ced2f26bd4-kube-api-access-n8xm7\") pod \"certified-operators-lrss7\" (UID: \"8c828dfb-8e31-4837-9091-65ced2f26bd4\") " pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.414599 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c828dfb-8e31-4837-9091-65ced2f26bd4-catalog-content\") pod \"certified-operators-lrss7\" (UID: \"8c828dfb-8e31-4837-9091-65ced2f26bd4\") " pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.414755 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c828dfb-8e31-4837-9091-65ced2f26bd4-utilities\") pod \"certified-operators-lrss7\" (UID: \"8c828dfb-8e31-4837-9091-65ced2f26bd4\") " pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.414999 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8xm7\" (UniqueName: \"kubernetes.io/projected/8c828dfb-8e31-4837-9091-65ced2f26bd4-kube-api-access-n8xm7\") pod \"certified-operators-lrss7\" (UID: \"8c828dfb-8e31-4837-9091-65ced2f26bd4\") " pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.415099 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c828dfb-8e31-4837-9091-65ced2f26bd4-catalog-content\") pod \"certified-operators-lrss7\" (UID: \"8c828dfb-8e31-4837-9091-65ced2f26bd4\") " pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.415464 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c828dfb-8e31-4837-9091-65ced2f26bd4-utilities\") pod \"certified-operators-lrss7\" (UID: \"8c828dfb-8e31-4837-9091-65ced2f26bd4\") " pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.434558 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8xm7\" (UniqueName: \"kubernetes.io/projected/8c828dfb-8e31-4837-9091-65ced2f26bd4-kube-api-access-n8xm7\") pod \"certified-operators-lrss7\" (UID: \"8c828dfb-8e31-4837-9091-65ced2f26bd4\") " pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:12:51 crc kubenswrapper[4790]: I0406 13:12:51.529880 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:12:52 crc kubenswrapper[4790]: I0406 13:12:52.095179 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lrss7"]
Apr 06 13:12:53 crc kubenswrapper[4790]: I0406 13:12:53.031233 4790 generic.go:334] "Generic (PLEG): container finished" podID="8c828dfb-8e31-4837-9091-65ced2f26bd4" containerID="83e14ff9d7cefeb2a7bd03487582dce4e503d22faf42cdcddaa8cb48ca5dd51a" exitCode=0
Apr 06 13:12:53 crc kubenswrapper[4790]: I0406 13:12:53.031495 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrss7" event={"ID":"8c828dfb-8e31-4837-9091-65ced2f26bd4","Type":"ContainerDied","Data":"83e14ff9d7cefeb2a7bd03487582dce4e503d22faf42cdcddaa8cb48ca5dd51a"}
Apr 06 13:12:53 crc kubenswrapper[4790]: I0406 13:12:53.031521 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrss7" event={"ID":"8c828dfb-8e31-4837-9091-65ced2f26bd4","Type":"ContainerStarted","Data":"c55dfca34920920c3c28cdd8999460f9a68b9aa14c96804366efbaa09ac4b8a1"}
Apr 06 13:12:54 crc kubenswrapper[4790]: I0406 13:12:54.044076 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrss7" event={"ID":"8c828dfb-8e31-4837-9091-65ced2f26bd4","Type":"ContainerStarted","Data":"7645eaacde47f9daa080d2adcca4d42d23f21b531df27d79a3c94649104be8f8"}
Apr 06 13:12:56 crc kubenswrapper[4790]: I0406 13:12:56.066670 4790 generic.go:334] "Generic (PLEG): container finished" podID="8c828dfb-8e31-4837-9091-65ced2f26bd4" containerID="7645eaacde47f9daa080d2adcca4d42d23f21b531df27d79a3c94649104be8f8" exitCode=0
Apr 06 13:12:56 crc kubenswrapper[4790]: I0406 13:12:56.067214 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrss7" event={"ID":"8c828dfb-8e31-4837-9091-65ced2f26bd4","Type":"ContainerDied","Data":"7645eaacde47f9daa080d2adcca4d42d23f21b531df27d79a3c94649104be8f8"}
Apr 06 13:12:57 crc kubenswrapper[4790]: I0406 13:12:57.079280 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrss7" event={"ID":"8c828dfb-8e31-4837-9091-65ced2f26bd4","Type":"ContainerStarted","Data":"59584fc0289e76782c4f122bc41d344d42ba69098fcd61295fe4097ddab0e616"}
Apr 06 13:12:57 crc kubenswrapper[4790]: I0406 13:12:57.105396 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lrss7" podStartSLOduration=2.690172199 podStartE2EDuration="6.105379765s" podCreationTimestamp="2026-04-06 13:12:51 +0000 UTC" firstStartedPulling="2026-04-06 13:12:53.033251566 +0000 UTC m=+4552.020994442" lastFinishedPulling="2026-04-06 13:12:56.448459142 +0000 UTC m=+4555.436202008" observedRunningTime="2026-04-06 13:12:57.096151269 +0000 UTC m=+4556.083894135" watchObservedRunningTime="2026-04-06 13:12:57.105379765 +0000 UTC m=+4556.093122631"
Apr 06 13:13:01 crc kubenswrapper[4790]: I0406 13:13:01.530146 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:13:01 crc kubenswrapper[4790]: I0406 13:13:01.530536 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:13:01 crc kubenswrapper[4790]: I0406 13:13:01.602840 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:13:02 crc kubenswrapper[4790]: I0406 13:13:02.166724 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:13:02 crc kubenswrapper[4790]: I0406 13:13:02.215572 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lrss7"]
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.140483 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lrss7" podUID="8c828dfb-8e31-4837-9091-65ced2f26bd4" containerName="registry-server" containerID="cri-o://59584fc0289e76782c4f122bc41d344d42ba69098fcd61295fe4097ddab0e616" gracePeriod=2
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.254683 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8hnpz"]
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.257479 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hnpz"
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.269708 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hnpz"]
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.385754 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-catalog-content\") pod \"redhat-operators-8hnpz\" (UID: \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\") " pod="openshift-marketplace/redhat-operators-8hnpz"
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.385883 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-utilities\") pod \"redhat-operators-8hnpz\" (UID: \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\") " pod="openshift-marketplace/redhat-operators-8hnpz"
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.386146 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vwj7\" (UniqueName: \"kubernetes.io/projected/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-kube-api-access-6vwj7\") pod \"redhat-operators-8hnpz\" (UID: \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\") " pod="openshift-marketplace/redhat-operators-8hnpz"
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.488954 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vwj7\" (UniqueName: \"kubernetes.io/projected/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-kube-api-access-6vwj7\") pod \"redhat-operators-8hnpz\" (UID: \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\") " pod="openshift-marketplace/redhat-operators-8hnpz"
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.489291 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-catalog-content\") pod \"redhat-operators-8hnpz\" (UID: \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\") " pod="openshift-marketplace/redhat-operators-8hnpz"
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.489387 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-utilities\") pod \"redhat-operators-8hnpz\" (UID: \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\") " pod="openshift-marketplace/redhat-operators-8hnpz"
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.489952 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-utilities\") pod \"redhat-operators-8hnpz\" (UID: \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\") " pod="openshift-marketplace/redhat-operators-8hnpz"
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.489964 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-catalog-content\") pod \"redhat-operators-8hnpz\" (UID: \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\") " pod="openshift-marketplace/redhat-operators-8hnpz"
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.514935 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vwj7\" (UniqueName: \"kubernetes.io/projected/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-kube-api-access-6vwj7\") pod \"redhat-operators-8hnpz\" (UID: \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\") " pod="openshift-marketplace/redhat-operators-8hnpz"
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.619421 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hnpz"
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.714376 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lrss7"
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.797493 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c828dfb-8e31-4837-9091-65ced2f26bd4-utilities\") pod \"8c828dfb-8e31-4837-9091-65ced2f26bd4\" (UID: \"8c828dfb-8e31-4837-9091-65ced2f26bd4\") "
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.798014 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8xm7\" (UniqueName: \"kubernetes.io/projected/8c828dfb-8e31-4837-9091-65ced2f26bd4-kube-api-access-n8xm7\") pod \"8c828dfb-8e31-4837-9091-65ced2f26bd4\" (UID: \"8c828dfb-8e31-4837-9091-65ced2f26bd4\") "
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.798164 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c828dfb-8e31-4837-9091-65ced2f26bd4-catalog-content\") pod \"8c828dfb-8e31-4837-9091-65ced2f26bd4\" (UID: \"8c828dfb-8e31-4837-9091-65ced2f26bd4\") "
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.798491 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c828dfb-8e31-4837-9091-65ced2f26bd4-utilities" (OuterVolumeSpecName: "utilities") pod "8c828dfb-8e31-4837-9091-65ced2f26bd4" (UID: "8c828dfb-8e31-4837-9091-65ced2f26bd4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.802056 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c828dfb-8e31-4837-9091-65ced2f26bd4-utilities\") on node \"crc\" DevicePath \"\""
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.805717 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c828dfb-8e31-4837-9091-65ced2f26bd4-kube-api-access-n8xm7" (OuterVolumeSpecName: "kube-api-access-n8xm7") pod "8c828dfb-8e31-4837-9091-65ced2f26bd4" (UID: "8c828dfb-8e31-4837-9091-65ced2f26bd4"). InnerVolumeSpecName "kube-api-access-n8xm7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.877429 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c828dfb-8e31-4837-9091-65ced2f26bd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c828dfb-8e31-4837-9091-65ced2f26bd4" (UID: "8c828dfb-8e31-4837-9091-65ced2f26bd4"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.904739 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8xm7\" (UniqueName: \"kubernetes.io/projected/8c828dfb-8e31-4837-9091-65ced2f26bd4-kube-api-access-n8xm7\") on node \"crc\" DevicePath \"\"" Apr 06 13:13:04 crc kubenswrapper[4790]: I0406 13:13:04.904773 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c828dfb-8e31-4837-9091-65ced2f26bd4-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.151669 4790 generic.go:334] "Generic (PLEG): container finished" podID="8c828dfb-8e31-4837-9091-65ced2f26bd4" containerID="59584fc0289e76782c4f122bc41d344d42ba69098fcd61295fe4097ddab0e616" exitCode=0 Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.151726 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrss7" event={"ID":"8c828dfb-8e31-4837-9091-65ced2f26bd4","Type":"ContainerDied","Data":"59584fc0289e76782c4f122bc41d344d42ba69098fcd61295fe4097ddab0e616"} Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.151761 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lrss7" event={"ID":"8c828dfb-8e31-4837-9091-65ced2f26bd4","Type":"ContainerDied","Data":"c55dfca34920920c3c28cdd8999460f9a68b9aa14c96804366efbaa09ac4b8a1"} Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.151787 4790 scope.go:117] "RemoveContainer" containerID="59584fc0289e76782c4f122bc41d344d42ba69098fcd61295fe4097ddab0e616" Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.151795 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lrss7" Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.178271 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hnpz"] Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.190079 4790 scope.go:117] "RemoveContainer" containerID="7645eaacde47f9daa080d2adcca4d42d23f21b531df27d79a3c94649104be8f8" Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.201589 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lrss7"] Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.212928 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lrss7"] Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.248706 4790 scope.go:117] "RemoveContainer" containerID="83e14ff9d7cefeb2a7bd03487582dce4e503d22faf42cdcddaa8cb48ca5dd51a" Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.355883 4790 scope.go:117] "RemoveContainer" containerID="59584fc0289e76782c4f122bc41d344d42ba69098fcd61295fe4097ddab0e616" Apr 06 13:13:05 crc kubenswrapper[4790]: E0406 13:13:05.356308 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59584fc0289e76782c4f122bc41d344d42ba69098fcd61295fe4097ddab0e616\": container with ID starting with 59584fc0289e76782c4f122bc41d344d42ba69098fcd61295fe4097ddab0e616 not found: ID does not exist" containerID="59584fc0289e76782c4f122bc41d344d42ba69098fcd61295fe4097ddab0e616" Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.356429 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59584fc0289e76782c4f122bc41d344d42ba69098fcd61295fe4097ddab0e616"} err="failed to get container status \"59584fc0289e76782c4f122bc41d344d42ba69098fcd61295fe4097ddab0e616\": rpc error: code = NotFound desc = could not find container 
\"59584fc0289e76782c4f122bc41d344d42ba69098fcd61295fe4097ddab0e616\": container with ID starting with 59584fc0289e76782c4f122bc41d344d42ba69098fcd61295fe4097ddab0e616 not found: ID does not exist" Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.356529 4790 scope.go:117] "RemoveContainer" containerID="7645eaacde47f9daa080d2adcca4d42d23f21b531df27d79a3c94649104be8f8" Apr 06 13:13:05 crc kubenswrapper[4790]: E0406 13:13:05.357629 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7645eaacde47f9daa080d2adcca4d42d23f21b531df27d79a3c94649104be8f8\": container with ID starting with 7645eaacde47f9daa080d2adcca4d42d23f21b531df27d79a3c94649104be8f8 not found: ID does not exist" containerID="7645eaacde47f9daa080d2adcca4d42d23f21b531df27d79a3c94649104be8f8" Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.357680 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7645eaacde47f9daa080d2adcca4d42d23f21b531df27d79a3c94649104be8f8"} err="failed to get container status \"7645eaacde47f9daa080d2adcca4d42d23f21b531df27d79a3c94649104be8f8\": rpc error: code = NotFound desc = could not find container \"7645eaacde47f9daa080d2adcca4d42d23f21b531df27d79a3c94649104be8f8\": container with ID starting with 7645eaacde47f9daa080d2adcca4d42d23f21b531df27d79a3c94649104be8f8 not found: ID does not exist" Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.357708 4790 scope.go:117] "RemoveContainer" containerID="83e14ff9d7cefeb2a7bd03487582dce4e503d22faf42cdcddaa8cb48ca5dd51a" Apr 06 13:13:05 crc kubenswrapper[4790]: E0406 13:13:05.358066 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83e14ff9d7cefeb2a7bd03487582dce4e503d22faf42cdcddaa8cb48ca5dd51a\": container with ID starting with 83e14ff9d7cefeb2a7bd03487582dce4e503d22faf42cdcddaa8cb48ca5dd51a not found: ID does not exist" 
containerID="83e14ff9d7cefeb2a7bd03487582dce4e503d22faf42cdcddaa8cb48ca5dd51a" Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.358107 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e14ff9d7cefeb2a7bd03487582dce4e503d22faf42cdcddaa8cb48ca5dd51a"} err="failed to get container status \"83e14ff9d7cefeb2a7bd03487582dce4e503d22faf42cdcddaa8cb48ca5dd51a\": rpc error: code = NotFound desc = could not find container \"83e14ff9d7cefeb2a7bd03487582dce4e503d22faf42cdcddaa8cb48ca5dd51a\": container with ID starting with 83e14ff9d7cefeb2a7bd03487582dce4e503d22faf42cdcddaa8cb48ca5dd51a not found: ID does not exist" Apr 06 13:13:05 crc kubenswrapper[4790]: I0406 13:13:05.686931 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c828dfb-8e31-4837-9091-65ced2f26bd4" path="/var/lib/kubelet/pods/8c828dfb-8e31-4837-9091-65ced2f26bd4/volumes" Apr 06 13:13:06 crc kubenswrapper[4790]: I0406 13:13:06.162452 4790 generic.go:334] "Generic (PLEG): container finished" podID="fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" containerID="3845f4fb6fc282172921cc1d52c7335d43f04f7b6804516b9db58b7af3884f5e" exitCode=0 Apr 06 13:13:06 crc kubenswrapper[4790]: I0406 13:13:06.162489 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hnpz" event={"ID":"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff","Type":"ContainerDied","Data":"3845f4fb6fc282172921cc1d52c7335d43f04f7b6804516b9db58b7af3884f5e"} Apr 06 13:13:06 crc kubenswrapper[4790]: I0406 13:13:06.162879 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hnpz" event={"ID":"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff","Type":"ContainerStarted","Data":"f11a5039c6c2b8fb84ca809238efef44d6858be14b34b82331697305cecbf2eb"} Apr 06 13:13:06 crc kubenswrapper[4790]: I0406 13:13:06.164340 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 13:13:07 
crc kubenswrapper[4790]: I0406 13:13:07.175043 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hnpz" event={"ID":"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff","Type":"ContainerStarted","Data":"85f43c7fae333ced74f291f7b0fb71b19145f6e4a0c294c66cbee7f9fb65ee40"} Apr 06 13:13:09 crc kubenswrapper[4790]: I0406 13:13:09.753308 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:13:09 crc kubenswrapper[4790]: I0406 13:13:09.754578 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:13:11 crc kubenswrapper[4790]: I0406 13:13:11.218793 4790 generic.go:334] "Generic (PLEG): container finished" podID="fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" containerID="85f43c7fae333ced74f291f7b0fb71b19145f6e4a0c294c66cbee7f9fb65ee40" exitCode=0 Apr 06 13:13:11 crc kubenswrapper[4790]: I0406 13:13:11.218865 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hnpz" event={"ID":"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff","Type":"ContainerDied","Data":"85f43c7fae333ced74f291f7b0fb71b19145f6e4a0c294c66cbee7f9fb65ee40"} Apr 06 13:13:12 crc kubenswrapper[4790]: I0406 13:13:12.230904 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hnpz" event={"ID":"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff","Type":"ContainerStarted","Data":"3f2b83c009b26f7f8af0a09aa3711004bf87f521580b1c4b9462dc127a4dba02"} Apr 06 13:13:12 crc kubenswrapper[4790]: I0406 
13:13:12.257655 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8hnpz" podStartSLOduration=2.803836462 podStartE2EDuration="8.25762863s" podCreationTimestamp="2026-04-06 13:13:04 +0000 UTC" firstStartedPulling="2026-04-06 13:13:06.164030847 +0000 UTC m=+4565.151773703" lastFinishedPulling="2026-04-06 13:13:11.617823005 +0000 UTC m=+4570.605565871" observedRunningTime="2026-04-06 13:13:12.248499926 +0000 UTC m=+4571.236242792" watchObservedRunningTime="2026-04-06 13:13:12.25762863 +0000 UTC m=+4571.245371496" Apr 06 13:13:14 crc kubenswrapper[4790]: I0406 13:13:14.619953 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8hnpz" Apr 06 13:13:14 crc kubenswrapper[4790]: I0406 13:13:14.620484 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8hnpz" Apr 06 13:13:15 crc kubenswrapper[4790]: I0406 13:13:15.669177 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8hnpz" podUID="fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" containerName="registry-server" probeResult="failure" output=< Apr 06 13:13:15 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 13:13:15 crc kubenswrapper[4790]: > Apr 06 13:13:24 crc kubenswrapper[4790]: I0406 13:13:24.667578 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8hnpz" Apr 06 13:13:24 crc kubenswrapper[4790]: I0406 13:13:24.736352 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8hnpz" Apr 06 13:13:24 crc kubenswrapper[4790]: I0406 13:13:24.904747 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8hnpz"] Apr 06 13:13:26 crc kubenswrapper[4790]: I0406 13:13:26.357764 4790 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8hnpz" podUID="fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" containerName="registry-server" containerID="cri-o://3f2b83c009b26f7f8af0a09aa3711004bf87f521580b1c4b9462dc127a4dba02" gracePeriod=2 Apr 06 13:13:26 crc kubenswrapper[4790]: I0406 13:13:26.868287 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hnpz" Apr 06 13:13:26 crc kubenswrapper[4790]: I0406 13:13:26.949347 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-catalog-content\") pod \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\" (UID: \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\") " Apr 06 13:13:26 crc kubenswrapper[4790]: I0406 13:13:26.949460 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vwj7\" (UniqueName: \"kubernetes.io/projected/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-kube-api-access-6vwj7\") pod \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\" (UID: \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\") " Apr 06 13:13:26 crc kubenswrapper[4790]: I0406 13:13:26.949695 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-utilities\") pod \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\" (UID: \"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff\") " Apr 06 13:13:26 crc kubenswrapper[4790]: I0406 13:13:26.950164 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-utilities" (OuterVolumeSpecName: "utilities") pod "fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" (UID: "fbbfbc5f-c439-4458-a69e-f4c8e57f17ff"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:13:26 crc kubenswrapper[4790]: I0406 13:13:26.950344 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 13:13:26 crc kubenswrapper[4790]: I0406 13:13:26.961147 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-kube-api-access-6vwj7" (OuterVolumeSpecName: "kube-api-access-6vwj7") pod "fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" (UID: "fbbfbc5f-c439-4458-a69e-f4c8e57f17ff"). InnerVolumeSpecName "kube-api-access-6vwj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.051862 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vwj7\" (UniqueName: \"kubernetes.io/projected/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-kube-api-access-6vwj7\") on node \"crc\" DevicePath \"\"" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.093914 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" (UID: "fbbfbc5f-c439-4458-a69e-f4c8e57f17ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.153757 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.371431 4790 generic.go:334] "Generic (PLEG): container finished" podID="fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" containerID="3f2b83c009b26f7f8af0a09aa3711004bf87f521580b1c4b9462dc127a4dba02" exitCode=0 Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.371473 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hnpz" event={"ID":"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff","Type":"ContainerDied","Data":"3f2b83c009b26f7f8af0a09aa3711004bf87f521580b1c4b9462dc127a4dba02"} Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.371524 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hnpz" event={"ID":"fbbfbc5f-c439-4458-a69e-f4c8e57f17ff","Type":"ContainerDied","Data":"f11a5039c6c2b8fb84ca809238efef44d6858be14b34b82331697305cecbf2eb"} Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.371554 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8hnpz" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.371545 4790 scope.go:117] "RemoveContainer" containerID="3f2b83c009b26f7f8af0a09aa3711004bf87f521580b1c4b9462dc127a4dba02" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.394356 4790 scope.go:117] "RemoveContainer" containerID="85f43c7fae333ced74f291f7b0fb71b19145f6e4a0c294c66cbee7f9fb65ee40" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.414973 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8hnpz"] Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.425401 4790 scope.go:117] "RemoveContainer" containerID="3845f4fb6fc282172921cc1d52c7335d43f04f7b6804516b9db58b7af3884f5e" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.426612 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8hnpz"] Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.472154 4790 scope.go:117] "RemoveContainer" containerID="3f2b83c009b26f7f8af0a09aa3711004bf87f521580b1c4b9462dc127a4dba02" Apr 06 13:13:27 crc kubenswrapper[4790]: E0406 13:13:27.472496 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f2b83c009b26f7f8af0a09aa3711004bf87f521580b1c4b9462dc127a4dba02\": container with ID starting with 3f2b83c009b26f7f8af0a09aa3711004bf87f521580b1c4b9462dc127a4dba02 not found: ID does not exist" containerID="3f2b83c009b26f7f8af0a09aa3711004bf87f521580b1c4b9462dc127a4dba02" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.472532 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f2b83c009b26f7f8af0a09aa3711004bf87f521580b1c4b9462dc127a4dba02"} err="failed to get container status \"3f2b83c009b26f7f8af0a09aa3711004bf87f521580b1c4b9462dc127a4dba02\": rpc error: code = NotFound desc = could not find container 
\"3f2b83c009b26f7f8af0a09aa3711004bf87f521580b1c4b9462dc127a4dba02\": container with ID starting with 3f2b83c009b26f7f8af0a09aa3711004bf87f521580b1c4b9462dc127a4dba02 not found: ID does not exist" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.472560 4790 scope.go:117] "RemoveContainer" containerID="85f43c7fae333ced74f291f7b0fb71b19145f6e4a0c294c66cbee7f9fb65ee40" Apr 06 13:13:27 crc kubenswrapper[4790]: E0406 13:13:27.472812 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f43c7fae333ced74f291f7b0fb71b19145f6e4a0c294c66cbee7f9fb65ee40\": container with ID starting with 85f43c7fae333ced74f291f7b0fb71b19145f6e4a0c294c66cbee7f9fb65ee40 not found: ID does not exist" containerID="85f43c7fae333ced74f291f7b0fb71b19145f6e4a0c294c66cbee7f9fb65ee40" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.472882 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f43c7fae333ced74f291f7b0fb71b19145f6e4a0c294c66cbee7f9fb65ee40"} err="failed to get container status \"85f43c7fae333ced74f291f7b0fb71b19145f6e4a0c294c66cbee7f9fb65ee40\": rpc error: code = NotFound desc = could not find container \"85f43c7fae333ced74f291f7b0fb71b19145f6e4a0c294c66cbee7f9fb65ee40\": container with ID starting with 85f43c7fae333ced74f291f7b0fb71b19145f6e4a0c294c66cbee7f9fb65ee40 not found: ID does not exist" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.472901 4790 scope.go:117] "RemoveContainer" containerID="3845f4fb6fc282172921cc1d52c7335d43f04f7b6804516b9db58b7af3884f5e" Apr 06 13:13:27 crc kubenswrapper[4790]: E0406 13:13:27.473397 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3845f4fb6fc282172921cc1d52c7335d43f04f7b6804516b9db58b7af3884f5e\": container with ID starting with 3845f4fb6fc282172921cc1d52c7335d43f04f7b6804516b9db58b7af3884f5e not found: ID does not exist" 
containerID="3845f4fb6fc282172921cc1d52c7335d43f04f7b6804516b9db58b7af3884f5e" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.473466 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3845f4fb6fc282172921cc1d52c7335d43f04f7b6804516b9db58b7af3884f5e"} err="failed to get container status \"3845f4fb6fc282172921cc1d52c7335d43f04f7b6804516b9db58b7af3884f5e\": rpc error: code = NotFound desc = could not find container \"3845f4fb6fc282172921cc1d52c7335d43f04f7b6804516b9db58b7af3884f5e\": container with ID starting with 3845f4fb6fc282172921cc1d52c7335d43f04f7b6804516b9db58b7af3884f5e not found: ID does not exist" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.686396 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" path="/var/lib/kubelet/pods/fbbfbc5f-c439-4458-a69e-f4c8e57f17ff/volumes" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.717696 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t2q2q"] Apr 06 13:13:27 crc kubenswrapper[4790]: E0406 13:13:27.718224 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" containerName="extract-content" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.718248 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" containerName="extract-content" Apr 06 13:13:27 crc kubenswrapper[4790]: E0406 13:13:27.718282 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c828dfb-8e31-4837-9091-65ced2f26bd4" containerName="registry-server" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.718289 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c828dfb-8e31-4837-9091-65ced2f26bd4" containerName="registry-server" Apr 06 13:13:27 crc kubenswrapper[4790]: E0406 13:13:27.718300 4790 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" containerName="registry-server" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.718307 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" containerName="registry-server" Apr 06 13:13:27 crc kubenswrapper[4790]: E0406 13:13:27.718316 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c828dfb-8e31-4837-9091-65ced2f26bd4" containerName="extract-utilities" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.718323 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c828dfb-8e31-4837-9091-65ced2f26bd4" containerName="extract-utilities" Apr 06 13:13:27 crc kubenswrapper[4790]: E0406 13:13:27.718340 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c828dfb-8e31-4837-9091-65ced2f26bd4" containerName="extract-content" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.718347 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c828dfb-8e31-4837-9091-65ced2f26bd4" containerName="extract-content" Apr 06 13:13:27 crc kubenswrapper[4790]: E0406 13:13:27.718358 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" containerName="extract-utilities" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.718363 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" containerName="extract-utilities" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.718552 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c828dfb-8e31-4837-9091-65ced2f26bd4" containerName="registry-server" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.718581 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbbfbc5f-c439-4458-a69e-f4c8e57f17ff" containerName="registry-server" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.720034 4790 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.727814 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t2q2q"] Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.766746 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-utilities\") pod \"redhat-marketplace-t2q2q\" (UID: \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\") " pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.767048 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-catalog-content\") pod \"redhat-marketplace-t2q2q\" (UID: \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\") " pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.767322 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fthdz\" (UniqueName: \"kubernetes.io/projected/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-kube-api-access-fthdz\") pod \"redhat-marketplace-t2q2q\" (UID: \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\") " pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.869506 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-utilities\") pod \"redhat-marketplace-t2q2q\" (UID: \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\") " pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.870052 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-utilities\") pod \"redhat-marketplace-t2q2q\" (UID: \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\") " pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.870329 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-catalog-content\") pod \"redhat-marketplace-t2q2q\" (UID: \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\") " pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.870592 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fthdz\" (UniqueName: \"kubernetes.io/projected/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-kube-api-access-fthdz\") pod \"redhat-marketplace-t2q2q\" (UID: \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\") " pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.870760 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-catalog-content\") pod \"redhat-marketplace-t2q2q\" (UID: \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\") " pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:27 crc kubenswrapper[4790]: I0406 13:13:27.893139 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fthdz\" (UniqueName: \"kubernetes.io/projected/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-kube-api-access-fthdz\") pod \"redhat-marketplace-t2q2q\" (UID: \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\") " pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:28 crc kubenswrapper[4790]: I0406 13:13:28.038319 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:28 crc kubenswrapper[4790]: I0406 13:13:28.557014 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t2q2q"] Apr 06 13:13:29 crc kubenswrapper[4790]: I0406 13:13:29.394941 4790 generic.go:334] "Generic (PLEG): container finished" podID="f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" containerID="89dab3072e20e62ab20752f6d167c9436b96d8a3a4ababc9a943746e121213b4" exitCode=0 Apr 06 13:13:29 crc kubenswrapper[4790]: I0406 13:13:29.395013 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2q2q" event={"ID":"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc","Type":"ContainerDied","Data":"89dab3072e20e62ab20752f6d167c9436b96d8a3a4ababc9a943746e121213b4"} Apr 06 13:13:29 crc kubenswrapper[4790]: I0406 13:13:29.395271 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2q2q" event={"ID":"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc","Type":"ContainerStarted","Data":"d6e75cafde0364ca1b7bd00718d78f6b7d93fd26a74bfac2410249992dbac942"} Apr 06 13:13:30 crc kubenswrapper[4790]: I0406 13:13:30.449450 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2q2q" event={"ID":"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc","Type":"ContainerStarted","Data":"1e4f8506bb55e6012561998dd326287c1c60c62eb09ebd2000a1abd07d4c85c8"} Apr 06 13:13:31 crc kubenswrapper[4790]: I0406 13:13:31.460973 4790 generic.go:334] "Generic (PLEG): container finished" podID="f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" containerID="1e4f8506bb55e6012561998dd326287c1c60c62eb09ebd2000a1abd07d4c85c8" exitCode=0 Apr 06 13:13:31 crc kubenswrapper[4790]: I0406 13:13:31.461028 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2q2q" 
event={"ID":"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc","Type":"ContainerDied","Data":"1e4f8506bb55e6012561998dd326287c1c60c62eb09ebd2000a1abd07d4c85c8"} Apr 06 13:13:32 crc kubenswrapper[4790]: I0406 13:13:32.475249 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2q2q" event={"ID":"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc","Type":"ContainerStarted","Data":"6a1ab0352171caf54fa4f6e1195197703cbd980cf430ad575cd6a6dfa5124de5"} Apr 06 13:13:32 crc kubenswrapper[4790]: I0406 13:13:32.499627 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t2q2q" podStartSLOduration=2.88865286 podStartE2EDuration="5.499609135s" podCreationTimestamp="2026-04-06 13:13:27 +0000 UTC" firstStartedPulling="2026-04-06 13:13:29.397578738 +0000 UTC m=+4588.385321614" lastFinishedPulling="2026-04-06 13:13:32.008535023 +0000 UTC m=+4590.996277889" observedRunningTime="2026-04-06 13:13:32.491803957 +0000 UTC m=+4591.479546873" watchObservedRunningTime="2026-04-06 13:13:32.499609135 +0000 UTC m=+4591.487352001" Apr 06 13:13:38 crc kubenswrapper[4790]: I0406 13:13:38.038918 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:38 crc kubenswrapper[4790]: I0406 13:13:38.039281 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:38 crc kubenswrapper[4790]: I0406 13:13:38.086860 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:38 crc kubenswrapper[4790]: I0406 13:13:38.575339 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:38 crc kubenswrapper[4790]: I0406 13:13:38.620722 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-t2q2q"] Apr 06 13:13:39 crc kubenswrapper[4790]: I0406 13:13:39.753937 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:13:39 crc kubenswrapper[4790]: I0406 13:13:39.754006 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:13:39 crc kubenswrapper[4790]: I0406 13:13:39.754059 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 13:13:39 crc kubenswrapper[4790]: I0406 13:13:39.754943 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 13:13:39 crc kubenswrapper[4790]: I0406 13:13:39.755013 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270" gracePeriod=600 Apr 06 13:13:40 crc kubenswrapper[4790]: E0406 13:13:40.172637 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:13:40 crc kubenswrapper[4790]: I0406 13:13:40.548566 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270" exitCode=0 Apr 06 13:13:40 crc kubenswrapper[4790]: I0406 13:13:40.548623 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"} Apr 06 13:13:40 crc kubenswrapper[4790]: I0406 13:13:40.548931 4790 scope.go:117] "RemoveContainer" containerID="3cf8615d931e80e8cb6c3ab9e8c979cb735e71e0536e5ac0d22323ffb18dac42" Apr 06 13:13:40 crc kubenswrapper[4790]: I0406 13:13:40.549071 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t2q2q" podUID="f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" containerName="registry-server" containerID="cri-o://6a1ab0352171caf54fa4f6e1195197703cbd980cf430ad575cd6a6dfa5124de5" gracePeriod=2 Apr 06 13:13:40 crc kubenswrapper[4790]: I0406 13:13:40.550130 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270" Apr 06 13:13:40 crc kubenswrapper[4790]: E0406 13:13:40.550388 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.118621 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.246031 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fthdz\" (UniqueName: \"kubernetes.io/projected/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-kube-api-access-fthdz\") pod \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\" (UID: \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\") " Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.246120 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-utilities\") pod \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\" (UID: \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\") " Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.246199 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-catalog-content\") pod \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\" (UID: \"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc\") " Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.246975 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-utilities" (OuterVolumeSpecName: "utilities") pod "f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" (UID: "f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.251283 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-kube-api-access-fthdz" (OuterVolumeSpecName: "kube-api-access-fthdz") pod "f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" (UID: "f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc"). InnerVolumeSpecName "kube-api-access-fthdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.279786 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" (UID: "f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.348751 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.348791 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fthdz\" (UniqueName: \"kubernetes.io/projected/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-kube-api-access-fthdz\") on node \"crc\" DevicePath \"\"" Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.348805 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.563797 4790 generic.go:334] "Generic (PLEG): container finished" podID="f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" 
containerID="6a1ab0352171caf54fa4f6e1195197703cbd980cf430ad575cd6a6dfa5124de5" exitCode=0 Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.563860 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2q2q" event={"ID":"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc","Type":"ContainerDied","Data":"6a1ab0352171caf54fa4f6e1195197703cbd980cf430ad575cd6a6dfa5124de5"} Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.563891 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t2q2q" event={"ID":"f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc","Type":"ContainerDied","Data":"d6e75cafde0364ca1b7bd00718d78f6b7d93fd26a74bfac2410249992dbac942"} Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.563887 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t2q2q" Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.563909 4790 scope.go:117] "RemoveContainer" containerID="6a1ab0352171caf54fa4f6e1195197703cbd980cf430ad575cd6a6dfa5124de5" Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.595311 4790 scope.go:117] "RemoveContainer" containerID="1e4f8506bb55e6012561998dd326287c1c60c62eb09ebd2000a1abd07d4c85c8" Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.607597 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t2q2q"] Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.617317 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t2q2q"] Apr 06 13:13:41 crc kubenswrapper[4790]: I0406 13:13:41.693797 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" path="/var/lib/kubelet/pods/f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc/volumes" Apr 06 13:13:42 crc kubenswrapper[4790]: I0406 13:13:42.162341 4790 scope.go:117] "RemoveContainer" 
containerID="89dab3072e20e62ab20752f6d167c9436b96d8a3a4ababc9a943746e121213b4" Apr 06 13:13:42 crc kubenswrapper[4790]: I0406 13:13:42.309155 4790 scope.go:117] "RemoveContainer" containerID="6a1ab0352171caf54fa4f6e1195197703cbd980cf430ad575cd6a6dfa5124de5" Apr 06 13:13:42 crc kubenswrapper[4790]: E0406 13:13:42.309671 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a1ab0352171caf54fa4f6e1195197703cbd980cf430ad575cd6a6dfa5124de5\": container with ID starting with 6a1ab0352171caf54fa4f6e1195197703cbd980cf430ad575cd6a6dfa5124de5 not found: ID does not exist" containerID="6a1ab0352171caf54fa4f6e1195197703cbd980cf430ad575cd6a6dfa5124de5" Apr 06 13:13:42 crc kubenswrapper[4790]: I0406 13:13:42.309728 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1ab0352171caf54fa4f6e1195197703cbd980cf430ad575cd6a6dfa5124de5"} err="failed to get container status \"6a1ab0352171caf54fa4f6e1195197703cbd980cf430ad575cd6a6dfa5124de5\": rpc error: code = NotFound desc = could not find container \"6a1ab0352171caf54fa4f6e1195197703cbd980cf430ad575cd6a6dfa5124de5\": container with ID starting with 6a1ab0352171caf54fa4f6e1195197703cbd980cf430ad575cd6a6dfa5124de5 not found: ID does not exist" Apr 06 13:13:42 crc kubenswrapper[4790]: I0406 13:13:42.309762 4790 scope.go:117] "RemoveContainer" containerID="1e4f8506bb55e6012561998dd326287c1c60c62eb09ebd2000a1abd07d4c85c8" Apr 06 13:13:42 crc kubenswrapper[4790]: E0406 13:13:42.310191 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e4f8506bb55e6012561998dd326287c1c60c62eb09ebd2000a1abd07d4c85c8\": container with ID starting with 1e4f8506bb55e6012561998dd326287c1c60c62eb09ebd2000a1abd07d4c85c8 not found: ID does not exist" containerID="1e4f8506bb55e6012561998dd326287c1c60c62eb09ebd2000a1abd07d4c85c8" Apr 06 13:13:42 crc 
kubenswrapper[4790]: I0406 13:13:42.310222 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4f8506bb55e6012561998dd326287c1c60c62eb09ebd2000a1abd07d4c85c8"} err="failed to get container status \"1e4f8506bb55e6012561998dd326287c1c60c62eb09ebd2000a1abd07d4c85c8\": rpc error: code = NotFound desc = could not find container \"1e4f8506bb55e6012561998dd326287c1c60c62eb09ebd2000a1abd07d4c85c8\": container with ID starting with 1e4f8506bb55e6012561998dd326287c1c60c62eb09ebd2000a1abd07d4c85c8 not found: ID does not exist" Apr 06 13:13:42 crc kubenswrapper[4790]: I0406 13:13:42.310243 4790 scope.go:117] "RemoveContainer" containerID="89dab3072e20e62ab20752f6d167c9436b96d8a3a4ababc9a943746e121213b4" Apr 06 13:13:42 crc kubenswrapper[4790]: E0406 13:13:42.310627 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89dab3072e20e62ab20752f6d167c9436b96d8a3a4ababc9a943746e121213b4\": container with ID starting with 89dab3072e20e62ab20752f6d167c9436b96d8a3a4ababc9a943746e121213b4 not found: ID does not exist" containerID="89dab3072e20e62ab20752f6d167c9436b96d8a3a4ababc9a943746e121213b4" Apr 06 13:13:42 crc kubenswrapper[4790]: I0406 13:13:42.310671 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89dab3072e20e62ab20752f6d167c9436b96d8a3a4ababc9a943746e121213b4"} err="failed to get container status \"89dab3072e20e62ab20752f6d167c9436b96d8a3a4ababc9a943746e121213b4\": rpc error: code = NotFound desc = could not find container \"89dab3072e20e62ab20752f6d167c9436b96d8a3a4ababc9a943746e121213b4\": container with ID starting with 89dab3072e20e62ab20752f6d167c9436b96d8a3a4ababc9a943746e121213b4 not found: ID does not exist" Apr 06 13:13:52 crc kubenswrapper[4790]: I0406 13:13:52.675622 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270" Apr 06 
13:13:52 crc kubenswrapper[4790]: E0406 13:13:52.676414 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:14:00 crc kubenswrapper[4790]: I0406 13:14:00.144551 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591354-4vfkm"] Apr 06 13:14:00 crc kubenswrapper[4790]: E0406 13:14:00.145605 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" containerName="registry-server" Apr 06 13:14:00 crc kubenswrapper[4790]: I0406 13:14:00.145625 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" containerName="registry-server" Apr 06 13:14:00 crc kubenswrapper[4790]: E0406 13:14:00.145665 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" containerName="extract-utilities" Apr 06 13:14:00 crc kubenswrapper[4790]: I0406 13:14:00.145674 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" containerName="extract-utilities" Apr 06 13:14:00 crc kubenswrapper[4790]: E0406 13:14:00.145696 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" containerName="extract-content" Apr 06 13:14:00 crc kubenswrapper[4790]: I0406 13:14:00.145703 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" containerName="extract-content" Apr 06 13:14:00 crc kubenswrapper[4790]: I0406 13:14:00.145972 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f609f9c9-cae7-4e9d-b9c0-10158ffc3dcc" containerName="registry-server" Apr 06 13:14:00 crc kubenswrapper[4790]: I0406 13:14:00.146800 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591354-4vfkm" Apr 06 13:14:00 crc kubenswrapper[4790]: I0406 13:14:00.148916 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:14:00 crc kubenswrapper[4790]: I0406 13:14:00.149056 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:14:00 crc kubenswrapper[4790]: I0406 13:14:00.151103 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:14:00 crc kubenswrapper[4790]: I0406 13:14:00.153796 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591354-4vfkm"] Apr 06 13:14:00 crc kubenswrapper[4790]: I0406 13:14:00.241103 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6qwz\" (UniqueName: \"kubernetes.io/projected/8e9230d8-d435-4c19-80b5-c8f3cfb6b793-kube-api-access-l6qwz\") pod \"auto-csr-approver-29591354-4vfkm\" (UID: \"8e9230d8-d435-4c19-80b5-c8f3cfb6b793\") " pod="openshift-infra/auto-csr-approver-29591354-4vfkm" Apr 06 13:14:00 crc kubenswrapper[4790]: I0406 13:14:00.343091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6qwz\" (UniqueName: \"kubernetes.io/projected/8e9230d8-d435-4c19-80b5-c8f3cfb6b793-kube-api-access-l6qwz\") pod \"auto-csr-approver-29591354-4vfkm\" (UID: \"8e9230d8-d435-4c19-80b5-c8f3cfb6b793\") " pod="openshift-infra/auto-csr-approver-29591354-4vfkm" Apr 06 13:14:00 crc kubenswrapper[4790]: I0406 13:14:00.639998 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6qwz\" 
(UniqueName: \"kubernetes.io/projected/8e9230d8-d435-4c19-80b5-c8f3cfb6b793-kube-api-access-l6qwz\") pod \"auto-csr-approver-29591354-4vfkm\" (UID: \"8e9230d8-d435-4c19-80b5-c8f3cfb6b793\") " pod="openshift-infra/auto-csr-approver-29591354-4vfkm" Apr 06 13:14:00 crc kubenswrapper[4790]: I0406 13:14:00.767084 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591354-4vfkm" Apr 06 13:14:01 crc kubenswrapper[4790]: I0406 13:14:01.253816 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591354-4vfkm"] Apr 06 13:14:01 crc kubenswrapper[4790]: I0406 13:14:01.770928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591354-4vfkm" event={"ID":"8e9230d8-d435-4c19-80b5-c8f3cfb6b793","Type":"ContainerStarted","Data":"6bb94f2cc7d8f20e9b19408a93ab0b6b0e61521eea5851169c419ae52ac02307"} Apr 06 13:14:03 crc kubenswrapper[4790]: I0406 13:14:03.793513 4790 generic.go:334] "Generic (PLEG): container finished" podID="8e9230d8-d435-4c19-80b5-c8f3cfb6b793" containerID="a571c5ec1f605810ac1babd84f729f6ac5217facaabc7e8af1c7262bcc764da0" exitCode=0 Apr 06 13:14:03 crc kubenswrapper[4790]: I0406 13:14:03.794468 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591354-4vfkm" event={"ID":"8e9230d8-d435-4c19-80b5-c8f3cfb6b793","Type":"ContainerDied","Data":"a571c5ec1f605810ac1babd84f729f6ac5217facaabc7e8af1c7262bcc764da0"} Apr 06 13:14:04 crc kubenswrapper[4790]: I0406 13:14:04.675842 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270" Apr 06 13:14:04 crc kubenswrapper[4790]: E0406 13:14:04.676407 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:14:05 crc kubenswrapper[4790]: I0406 13:14:05.151947 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591354-4vfkm" Apr 06 13:14:05 crc kubenswrapper[4790]: I0406 13:14:05.261545 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6qwz\" (UniqueName: \"kubernetes.io/projected/8e9230d8-d435-4c19-80b5-c8f3cfb6b793-kube-api-access-l6qwz\") pod \"8e9230d8-d435-4c19-80b5-c8f3cfb6b793\" (UID: \"8e9230d8-d435-4c19-80b5-c8f3cfb6b793\") " Apr 06 13:14:05 crc kubenswrapper[4790]: I0406 13:14:05.268218 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e9230d8-d435-4c19-80b5-c8f3cfb6b793-kube-api-access-l6qwz" (OuterVolumeSpecName: "kube-api-access-l6qwz") pod "8e9230d8-d435-4c19-80b5-c8f3cfb6b793" (UID: "8e9230d8-d435-4c19-80b5-c8f3cfb6b793"). InnerVolumeSpecName "kube-api-access-l6qwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:14:05 crc kubenswrapper[4790]: I0406 13:14:05.364313 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6qwz\" (UniqueName: \"kubernetes.io/projected/8e9230d8-d435-4c19-80b5-c8f3cfb6b793-kube-api-access-l6qwz\") on node \"crc\" DevicePath \"\"" Apr 06 13:14:05 crc kubenswrapper[4790]: I0406 13:14:05.814085 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591354-4vfkm" event={"ID":"8e9230d8-d435-4c19-80b5-c8f3cfb6b793","Type":"ContainerDied","Data":"6bb94f2cc7d8f20e9b19408a93ab0b6b0e61521eea5851169c419ae52ac02307"} Apr 06 13:14:05 crc kubenswrapper[4790]: I0406 13:14:05.814127 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bb94f2cc7d8f20e9b19408a93ab0b6b0e61521eea5851169c419ae52ac02307" Apr 06 13:14:05 crc kubenswrapper[4790]: I0406 13:14:05.814164 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591354-4vfkm" Apr 06 13:14:06 crc kubenswrapper[4790]: I0406 13:14:06.223638 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591348-c8v2l"] Apr 06 13:14:06 crc kubenswrapper[4790]: I0406 13:14:06.232986 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591348-c8v2l"] Apr 06 13:14:07 crc kubenswrapper[4790]: I0406 13:14:07.687280 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa241a8-092d-4bcb-8378-4a08991ffc2b" path="/var/lib/kubelet/pods/9aa241a8-092d-4bcb-8378-4a08991ffc2b/volumes" Apr 06 13:14:18 crc kubenswrapper[4790]: I0406 13:14:18.156433 4790 scope.go:117] "RemoveContainer" containerID="de5c26cd4bd3d9f60ce82b6a7021f4f0de31aa66f7aab31e1ba9c91d99b39d2f" Apr 06 13:14:19 crc kubenswrapper[4790]: I0406 13:14:19.676808 4790 scope.go:117] "RemoveContainer" 
containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270" Apr 06 13:14:19 crc kubenswrapper[4790]: E0406 13:14:19.677978 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:14:33 crc kubenswrapper[4790]: I0406 13:14:33.675899 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270" Apr 06 13:14:33 crc kubenswrapper[4790]: E0406 13:14:33.676597 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:14:47 crc kubenswrapper[4790]: I0406 13:14:47.678578 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270" Apr 06 13:14:47 crc kubenswrapper[4790]: E0406 13:14:47.679606 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.149830 4790 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"]
Apr 06 13:15:00 crc kubenswrapper[4790]: E0406 13:15:00.150806 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9230d8-d435-4c19-80b5-c8f3cfb6b793" containerName="oc"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.150819 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9230d8-d435-4c19-80b5-c8f3cfb6b793" containerName="oc"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.151082 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e9230d8-d435-4c19-80b5-c8f3cfb6b793" containerName="oc"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.151758 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.153608 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.153821 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.168635 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"]
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.250999 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fae07b18-f923-48bb-ba4a-2e01093a512d-secret-volume\") pod \"collect-profiles-29591355-r7gqg\" (UID: \"fae07b18-f923-48bb-ba4a-2e01093a512d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.251726 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fae07b18-f923-48bb-ba4a-2e01093a512d-config-volume\") pod \"collect-profiles-29591355-r7gqg\" (UID: \"fae07b18-f923-48bb-ba4a-2e01093a512d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.251804 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2m2c\" (UniqueName: \"kubernetes.io/projected/fae07b18-f923-48bb-ba4a-2e01093a512d-kube-api-access-v2m2c\") pod \"collect-profiles-29591355-r7gqg\" (UID: \"fae07b18-f923-48bb-ba4a-2e01093a512d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.354433 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fae07b18-f923-48bb-ba4a-2e01093a512d-config-volume\") pod \"collect-profiles-29591355-r7gqg\" (UID: \"fae07b18-f923-48bb-ba4a-2e01093a512d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.355413 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2m2c\" (UniqueName: \"kubernetes.io/projected/fae07b18-f923-48bb-ba4a-2e01093a512d-kube-api-access-v2m2c\") pod \"collect-profiles-29591355-r7gqg\" (UID: \"fae07b18-f923-48bb-ba4a-2e01093a512d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.355330 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fae07b18-f923-48bb-ba4a-2e01093a512d-config-volume\") pod \"collect-profiles-29591355-r7gqg\" (UID: \"fae07b18-f923-48bb-ba4a-2e01093a512d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.356052 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fae07b18-f923-48bb-ba4a-2e01093a512d-secret-volume\") pod \"collect-profiles-29591355-r7gqg\" (UID: \"fae07b18-f923-48bb-ba4a-2e01093a512d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.368866 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fae07b18-f923-48bb-ba4a-2e01093a512d-secret-volume\") pod \"collect-profiles-29591355-r7gqg\" (UID: \"fae07b18-f923-48bb-ba4a-2e01093a512d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.375704 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2m2c\" (UniqueName: \"kubernetes.io/projected/fae07b18-f923-48bb-ba4a-2e01093a512d-kube-api-access-v2m2c\") pod \"collect-profiles-29591355-r7gqg\" (UID: \"fae07b18-f923-48bb-ba4a-2e01093a512d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.481546 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"
Apr 06 13:15:00 crc kubenswrapper[4790]: I0406 13:15:00.961256 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"]
Apr 06 13:15:01 crc kubenswrapper[4790]: I0406 13:15:01.390043 4790 generic.go:334] "Generic (PLEG): container finished" podID="fae07b18-f923-48bb-ba4a-2e01093a512d" containerID="275b6604e0efce4bcbb44dca7ea59c59fdd49fdbaceeba020fbe697e32d2aa0b" exitCode=0
Apr 06 13:15:01 crc kubenswrapper[4790]: I0406 13:15:01.390215 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg" event={"ID":"fae07b18-f923-48bb-ba4a-2e01093a512d","Type":"ContainerDied","Data":"275b6604e0efce4bcbb44dca7ea59c59fdd49fdbaceeba020fbe697e32d2aa0b"}
Apr 06 13:15:01 crc kubenswrapper[4790]: I0406 13:15:01.390286 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg" event={"ID":"fae07b18-f923-48bb-ba4a-2e01093a512d","Type":"ContainerStarted","Data":"a19980141e427b0268b4e3172ae17acc631f9ee51b081891bce42b9e360e6ead"}
Apr 06 13:15:02 crc kubenswrapper[4790]: I0406 13:15:02.675624 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:15:02 crc kubenswrapper[4790]: E0406 13:15:02.676450 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:15:02 crc kubenswrapper[4790]: I0406 13:15:02.777635 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"
Apr 06 13:15:02 crc kubenswrapper[4790]: I0406 13:15:02.907022 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fae07b18-f923-48bb-ba4a-2e01093a512d-secret-volume\") pod \"fae07b18-f923-48bb-ba4a-2e01093a512d\" (UID: \"fae07b18-f923-48bb-ba4a-2e01093a512d\") "
Apr 06 13:15:02 crc kubenswrapper[4790]: I0406 13:15:02.907336 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fae07b18-f923-48bb-ba4a-2e01093a512d-config-volume\") pod \"fae07b18-f923-48bb-ba4a-2e01093a512d\" (UID: \"fae07b18-f923-48bb-ba4a-2e01093a512d\") "
Apr 06 13:15:02 crc kubenswrapper[4790]: I0406 13:15:02.907442 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2m2c\" (UniqueName: \"kubernetes.io/projected/fae07b18-f923-48bb-ba4a-2e01093a512d-kube-api-access-v2m2c\") pod \"fae07b18-f923-48bb-ba4a-2e01093a512d\" (UID: \"fae07b18-f923-48bb-ba4a-2e01093a512d\") "
Apr 06 13:15:02 crc kubenswrapper[4790]: I0406 13:15:02.908676 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fae07b18-f923-48bb-ba4a-2e01093a512d-config-volume" (OuterVolumeSpecName: "config-volume") pod "fae07b18-f923-48bb-ba4a-2e01093a512d" (UID: "fae07b18-f923-48bb-ba4a-2e01093a512d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 13:15:02 crc kubenswrapper[4790]: I0406 13:15:02.913101 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae07b18-f923-48bb-ba4a-2e01093a512d-kube-api-access-v2m2c" (OuterVolumeSpecName: "kube-api-access-v2m2c") pod "fae07b18-f923-48bb-ba4a-2e01093a512d" (UID: "fae07b18-f923-48bb-ba4a-2e01093a512d"). InnerVolumeSpecName "kube-api-access-v2m2c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:15:02 crc kubenswrapper[4790]: I0406 13:15:02.913984 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae07b18-f923-48bb-ba4a-2e01093a512d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fae07b18-f923-48bb-ba4a-2e01093a512d" (UID: "fae07b18-f923-48bb-ba4a-2e01093a512d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 13:15:03 crc kubenswrapper[4790]: I0406 13:15:03.010423 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fae07b18-f923-48bb-ba4a-2e01093a512d-config-volume\") on node \"crc\" DevicePath \"\""
Apr 06 13:15:03 crc kubenswrapper[4790]: I0406 13:15:03.010461 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2m2c\" (UniqueName: \"kubernetes.io/projected/fae07b18-f923-48bb-ba4a-2e01093a512d-kube-api-access-v2m2c\") on node \"crc\" DevicePath \"\""
Apr 06 13:15:03 crc kubenswrapper[4790]: I0406 13:15:03.010473 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fae07b18-f923-48bb-ba4a-2e01093a512d-secret-volume\") on node \"crc\" DevicePath \"\""
Apr 06 13:15:03 crc kubenswrapper[4790]: I0406 13:15:03.413381 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg" event={"ID":"fae07b18-f923-48bb-ba4a-2e01093a512d","Type":"ContainerDied","Data":"a19980141e427b0268b4e3172ae17acc631f9ee51b081891bce42b9e360e6ead"}
Apr 06 13:15:03 crc kubenswrapper[4790]: I0406 13:15:03.413417 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a19980141e427b0268b4e3172ae17acc631f9ee51b081891bce42b9e360e6ead"
Apr 06 13:15:03 crc kubenswrapper[4790]: I0406 13:15:03.413473 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591355-r7gqg"
Apr 06 13:15:03 crc kubenswrapper[4790]: I0406 13:15:03.845212 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4"]
Apr 06 13:15:03 crc kubenswrapper[4790]: I0406 13:15:03.855473 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591310-d74b4"]
Apr 06 13:15:05 crc kubenswrapper[4790]: I0406 13:15:05.686016 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68adc8df-106b-4049-82b1-e3ad4c500432" path="/var/lib/kubelet/pods/68adc8df-106b-4049-82b1-e3ad4c500432/volumes"
Apr 06 13:15:14 crc kubenswrapper[4790]: I0406 13:15:14.676364 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:15:14 crc kubenswrapper[4790]: E0406 13:15:14.677263 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:15:18 crc kubenswrapper[4790]: I0406 13:15:18.258872 4790 scope.go:117] "RemoveContainer" containerID="286661b6448b30627ccf03fb6fc4e3c8e9eda66d2a1b2e4d20379188cddee549"
Apr 06 13:15:28 crc kubenswrapper[4790]: I0406 13:15:28.676146 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:15:28 crc kubenswrapper[4790]: E0406 13:15:28.676819 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:15:42 crc kubenswrapper[4790]: I0406 13:15:42.676365 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:15:42 crc kubenswrapper[4790]: E0406 13:15:42.677368 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:15:57 crc kubenswrapper[4790]: I0406 13:15:57.676128 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:15:57 crc kubenswrapper[4790]: E0406 13:15:57.676939 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:16:00 crc kubenswrapper[4790]: I0406 13:16:00.154705 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591356-bsq8q"]
Apr 06 13:16:00 crc kubenswrapper[4790]: E0406 13:16:00.155517 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae07b18-f923-48bb-ba4a-2e01093a512d" containerName="collect-profiles"
Apr 06 13:16:00 crc kubenswrapper[4790]: I0406 13:16:00.155530 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae07b18-f923-48bb-ba4a-2e01093a512d" containerName="collect-profiles"
Apr 06 13:16:00 crc kubenswrapper[4790]: I0406 13:16:00.155753 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae07b18-f923-48bb-ba4a-2e01093a512d" containerName="collect-profiles"
Apr 06 13:16:00 crc kubenswrapper[4790]: I0406 13:16:00.156496 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591356-bsq8q"
Apr 06 13:16:00 crc kubenswrapper[4790]: I0406 13:16:00.158813 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6"
Apr 06 13:16:00 crc kubenswrapper[4790]: I0406 13:16:00.159154 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 06 13:16:00 crc kubenswrapper[4790]: I0406 13:16:00.169760 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 06 13:16:00 crc kubenswrapper[4790]: I0406 13:16:00.177225 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591356-bsq8q"]
Apr 06 13:16:00 crc kubenswrapper[4790]: I0406 13:16:00.320695 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn4gf\" (UniqueName: \"kubernetes.io/projected/4ca1e868-5bd3-4425-808f-161104870567-kube-api-access-sn4gf\") pod \"auto-csr-approver-29591356-bsq8q\" (UID: \"4ca1e868-5bd3-4425-808f-161104870567\") " pod="openshift-infra/auto-csr-approver-29591356-bsq8q"
Apr 06 13:16:00 crc kubenswrapper[4790]: I0406 13:16:00.435780 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn4gf\" (UniqueName: \"kubernetes.io/projected/4ca1e868-5bd3-4425-808f-161104870567-kube-api-access-sn4gf\") pod \"auto-csr-approver-29591356-bsq8q\" (UID: \"4ca1e868-5bd3-4425-808f-161104870567\") " pod="openshift-infra/auto-csr-approver-29591356-bsq8q"
Apr 06 13:16:00 crc kubenswrapper[4790]: I0406 13:16:00.461085 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn4gf\" (UniqueName: \"kubernetes.io/projected/4ca1e868-5bd3-4425-808f-161104870567-kube-api-access-sn4gf\") pod \"auto-csr-approver-29591356-bsq8q\" (UID: \"4ca1e868-5bd3-4425-808f-161104870567\") " pod="openshift-infra/auto-csr-approver-29591356-bsq8q"
Apr 06 13:16:00 crc kubenswrapper[4790]: I0406 13:16:00.483183 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591356-bsq8q"
Apr 06 13:16:00 crc kubenswrapper[4790]: I0406 13:16:00.941038 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591356-bsq8q"]
Apr 06 13:16:00 crc kubenswrapper[4790]: I0406 13:16:00.992129 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591356-bsq8q" event={"ID":"4ca1e868-5bd3-4425-808f-161104870567","Type":"ContainerStarted","Data":"3f87a04161b01e7298c909275a0ce3094ee21b048b7946ac4349b12c69b49c33"}
Apr 06 13:16:03 crc kubenswrapper[4790]: I0406 13:16:03.016354 4790 generic.go:334] "Generic (PLEG): container finished" podID="4ca1e868-5bd3-4425-808f-161104870567" containerID="31680429200abe423affbd10c74d74abacc16ec93c3a8d50ae05296440f272af" exitCode=0
Apr 06 13:16:03 crc kubenswrapper[4790]: I0406 13:16:03.016755 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591356-bsq8q" event={"ID":"4ca1e868-5bd3-4425-808f-161104870567","Type":"ContainerDied","Data":"31680429200abe423affbd10c74d74abacc16ec93c3a8d50ae05296440f272af"}
Apr 06 13:16:04 crc kubenswrapper[4790]: I0406 13:16:04.425517 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591356-bsq8q"
Apr 06 13:16:04 crc kubenswrapper[4790]: I0406 13:16:04.520420 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn4gf\" (UniqueName: \"kubernetes.io/projected/4ca1e868-5bd3-4425-808f-161104870567-kube-api-access-sn4gf\") pod \"4ca1e868-5bd3-4425-808f-161104870567\" (UID: \"4ca1e868-5bd3-4425-808f-161104870567\") "
Apr 06 13:16:04 crc kubenswrapper[4790]: I0406 13:16:04.532175 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca1e868-5bd3-4425-808f-161104870567-kube-api-access-sn4gf" (OuterVolumeSpecName: "kube-api-access-sn4gf") pod "4ca1e868-5bd3-4425-808f-161104870567" (UID: "4ca1e868-5bd3-4425-808f-161104870567"). InnerVolumeSpecName "kube-api-access-sn4gf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:16:04 crc kubenswrapper[4790]: I0406 13:16:04.622871 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn4gf\" (UniqueName: \"kubernetes.io/projected/4ca1e868-5bd3-4425-808f-161104870567-kube-api-access-sn4gf\") on node \"crc\" DevicePath \"\""
Apr 06 13:16:05 crc kubenswrapper[4790]: I0406 13:16:05.036645 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591356-bsq8q" event={"ID":"4ca1e868-5bd3-4425-808f-161104870567","Type":"ContainerDied","Data":"3f87a04161b01e7298c909275a0ce3094ee21b048b7946ac4349b12c69b49c33"}
Apr 06 13:16:05 crc kubenswrapper[4790]: I0406 13:16:05.036788 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f87a04161b01e7298c909275a0ce3094ee21b048b7946ac4349b12c69b49c33"
Apr 06 13:16:05 crc kubenswrapper[4790]: I0406 13:16:05.036706 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591356-bsq8q"
Apr 06 13:16:05 crc kubenswrapper[4790]: I0406 13:16:05.492205 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591350-mfz2m"]
Apr 06 13:16:05 crc kubenswrapper[4790]: I0406 13:16:05.503816 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591350-mfz2m"]
Apr 06 13:16:05 crc kubenswrapper[4790]: I0406 13:16:05.693559 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="482b36aa-c70d-4583-94a3-8cceeac309e5" path="/var/lib/kubelet/pods/482b36aa-c70d-4583-94a3-8cceeac309e5/volumes"
Apr 06 13:16:10 crc kubenswrapper[4790]: I0406 13:16:10.675681 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:16:10 crc kubenswrapper[4790]: E0406 13:16:10.677361 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:16:18 crc kubenswrapper[4790]: I0406 13:16:18.324788 4790 scope.go:117] "RemoveContainer" containerID="15a8cda7cd80cb4e48e1d3ca04cee5da5c682e7fec28dd4b004e1238dc5fe772"
Apr 06 13:16:25 crc kubenswrapper[4790]: I0406 13:16:25.676199 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:16:25 crc kubenswrapper[4790]: E0406 13:16:25.677272 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:16:39 crc kubenswrapper[4790]: I0406 13:16:39.687054 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:16:39 crc kubenswrapper[4790]: E0406 13:16:39.688011 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:16:51 crc kubenswrapper[4790]: I0406 13:16:51.675901 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:16:51 crc kubenswrapper[4790]: E0406 13:16:51.676568 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:17:04 crc kubenswrapper[4790]: I0406 13:17:04.676780 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:17:04 crc kubenswrapper[4790]: E0406 13:17:04.677595 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:17:16 crc kubenswrapper[4790]: I0406 13:17:16.675885 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:17:16 crc kubenswrapper[4790]: E0406 13:17:16.676659 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:17:29 crc kubenswrapper[4790]: I0406 13:17:29.676212 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:17:29 crc kubenswrapper[4790]: E0406 13:17:29.677050 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:17:40 crc kubenswrapper[4790]: I0406 13:17:40.676024 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:17:40 crc kubenswrapper[4790]: E0406 13:17:40.676822 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:17:53 crc kubenswrapper[4790]: I0406 13:17:53.676484 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:17:53 crc kubenswrapper[4790]: E0406 13:17:53.677294 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:18:00 crc kubenswrapper[4790]: I0406 13:18:00.158028 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591358-t7lc8"]
Apr 06 13:18:00 crc kubenswrapper[4790]: E0406 13:18:00.159787 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca1e868-5bd3-4425-808f-161104870567" containerName="oc"
Apr 06 13:18:00 crc kubenswrapper[4790]: I0406 13:18:00.159823 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca1e868-5bd3-4425-808f-161104870567" containerName="oc"
Apr 06 13:18:00 crc kubenswrapper[4790]: I0406 13:18:00.160413 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ca1e868-5bd3-4425-808f-161104870567" containerName="oc"
Apr 06 13:18:00 crc kubenswrapper[4790]: I0406 13:18:00.162047 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591358-t7lc8"
Apr 06 13:18:00 crc kubenswrapper[4790]: I0406 13:18:00.166751 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 06 13:18:00 crc kubenswrapper[4790]: I0406 13:18:00.166749 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 06 13:18:00 crc kubenswrapper[4790]: I0406 13:18:00.167454 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6"
Apr 06 13:18:00 crc kubenswrapper[4790]: I0406 13:18:00.170051 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591358-t7lc8"]
Apr 06 13:18:00 crc kubenswrapper[4790]: I0406 13:18:00.236245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrplp\" (UniqueName: \"kubernetes.io/projected/bff2494f-96ec-4396-9d68-32db623078e8-kube-api-access-nrplp\") pod \"auto-csr-approver-29591358-t7lc8\" (UID: \"bff2494f-96ec-4396-9d68-32db623078e8\") " pod="openshift-infra/auto-csr-approver-29591358-t7lc8"
Apr 06 13:18:00 crc kubenswrapper[4790]: I0406 13:18:00.339274 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrplp\" (UniqueName: \"kubernetes.io/projected/bff2494f-96ec-4396-9d68-32db623078e8-kube-api-access-nrplp\") pod \"auto-csr-approver-29591358-t7lc8\" (UID: \"bff2494f-96ec-4396-9d68-32db623078e8\") " pod="openshift-infra/auto-csr-approver-29591358-t7lc8"
Apr 06 13:18:00 crc kubenswrapper[4790]: I0406 13:18:00.369376 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrplp\" (UniqueName: \"kubernetes.io/projected/bff2494f-96ec-4396-9d68-32db623078e8-kube-api-access-nrplp\") pod \"auto-csr-approver-29591358-t7lc8\" (UID: \"bff2494f-96ec-4396-9d68-32db623078e8\") " pod="openshift-infra/auto-csr-approver-29591358-t7lc8"
Apr 06 13:18:00 crc kubenswrapper[4790]: I0406 13:18:00.505461 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591358-t7lc8"
Apr 06 13:18:01 crc kubenswrapper[4790]: I0406 13:18:01.095336 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591358-t7lc8"]
Apr 06 13:18:01 crc kubenswrapper[4790]: I0406 13:18:01.141935 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591358-t7lc8" event={"ID":"bff2494f-96ec-4396-9d68-32db623078e8","Type":"ContainerStarted","Data":"5cf2d2706f89553e4004287c06b58f3c415d04fe4e1b55b0e42d75af8aca4165"}
Apr 06 13:18:02 crc kubenswrapper[4790]: I0406 13:18:02.154480 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591358-t7lc8" event={"ID":"bff2494f-96ec-4396-9d68-32db623078e8","Type":"ContainerStarted","Data":"76d4cf2ffef5709a80b033be7f5ff860b8106ddf6abbf8c410d2485a4ed7a710"}
Apr 06 13:18:02 crc kubenswrapper[4790]: I0406 13:18:02.175140 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591358-t7lc8" podStartSLOduration=1.396881021 podStartE2EDuration="2.175092389s" podCreationTimestamp="2026-04-06 13:18:00 +0000 UTC" firstStartedPulling="2026-04-06 13:18:01.097013658 +0000 UTC m=+4860.084756524" lastFinishedPulling="2026-04-06 13:18:01.875225026 +0000 UTC m=+4860.862967892" observedRunningTime="2026-04-06 13:18:02.168982926 +0000 UTC m=+4861.156725792" watchObservedRunningTime="2026-04-06 13:18:02.175092389 +0000 UTC m=+4861.162835265"
Apr 06 13:18:03 crc kubenswrapper[4790]: I0406 13:18:03.165778 4790 generic.go:334] "Generic (PLEG): container finished" podID="bff2494f-96ec-4396-9d68-32db623078e8" containerID="76d4cf2ffef5709a80b033be7f5ff860b8106ddf6abbf8c410d2485a4ed7a710" exitCode=0
Apr 06 13:18:03 crc kubenswrapper[4790]: I0406 13:18:03.165846 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591358-t7lc8" event={"ID":"bff2494f-96ec-4396-9d68-32db623078e8","Type":"ContainerDied","Data":"76d4cf2ffef5709a80b033be7f5ff860b8106ddf6abbf8c410d2485a4ed7a710"}
Apr 06 13:18:05 crc kubenswrapper[4790]: I0406 13:18:05.066919 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591358-t7lc8"
Apr 06 13:18:05 crc kubenswrapper[4790]: I0406 13:18:05.132883 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrplp\" (UniqueName: \"kubernetes.io/projected/bff2494f-96ec-4396-9d68-32db623078e8-kube-api-access-nrplp\") pod \"bff2494f-96ec-4396-9d68-32db623078e8\" (UID: \"bff2494f-96ec-4396-9d68-32db623078e8\") "
Apr 06 13:18:05 crc kubenswrapper[4790]: I0406 13:18:05.154165 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bff2494f-96ec-4396-9d68-32db623078e8-kube-api-access-nrplp" (OuterVolumeSpecName: "kube-api-access-nrplp") pod "bff2494f-96ec-4396-9d68-32db623078e8" (UID: "bff2494f-96ec-4396-9d68-32db623078e8"). InnerVolumeSpecName "kube-api-access-nrplp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:18:05 crc kubenswrapper[4790]: I0406 13:18:05.188542 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591358-t7lc8" event={"ID":"bff2494f-96ec-4396-9d68-32db623078e8","Type":"ContainerDied","Data":"5cf2d2706f89553e4004287c06b58f3c415d04fe4e1b55b0e42d75af8aca4165"}
Apr 06 13:18:05 crc kubenswrapper[4790]: I0406 13:18:05.188585 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cf2d2706f89553e4004287c06b58f3c415d04fe4e1b55b0e42d75af8aca4165"
Apr 06 13:18:05 crc kubenswrapper[4790]: I0406 13:18:05.188647 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591358-t7lc8"
Apr 06 13:18:05 crc kubenswrapper[4790]: I0406 13:18:05.235530 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrplp\" (UniqueName: \"kubernetes.io/projected/bff2494f-96ec-4396-9d68-32db623078e8-kube-api-access-nrplp\") on node \"crc\" DevicePath \"\""
Apr 06 13:18:06 crc kubenswrapper[4790]: I0406 13:18:06.161593 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591352-phnkb"]
Apr 06 13:18:06 crc kubenswrapper[4790]: I0406 13:18:06.181217 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591352-phnkb"]
Apr 06 13:18:07 crc kubenswrapper[4790]: I0406 13:18:07.676434 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270"
Apr 06 13:18:07 crc kubenswrapper[4790]: E0406 13:18:07.677229 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:18:07 crc kubenswrapper[4790]: I0406 13:18:07.689157 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db2b0c38-6596-44b6-8955-75b3aa8455d5" path="/var/lib/kubelet/pods/db2b0c38-6596-44b6-8955-75b3aa8455d5/volumes"
Apr 06 13:18:17 crc kubenswrapper[4790]: I0406 13:18:17.470532 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f45s4"]
Apr 06 13:18:17 crc kubenswrapper[4790]: E0406 13:18:17.471488 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bff2494f-96ec-4396-9d68-32db623078e8" containerName="oc"
Apr 06 13:18:17 crc kubenswrapper[4790]: I0406 13:18:17.471500 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff2494f-96ec-4396-9d68-32db623078e8" containerName="oc"
Apr 06 13:18:17 crc kubenswrapper[4790]: I0406 13:18:17.471683 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="bff2494f-96ec-4396-9d68-32db623078e8" containerName="oc"
Apr 06 13:18:17 crc kubenswrapper[4790]: I0406 13:18:17.473120 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f45s4"
Apr 06 13:18:17 crc kubenswrapper[4790]: I0406 13:18:17.490620 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f45s4"]
Apr 06 13:18:17 crc kubenswrapper[4790]: I0406 13:18:17.600550 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-utilities\") pod \"community-operators-f45s4\" (UID: \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\") " pod="openshift-marketplace/community-operators-f45s4"
Apr 06 13:18:17 crc kubenswrapper[4790]: I0406 13:18:17.600681 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vnx8\" (UniqueName: \"kubernetes.io/projected/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-kube-api-access-8vnx8\") pod \"community-operators-f45s4\" (UID: \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\") " pod="openshift-marketplace/community-operators-f45s4"
Apr 06 13:18:17 crc kubenswrapper[4790]: I0406 13:18:17.600720 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-catalog-content\") pod \"community-operators-f45s4\" (UID: \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\") " pod="openshift-marketplace/community-operators-f45s4"
Apr 06 13:18:17 crc 
kubenswrapper[4790]: I0406 13:18:17.702633 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-utilities\") pod \"community-operators-f45s4\" (UID: \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\") " pod="openshift-marketplace/community-operators-f45s4" Apr 06 13:18:17 crc kubenswrapper[4790]: I0406 13:18:17.702809 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vnx8\" (UniqueName: \"kubernetes.io/projected/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-kube-api-access-8vnx8\") pod \"community-operators-f45s4\" (UID: \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\") " pod="openshift-marketplace/community-operators-f45s4" Apr 06 13:18:17 crc kubenswrapper[4790]: I0406 13:18:17.702893 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-catalog-content\") pod \"community-operators-f45s4\" (UID: \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\") " pod="openshift-marketplace/community-operators-f45s4" Apr 06 13:18:17 crc kubenswrapper[4790]: I0406 13:18:17.703337 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-catalog-content\") pod \"community-operators-f45s4\" (UID: \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\") " pod="openshift-marketplace/community-operators-f45s4" Apr 06 13:18:17 crc kubenswrapper[4790]: I0406 13:18:17.703555 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-utilities\") pod \"community-operators-f45s4\" (UID: \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\") " pod="openshift-marketplace/community-operators-f45s4" Apr 06 13:18:17 crc kubenswrapper[4790]: I0406 13:18:17.728726 
4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vnx8\" (UniqueName: \"kubernetes.io/projected/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-kube-api-access-8vnx8\") pod \"community-operators-f45s4\" (UID: \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\") " pod="openshift-marketplace/community-operators-f45s4" Apr 06 13:18:17 crc kubenswrapper[4790]: I0406 13:18:17.793203 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f45s4" Apr 06 13:18:18 crc kubenswrapper[4790]: I0406 13:18:18.423235 4790 scope.go:117] "RemoveContainer" containerID="9a07aca31f69a0b055140458e368808d34688b32ee86886b877d7a4c13c308e5" Apr 06 13:18:18 crc kubenswrapper[4790]: I0406 13:18:18.676622 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270" Apr 06 13:18:18 crc kubenswrapper[4790]: E0406 13:18:18.676901 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:18:18 crc kubenswrapper[4790]: I0406 13:18:18.719442 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f45s4"] Apr 06 13:18:19 crc kubenswrapper[4790]: I0406 13:18:19.356575 4790 generic.go:334] "Generic (PLEG): container finished" podID="47a8072c-f273-4d6f-b072-2cde9c3ceb2e" containerID="eb04b70ffebf852e181bbddbebb7a0c5c0b001f006e9f907f68c458c13d1920c" exitCode=0 Apr 06 13:18:19 crc kubenswrapper[4790]: I0406 13:18:19.356661 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f45s4" 
event={"ID":"47a8072c-f273-4d6f-b072-2cde9c3ceb2e","Type":"ContainerDied","Data":"eb04b70ffebf852e181bbddbebb7a0c5c0b001f006e9f907f68c458c13d1920c"} Apr 06 13:18:19 crc kubenswrapper[4790]: I0406 13:18:19.356916 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f45s4" event={"ID":"47a8072c-f273-4d6f-b072-2cde9c3ceb2e","Type":"ContainerStarted","Data":"cb955eb5691559f186a2c89226defb5127d97a39de808a4222d0a05674e4b363"} Apr 06 13:18:19 crc kubenswrapper[4790]: I0406 13:18:19.394240 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 13:18:20 crc kubenswrapper[4790]: I0406 13:18:20.367142 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f45s4" event={"ID":"47a8072c-f273-4d6f-b072-2cde9c3ceb2e","Type":"ContainerStarted","Data":"fc3f15325e09c679338c24ce1e6ede2149ee9c22bc6fc7afb09a67d6cd38fd1f"} Apr 06 13:18:21 crc kubenswrapper[4790]: I0406 13:18:21.383500 4790 generic.go:334] "Generic (PLEG): container finished" podID="47a8072c-f273-4d6f-b072-2cde9c3ceb2e" containerID="fc3f15325e09c679338c24ce1e6ede2149ee9c22bc6fc7afb09a67d6cd38fd1f" exitCode=0 Apr 06 13:18:21 crc kubenswrapper[4790]: I0406 13:18:21.383608 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f45s4" event={"ID":"47a8072c-f273-4d6f-b072-2cde9c3ceb2e","Type":"ContainerDied","Data":"fc3f15325e09c679338c24ce1e6ede2149ee9c22bc6fc7afb09a67d6cd38fd1f"} Apr 06 13:18:22 crc kubenswrapper[4790]: I0406 13:18:22.400281 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f45s4" event={"ID":"47a8072c-f273-4d6f-b072-2cde9c3ceb2e","Type":"ContainerStarted","Data":"27edc2e70d6053d7643918af902548116a09ed30a71bfc7ca2dbd136319a86a6"} Apr 06 13:18:22 crc kubenswrapper[4790]: I0406 13:18:22.434015 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-f45s4" podStartSLOduration=2.934768566 podStartE2EDuration="5.433988596s" podCreationTimestamp="2026-04-06 13:18:17 +0000 UTC" firstStartedPulling="2026-04-06 13:18:19.393968623 +0000 UTC m=+4878.381711499" lastFinishedPulling="2026-04-06 13:18:21.893188673 +0000 UTC m=+4880.880931529" observedRunningTime="2026-04-06 13:18:22.423607869 +0000 UTC m=+4881.411350745" watchObservedRunningTime="2026-04-06 13:18:22.433988596 +0000 UTC m=+4881.421731482" Apr 06 13:18:27 crc kubenswrapper[4790]: I0406 13:18:27.794354 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f45s4" Apr 06 13:18:27 crc kubenswrapper[4790]: I0406 13:18:27.795317 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f45s4" Apr 06 13:18:27 crc kubenswrapper[4790]: I0406 13:18:27.865148 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f45s4" Apr 06 13:18:28 crc kubenswrapper[4790]: I0406 13:18:28.505814 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f45s4" Apr 06 13:18:28 crc kubenswrapper[4790]: I0406 13:18:28.561906 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f45s4"] Apr 06 13:18:30 crc kubenswrapper[4790]: I0406 13:18:30.477686 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f45s4" podUID="47a8072c-f273-4d6f-b072-2cde9c3ceb2e" containerName="registry-server" containerID="cri-o://27edc2e70d6053d7643918af902548116a09ed30a71bfc7ca2dbd136319a86a6" gracePeriod=2 Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.004250 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f45s4" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.079772 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-catalog-content\") pod \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\" (UID: \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\") " Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.080052 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vnx8\" (UniqueName: \"kubernetes.io/projected/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-kube-api-access-8vnx8\") pod \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\" (UID: \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\") " Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.080149 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-utilities\") pod \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\" (UID: \"47a8072c-f273-4d6f-b072-2cde9c3ceb2e\") " Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.081430 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-utilities" (OuterVolumeSpecName: "utilities") pod "47a8072c-f273-4d6f-b072-2cde9c3ceb2e" (UID: "47a8072c-f273-4d6f-b072-2cde9c3ceb2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.086445 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-kube-api-access-8vnx8" (OuterVolumeSpecName: "kube-api-access-8vnx8") pod "47a8072c-f273-4d6f-b072-2cde9c3ceb2e" (UID: "47a8072c-f273-4d6f-b072-2cde9c3ceb2e"). InnerVolumeSpecName "kube-api-access-8vnx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.182499 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vnx8\" (UniqueName: \"kubernetes.io/projected/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-kube-api-access-8vnx8\") on node \"crc\" DevicePath \"\"" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.182538 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.229867 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47a8072c-f273-4d6f-b072-2cde9c3ceb2e" (UID: "47a8072c-f273-4d6f-b072-2cde9c3ceb2e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.284599 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47a8072c-f273-4d6f-b072-2cde9c3ceb2e-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.487799 4790 generic.go:334] "Generic (PLEG): container finished" podID="47a8072c-f273-4d6f-b072-2cde9c3ceb2e" containerID="27edc2e70d6053d7643918af902548116a09ed30a71bfc7ca2dbd136319a86a6" exitCode=0 Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.487907 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f45s4" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.487889 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f45s4" event={"ID":"47a8072c-f273-4d6f-b072-2cde9c3ceb2e","Type":"ContainerDied","Data":"27edc2e70d6053d7643918af902548116a09ed30a71bfc7ca2dbd136319a86a6"} Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.488034 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f45s4" event={"ID":"47a8072c-f273-4d6f-b072-2cde9c3ceb2e","Type":"ContainerDied","Data":"cb955eb5691559f186a2c89226defb5127d97a39de808a4222d0a05674e4b363"} Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.488058 4790 scope.go:117] "RemoveContainer" containerID="27edc2e70d6053d7643918af902548116a09ed30a71bfc7ca2dbd136319a86a6" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.508823 4790 scope.go:117] "RemoveContainer" containerID="fc3f15325e09c679338c24ce1e6ede2149ee9c22bc6fc7afb09a67d6cd38fd1f" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.527934 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f45s4"] Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.538958 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f45s4"] Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.549138 4790 scope.go:117] "RemoveContainer" containerID="eb04b70ffebf852e181bbddbebb7a0c5c0b001f006e9f907f68c458c13d1920c" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.610945 4790 scope.go:117] "RemoveContainer" containerID="27edc2e70d6053d7643918af902548116a09ed30a71bfc7ca2dbd136319a86a6" Apr 06 13:18:31 crc kubenswrapper[4790]: E0406 13:18:31.611515 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"27edc2e70d6053d7643918af902548116a09ed30a71bfc7ca2dbd136319a86a6\": container with ID starting with 27edc2e70d6053d7643918af902548116a09ed30a71bfc7ca2dbd136319a86a6 not found: ID does not exist" containerID="27edc2e70d6053d7643918af902548116a09ed30a71bfc7ca2dbd136319a86a6" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.611557 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27edc2e70d6053d7643918af902548116a09ed30a71bfc7ca2dbd136319a86a6"} err="failed to get container status \"27edc2e70d6053d7643918af902548116a09ed30a71bfc7ca2dbd136319a86a6\": rpc error: code = NotFound desc = could not find container \"27edc2e70d6053d7643918af902548116a09ed30a71bfc7ca2dbd136319a86a6\": container with ID starting with 27edc2e70d6053d7643918af902548116a09ed30a71bfc7ca2dbd136319a86a6 not found: ID does not exist" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.611580 4790 scope.go:117] "RemoveContainer" containerID="fc3f15325e09c679338c24ce1e6ede2149ee9c22bc6fc7afb09a67d6cd38fd1f" Apr 06 13:18:31 crc kubenswrapper[4790]: E0406 13:18:31.611837 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc3f15325e09c679338c24ce1e6ede2149ee9c22bc6fc7afb09a67d6cd38fd1f\": container with ID starting with fc3f15325e09c679338c24ce1e6ede2149ee9c22bc6fc7afb09a67d6cd38fd1f not found: ID does not exist" containerID="fc3f15325e09c679338c24ce1e6ede2149ee9c22bc6fc7afb09a67d6cd38fd1f" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.611863 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc3f15325e09c679338c24ce1e6ede2149ee9c22bc6fc7afb09a67d6cd38fd1f"} err="failed to get container status \"fc3f15325e09c679338c24ce1e6ede2149ee9c22bc6fc7afb09a67d6cd38fd1f\": rpc error: code = NotFound desc = could not find container \"fc3f15325e09c679338c24ce1e6ede2149ee9c22bc6fc7afb09a67d6cd38fd1f\": container with ID 
starting with fc3f15325e09c679338c24ce1e6ede2149ee9c22bc6fc7afb09a67d6cd38fd1f not found: ID does not exist" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.611879 4790 scope.go:117] "RemoveContainer" containerID="eb04b70ffebf852e181bbddbebb7a0c5c0b001f006e9f907f68c458c13d1920c" Apr 06 13:18:31 crc kubenswrapper[4790]: E0406 13:18:31.612132 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb04b70ffebf852e181bbddbebb7a0c5c0b001f006e9f907f68c458c13d1920c\": container with ID starting with eb04b70ffebf852e181bbddbebb7a0c5c0b001f006e9f907f68c458c13d1920c not found: ID does not exist" containerID="eb04b70ffebf852e181bbddbebb7a0c5c0b001f006e9f907f68c458c13d1920c" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.612158 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb04b70ffebf852e181bbddbebb7a0c5c0b001f006e9f907f68c458c13d1920c"} err="failed to get container status \"eb04b70ffebf852e181bbddbebb7a0c5c0b001f006e9f907f68c458c13d1920c\": rpc error: code = NotFound desc = could not find container \"eb04b70ffebf852e181bbddbebb7a0c5c0b001f006e9f907f68c458c13d1920c\": container with ID starting with eb04b70ffebf852e181bbddbebb7a0c5c0b001f006e9f907f68c458c13d1920c not found: ID does not exist" Apr 06 13:18:31 crc kubenswrapper[4790]: I0406 13:18:31.692409 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47a8072c-f273-4d6f-b072-2cde9c3ceb2e" path="/var/lib/kubelet/pods/47a8072c-f273-4d6f-b072-2cde9c3ceb2e/volumes" Apr 06 13:18:33 crc kubenswrapper[4790]: I0406 13:18:33.675823 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270" Apr 06 13:18:33 crc kubenswrapper[4790]: E0406 13:18:33.677068 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:18:46 crc kubenswrapper[4790]: I0406 13:18:46.675339 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270" Apr 06 13:18:47 crc kubenswrapper[4790]: I0406 13:18:47.664035 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"720d9e1a3241104692f4a3dbe53675afaae355d3fe7f18fc2fc569744fe32aef"} Apr 06 13:20:00 crc kubenswrapper[4790]: I0406 13:20:00.141587 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591360-qs6lq"] Apr 06 13:20:00 crc kubenswrapper[4790]: E0406 13:20:00.142534 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a8072c-f273-4d6f-b072-2cde9c3ceb2e" containerName="extract-utilities" Apr 06 13:20:00 crc kubenswrapper[4790]: I0406 13:20:00.142545 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a8072c-f273-4d6f-b072-2cde9c3ceb2e" containerName="extract-utilities" Apr 06 13:20:00 crc kubenswrapper[4790]: E0406 13:20:00.142562 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a8072c-f273-4d6f-b072-2cde9c3ceb2e" containerName="extract-content" Apr 06 13:20:00 crc kubenswrapper[4790]: I0406 13:20:00.142568 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a8072c-f273-4d6f-b072-2cde9c3ceb2e" containerName="extract-content" Apr 06 13:20:00 crc kubenswrapper[4790]: E0406 13:20:00.142588 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a8072c-f273-4d6f-b072-2cde9c3ceb2e" containerName="registry-server" Apr 06 13:20:00 crc 
kubenswrapper[4790]: I0406 13:20:00.142594 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a8072c-f273-4d6f-b072-2cde9c3ceb2e" containerName="registry-server" Apr 06 13:20:00 crc kubenswrapper[4790]: I0406 13:20:00.142800 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="47a8072c-f273-4d6f-b072-2cde9c3ceb2e" containerName="registry-server" Apr 06 13:20:00 crc kubenswrapper[4790]: I0406 13:20:00.143488 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591360-qs6lq" Apr 06 13:20:00 crc kubenswrapper[4790]: I0406 13:20:00.145817 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:20:00 crc kubenswrapper[4790]: I0406 13:20:00.146861 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:20:00 crc kubenswrapper[4790]: I0406 13:20:00.147028 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:20:00 crc kubenswrapper[4790]: I0406 13:20:00.151293 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591360-qs6lq"] Apr 06 13:20:00 crc kubenswrapper[4790]: I0406 13:20:00.229581 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2n7x\" (UniqueName: \"kubernetes.io/projected/33de3a88-1fc3-4824-a7e7-e3f7019ac216-kube-api-access-f2n7x\") pod \"auto-csr-approver-29591360-qs6lq\" (UID: \"33de3a88-1fc3-4824-a7e7-e3f7019ac216\") " pod="openshift-infra/auto-csr-approver-29591360-qs6lq" Apr 06 13:20:00 crc kubenswrapper[4790]: I0406 13:20:00.331304 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2n7x\" (UniqueName: \"kubernetes.io/projected/33de3a88-1fc3-4824-a7e7-e3f7019ac216-kube-api-access-f2n7x\") pod 
\"auto-csr-approver-29591360-qs6lq\" (UID: \"33de3a88-1fc3-4824-a7e7-e3f7019ac216\") " pod="openshift-infra/auto-csr-approver-29591360-qs6lq" Apr 06 13:20:00 crc kubenswrapper[4790]: I0406 13:20:00.355661 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2n7x\" (UniqueName: \"kubernetes.io/projected/33de3a88-1fc3-4824-a7e7-e3f7019ac216-kube-api-access-f2n7x\") pod \"auto-csr-approver-29591360-qs6lq\" (UID: \"33de3a88-1fc3-4824-a7e7-e3f7019ac216\") " pod="openshift-infra/auto-csr-approver-29591360-qs6lq" Apr 06 13:20:00 crc kubenswrapper[4790]: I0406 13:20:00.503738 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591360-qs6lq" Apr 06 13:20:00 crc kubenswrapper[4790]: I0406 13:20:00.919158 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591360-qs6lq"] Apr 06 13:20:01 crc kubenswrapper[4790]: I0406 13:20:01.400635 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591360-qs6lq" event={"ID":"33de3a88-1fc3-4824-a7e7-e3f7019ac216","Type":"ContainerStarted","Data":"0dab94556a2c430c3d758cc6df63e8e25be557af537dcd59ce84283112fd8195"} Apr 06 13:20:02 crc kubenswrapper[4790]: I0406 13:20:02.416437 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591360-qs6lq" event={"ID":"33de3a88-1fc3-4824-a7e7-e3f7019ac216","Type":"ContainerStarted","Data":"fc0475e7dceaf3f821110ae00fa5833c8230550ad28b77c217c5fe0b6a93aedd"} Apr 06 13:20:02 crc kubenswrapper[4790]: I0406 13:20:02.433913 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591360-qs6lq" podStartSLOduration=1.3955234380000001 podStartE2EDuration="2.43389592s" podCreationTimestamp="2026-04-06 13:20:00 +0000 UTC" firstStartedPulling="2026-04-06 13:20:00.937761011 +0000 UTC m=+4979.925503877" lastFinishedPulling="2026-04-06 
13:20:01.976133503 +0000 UTC m=+4980.963876359" observedRunningTime="2026-04-06 13:20:02.432119773 +0000 UTC m=+4981.419862639" watchObservedRunningTime="2026-04-06 13:20:02.43389592 +0000 UTC m=+4981.421638796" Apr 06 13:20:03 crc kubenswrapper[4790]: I0406 13:20:03.428360 4790 generic.go:334] "Generic (PLEG): container finished" podID="33de3a88-1fc3-4824-a7e7-e3f7019ac216" containerID="fc0475e7dceaf3f821110ae00fa5833c8230550ad28b77c217c5fe0b6a93aedd" exitCode=0 Apr 06 13:20:03 crc kubenswrapper[4790]: I0406 13:20:03.428408 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591360-qs6lq" event={"ID":"33de3a88-1fc3-4824-a7e7-e3f7019ac216","Type":"ContainerDied","Data":"fc0475e7dceaf3f821110ae00fa5833c8230550ad28b77c217c5fe0b6a93aedd"} Apr 06 13:20:04 crc kubenswrapper[4790]: I0406 13:20:04.822728 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591360-qs6lq" Apr 06 13:20:04 crc kubenswrapper[4790]: I0406 13:20:04.935949 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2n7x\" (UniqueName: \"kubernetes.io/projected/33de3a88-1fc3-4824-a7e7-e3f7019ac216-kube-api-access-f2n7x\") pod \"33de3a88-1fc3-4824-a7e7-e3f7019ac216\" (UID: \"33de3a88-1fc3-4824-a7e7-e3f7019ac216\") " Apr 06 13:20:04 crc kubenswrapper[4790]: I0406 13:20:04.942773 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33de3a88-1fc3-4824-a7e7-e3f7019ac216-kube-api-access-f2n7x" (OuterVolumeSpecName: "kube-api-access-f2n7x") pod "33de3a88-1fc3-4824-a7e7-e3f7019ac216" (UID: "33de3a88-1fc3-4824-a7e7-e3f7019ac216"). InnerVolumeSpecName "kube-api-access-f2n7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:20:05 crc kubenswrapper[4790]: I0406 13:20:05.038001 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2n7x\" (UniqueName: \"kubernetes.io/projected/33de3a88-1fc3-4824-a7e7-e3f7019ac216-kube-api-access-f2n7x\") on node \"crc\" DevicePath \"\"" Apr 06 13:20:05 crc kubenswrapper[4790]: I0406 13:20:05.455040 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591360-qs6lq" event={"ID":"33de3a88-1fc3-4824-a7e7-e3f7019ac216","Type":"ContainerDied","Data":"0dab94556a2c430c3d758cc6df63e8e25be557af537dcd59ce84283112fd8195"} Apr 06 13:20:05 crc kubenswrapper[4790]: I0406 13:20:05.455322 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dab94556a2c430c3d758cc6df63e8e25be557af537dcd59ce84283112fd8195" Apr 06 13:20:05 crc kubenswrapper[4790]: I0406 13:20:05.455099 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591360-qs6lq" Apr 06 13:20:05 crc kubenswrapper[4790]: I0406 13:20:05.915303 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591354-4vfkm"] Apr 06 13:20:05 crc kubenswrapper[4790]: I0406 13:20:05.936224 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591354-4vfkm"] Apr 06 13:20:07 crc kubenswrapper[4790]: I0406 13:20:07.687309 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e9230d8-d435-4c19-80b5-c8f3cfb6b793" path="/var/lib/kubelet/pods/8e9230d8-d435-4c19-80b5-c8f3cfb6b793/volumes" Apr 06 13:20:18 crc kubenswrapper[4790]: I0406 13:20:18.724981 4790 scope.go:117] "RemoveContainer" containerID="a571c5ec1f605810ac1babd84f729f6ac5217facaabc7e8af1c7262bcc764da0" Apr 06 13:21:09 crc kubenswrapper[4790]: I0406 13:21:09.753115 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:21:09 crc kubenswrapper[4790]: I0406 13:21:09.753640 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:21:39 crc kubenswrapper[4790]: I0406 13:21:39.753520 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:21:39 crc kubenswrapper[4790]: I0406 13:21:39.754219 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:22:00 crc kubenswrapper[4790]: I0406 13:22:00.140015 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591362-8bc7c"] Apr 06 13:22:00 crc kubenswrapper[4790]: E0406 13:22:00.141974 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33de3a88-1fc3-4824-a7e7-e3f7019ac216" containerName="oc" Apr 06 13:22:00 crc kubenswrapper[4790]: I0406 13:22:00.142053 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="33de3a88-1fc3-4824-a7e7-e3f7019ac216" containerName="oc" Apr 06 13:22:00 crc kubenswrapper[4790]: I0406 13:22:00.142335 4790 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="33de3a88-1fc3-4824-a7e7-e3f7019ac216" containerName="oc" Apr 06 13:22:00 crc kubenswrapper[4790]: I0406 13:22:00.143099 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591362-8bc7c" Apr 06 13:22:00 crc kubenswrapper[4790]: I0406 13:22:00.145300 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:22:00 crc kubenswrapper[4790]: I0406 13:22:00.145503 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:22:00 crc kubenswrapper[4790]: I0406 13:22:00.145795 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:22:00 crc kubenswrapper[4790]: I0406 13:22:00.150559 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591362-8bc7c"] Apr 06 13:22:00 crc kubenswrapper[4790]: I0406 13:22:00.287416 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmln9\" (UniqueName: \"kubernetes.io/projected/e9b3d820-e63f-4c97-98df-f9f597180d92-kube-api-access-rmln9\") pod \"auto-csr-approver-29591362-8bc7c\" (UID: \"e9b3d820-e63f-4c97-98df-f9f597180d92\") " pod="openshift-infra/auto-csr-approver-29591362-8bc7c" Apr 06 13:22:00 crc kubenswrapper[4790]: I0406 13:22:00.389485 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmln9\" (UniqueName: \"kubernetes.io/projected/e9b3d820-e63f-4c97-98df-f9f597180d92-kube-api-access-rmln9\") pod \"auto-csr-approver-29591362-8bc7c\" (UID: \"e9b3d820-e63f-4c97-98df-f9f597180d92\") " pod="openshift-infra/auto-csr-approver-29591362-8bc7c" Apr 06 13:22:00 crc kubenswrapper[4790]: I0406 13:22:00.436794 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rmln9\" (UniqueName: \"kubernetes.io/projected/e9b3d820-e63f-4c97-98df-f9f597180d92-kube-api-access-rmln9\") pod \"auto-csr-approver-29591362-8bc7c\" (UID: \"e9b3d820-e63f-4c97-98df-f9f597180d92\") " pod="openshift-infra/auto-csr-approver-29591362-8bc7c" Apr 06 13:22:00 crc kubenswrapper[4790]: I0406 13:22:00.462365 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591362-8bc7c" Apr 06 13:22:00 crc kubenswrapper[4790]: I0406 13:22:00.920073 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591362-8bc7c"] Apr 06 13:22:01 crc kubenswrapper[4790]: I0406 13:22:01.632749 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591362-8bc7c" event={"ID":"e9b3d820-e63f-4c97-98df-f9f597180d92","Type":"ContainerStarted","Data":"05ce18d6e1075199a930043051dca8b82f9ab5a90c97838e0a5251b730a35df4"} Apr 06 13:22:03 crc kubenswrapper[4790]: I0406 13:22:03.651926 4790 generic.go:334] "Generic (PLEG): container finished" podID="e9b3d820-e63f-4c97-98df-f9f597180d92" containerID="e56ab918aac3c5b2a1e08e7bbc4b90bc9f2db0149b3bdd65d01608ea3bbfe146" exitCode=0 Apr 06 13:22:03 crc kubenswrapper[4790]: I0406 13:22:03.652025 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591362-8bc7c" event={"ID":"e9b3d820-e63f-4c97-98df-f9f597180d92","Type":"ContainerDied","Data":"e56ab918aac3c5b2a1e08e7bbc4b90bc9f2db0149b3bdd65d01608ea3bbfe146"} Apr 06 13:22:05 crc kubenswrapper[4790]: I0406 13:22:05.695223 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591362-8bc7c" event={"ID":"e9b3d820-e63f-4c97-98df-f9f597180d92","Type":"ContainerDied","Data":"05ce18d6e1075199a930043051dca8b82f9ab5a90c97838e0a5251b730a35df4"} Apr 06 13:22:05 crc kubenswrapper[4790]: I0406 13:22:05.696031 4790 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="05ce18d6e1075199a930043051dca8b82f9ab5a90c97838e0a5251b730a35df4" Apr 06 13:22:05 crc kubenswrapper[4790]: I0406 13:22:05.750041 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591362-8bc7c" Apr 06 13:22:05 crc kubenswrapper[4790]: I0406 13:22:05.902331 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmln9\" (UniqueName: \"kubernetes.io/projected/e9b3d820-e63f-4c97-98df-f9f597180d92-kube-api-access-rmln9\") pod \"e9b3d820-e63f-4c97-98df-f9f597180d92\" (UID: \"e9b3d820-e63f-4c97-98df-f9f597180d92\") " Apr 06 13:22:05 crc kubenswrapper[4790]: I0406 13:22:05.914099 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9b3d820-e63f-4c97-98df-f9f597180d92-kube-api-access-rmln9" (OuterVolumeSpecName: "kube-api-access-rmln9") pod "e9b3d820-e63f-4c97-98df-f9f597180d92" (UID: "e9b3d820-e63f-4c97-98df-f9f597180d92"). InnerVolumeSpecName "kube-api-access-rmln9". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:22:06 crc kubenswrapper[4790]: I0406 13:22:06.004444 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmln9\" (UniqueName: \"kubernetes.io/projected/e9b3d820-e63f-4c97-98df-f9f597180d92-kube-api-access-rmln9\") on node \"crc\" DevicePath \"\"" Apr 06 13:22:06 crc kubenswrapper[4790]: I0406 13:22:06.704889 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591362-8bc7c" Apr 06 13:22:06 crc kubenswrapper[4790]: I0406 13:22:06.823477 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591356-bsq8q"] Apr 06 13:22:06 crc kubenswrapper[4790]: I0406 13:22:06.833570 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591356-bsq8q"] Apr 06 13:22:07 crc kubenswrapper[4790]: I0406 13:22:07.688244 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ca1e868-5bd3-4425-808f-161104870567" path="/var/lib/kubelet/pods/4ca1e868-5bd3-4425-808f-161104870567/volumes" Apr 06 13:22:10 crc kubenswrapper[4790]: I0406 13:22:10.179695 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:22:10 crc kubenswrapper[4790]: I0406 13:22:10.180237 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:22:10 crc kubenswrapper[4790]: I0406 13:22:10.226318 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 13:22:10 crc kubenswrapper[4790]: I0406 13:22:10.227249 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"720d9e1a3241104692f4a3dbe53675afaae355d3fe7f18fc2fc569744fe32aef"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 13:22:10 crc kubenswrapper[4790]: I0406 13:22:10.227317 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://720d9e1a3241104692f4a3dbe53675afaae355d3fe7f18fc2fc569744fe32aef" gracePeriod=600 Apr 06 13:22:10 crc kubenswrapper[4790]: E0406 13:22:10.500191 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f5e33f8_0490_4219_8c40_526903de8e6f.slice/crio-720d9e1a3241104692f4a3dbe53675afaae355d3fe7f18fc2fc569744fe32aef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f5e33f8_0490_4219_8c40_526903de8e6f.slice/crio-conmon-720d9e1a3241104692f4a3dbe53675afaae355d3fe7f18fc2fc569744fe32aef.scope\": RecentStats: unable to find data in memory cache]" Apr 06 13:22:11 crc kubenswrapper[4790]: I0406 13:22:11.226324 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="720d9e1a3241104692f4a3dbe53675afaae355d3fe7f18fc2fc569744fe32aef" exitCode=0 Apr 06 13:22:11 crc kubenswrapper[4790]: I0406 13:22:11.226404 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"720d9e1a3241104692f4a3dbe53675afaae355d3fe7f18fc2fc569744fe32aef"} Apr 06 13:22:11 crc kubenswrapper[4790]: I0406 13:22:11.226925 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" 
event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200"} Apr 06 13:22:11 crc kubenswrapper[4790]: I0406 13:22:11.226944 4790 scope.go:117] "RemoveContainer" containerID="ef72648414a273567fe1b19e33c0a9032f3cc410ad71a02452dba0459ce64270" Apr 06 13:22:18 crc kubenswrapper[4790]: I0406 13:22:18.862075 4790 scope.go:117] "RemoveContainer" containerID="31680429200abe423affbd10c74d74abacc16ec93c3a8d50ae05296440f272af" Apr 06 13:23:20 crc kubenswrapper[4790]: I0406 13:23:20.738022 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zk26v"] Apr 06 13:23:20 crc kubenswrapper[4790]: E0406 13:23:20.739425 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b3d820-e63f-4c97-98df-f9f597180d92" containerName="oc" Apr 06 13:23:20 crc kubenswrapper[4790]: I0406 13:23:20.739448 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b3d820-e63f-4c97-98df-f9f597180d92" containerName="oc" Apr 06 13:23:20 crc kubenswrapper[4790]: I0406 13:23:20.739686 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b3d820-e63f-4c97-98df-f9f597180d92" containerName="oc" Apr 06 13:23:20 crc kubenswrapper[4790]: I0406 13:23:20.741349 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:20 crc kubenswrapper[4790]: I0406 13:23:20.751722 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zk26v"] Apr 06 13:23:20 crc kubenswrapper[4790]: I0406 13:23:20.807663 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52874875-7ccd-405d-b522-dc3c4494b2cd-utilities\") pod \"certified-operators-zk26v\" (UID: \"52874875-7ccd-405d-b522-dc3c4494b2cd\") " pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:20 crc kubenswrapper[4790]: I0406 13:23:20.807764 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fltmt\" (UniqueName: \"kubernetes.io/projected/52874875-7ccd-405d-b522-dc3c4494b2cd-kube-api-access-fltmt\") pod \"certified-operators-zk26v\" (UID: \"52874875-7ccd-405d-b522-dc3c4494b2cd\") " pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:20 crc kubenswrapper[4790]: I0406 13:23:20.807868 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52874875-7ccd-405d-b522-dc3c4494b2cd-catalog-content\") pod \"certified-operators-zk26v\" (UID: \"52874875-7ccd-405d-b522-dc3c4494b2cd\") " pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:20 crc kubenswrapper[4790]: I0406 13:23:20.909428 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fltmt\" (UniqueName: \"kubernetes.io/projected/52874875-7ccd-405d-b522-dc3c4494b2cd-kube-api-access-fltmt\") pod \"certified-operators-zk26v\" (UID: \"52874875-7ccd-405d-b522-dc3c4494b2cd\") " pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:20 crc kubenswrapper[4790]: I0406 13:23:20.909524 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52874875-7ccd-405d-b522-dc3c4494b2cd-catalog-content\") pod \"certified-operators-zk26v\" (UID: \"52874875-7ccd-405d-b522-dc3c4494b2cd\") " pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:20 crc kubenswrapper[4790]: I0406 13:23:20.910040 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52874875-7ccd-405d-b522-dc3c4494b2cd-catalog-content\") pod \"certified-operators-zk26v\" (UID: \"52874875-7ccd-405d-b522-dc3c4494b2cd\") " pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:20 crc kubenswrapper[4790]: I0406 13:23:20.910253 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52874875-7ccd-405d-b522-dc3c4494b2cd-utilities\") pod \"certified-operators-zk26v\" (UID: \"52874875-7ccd-405d-b522-dc3c4494b2cd\") " pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:20 crc kubenswrapper[4790]: I0406 13:23:20.910511 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52874875-7ccd-405d-b522-dc3c4494b2cd-utilities\") pod \"certified-operators-zk26v\" (UID: \"52874875-7ccd-405d-b522-dc3c4494b2cd\") " pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:20 crc kubenswrapper[4790]: I0406 13:23:20.930188 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fltmt\" (UniqueName: \"kubernetes.io/projected/52874875-7ccd-405d-b522-dc3c4494b2cd-kube-api-access-fltmt\") pod \"certified-operators-zk26v\" (UID: \"52874875-7ccd-405d-b522-dc3c4494b2cd\") " pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:21 crc kubenswrapper[4790]: I0406 13:23:21.063148 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:21 crc kubenswrapper[4790]: I0406 13:23:21.596392 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zk26v"] Apr 06 13:23:21 crc kubenswrapper[4790]: I0406 13:23:21.894572 4790 generic.go:334] "Generic (PLEG): container finished" podID="52874875-7ccd-405d-b522-dc3c4494b2cd" containerID="99cc6a1db567f4a9ac72c7f23b1dfa49fa5008f8613a3d5ff5d0e55e0c41002e" exitCode=0 Apr 06 13:23:21 crc kubenswrapper[4790]: I0406 13:23:21.894608 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zk26v" event={"ID":"52874875-7ccd-405d-b522-dc3c4494b2cd","Type":"ContainerDied","Data":"99cc6a1db567f4a9ac72c7f23b1dfa49fa5008f8613a3d5ff5d0e55e0c41002e"} Apr 06 13:23:21 crc kubenswrapper[4790]: I0406 13:23:21.894633 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zk26v" event={"ID":"52874875-7ccd-405d-b522-dc3c4494b2cd","Type":"ContainerStarted","Data":"6312abc9857d4b582d8fb596533e93afd773ef9365b917d281b982f221d9b9eb"} Apr 06 13:23:21 crc kubenswrapper[4790]: I0406 13:23:21.896469 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 13:23:22 crc kubenswrapper[4790]: I0406 13:23:22.917322 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zk26v" event={"ID":"52874875-7ccd-405d-b522-dc3c4494b2cd","Type":"ContainerStarted","Data":"304cf70423004739ee8d01bd0d802045688671a89312adf6d519f5c15a58d1ba"} Apr 06 13:23:24 crc kubenswrapper[4790]: I0406 13:23:24.937143 4790 generic.go:334] "Generic (PLEG): container finished" podID="52874875-7ccd-405d-b522-dc3c4494b2cd" containerID="304cf70423004739ee8d01bd0d802045688671a89312adf6d519f5c15a58d1ba" exitCode=0 Apr 06 13:23:24 crc kubenswrapper[4790]: I0406 13:23:24.937222 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-zk26v" event={"ID":"52874875-7ccd-405d-b522-dc3c4494b2cd","Type":"ContainerDied","Data":"304cf70423004739ee8d01bd0d802045688671a89312adf6d519f5c15a58d1ba"} Apr 06 13:23:25 crc kubenswrapper[4790]: I0406 13:23:25.949581 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zk26v" event={"ID":"52874875-7ccd-405d-b522-dc3c4494b2cd","Type":"ContainerStarted","Data":"caa8b4c8a9c7f22574cda067006028d82a3affad7b9fc3f4fbbad48d9992e651"} Apr 06 13:23:25 crc kubenswrapper[4790]: I0406 13:23:25.970019 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zk26v" podStartSLOduration=2.525701533 podStartE2EDuration="5.970002941s" podCreationTimestamp="2026-04-06 13:23:20 +0000 UTC" firstStartedPulling="2026-04-06 13:23:21.896255992 +0000 UTC m=+5180.883998858" lastFinishedPulling="2026-04-06 13:23:25.3405574 +0000 UTC m=+5184.328300266" observedRunningTime="2026-04-06 13:23:25.967218397 +0000 UTC m=+5184.954961263" watchObservedRunningTime="2026-04-06 13:23:25.970002941 +0000 UTC m=+5184.957745807" Apr 06 13:23:31 crc kubenswrapper[4790]: I0406 13:23:31.064492 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:31 crc kubenswrapper[4790]: I0406 13:23:31.065147 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:31 crc kubenswrapper[4790]: I0406 13:23:31.119470 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:32 crc kubenswrapper[4790]: I0406 13:23:32.052501 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:32 crc kubenswrapper[4790]: I0406 13:23:32.104091 
4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zk26v"] Apr 06 13:23:34 crc kubenswrapper[4790]: I0406 13:23:34.027231 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zk26v" podUID="52874875-7ccd-405d-b522-dc3c4494b2cd" containerName="registry-server" containerID="cri-o://caa8b4c8a9c7f22574cda067006028d82a3affad7b9fc3f4fbbad48d9992e651" gracePeriod=2 Apr 06 13:23:34 crc kubenswrapper[4790]: I0406 13:23:34.508103 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:34 crc kubenswrapper[4790]: I0406 13:23:34.643786 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52874875-7ccd-405d-b522-dc3c4494b2cd-utilities\") pod \"52874875-7ccd-405d-b522-dc3c4494b2cd\" (UID: \"52874875-7ccd-405d-b522-dc3c4494b2cd\") " Apr 06 13:23:34 crc kubenswrapper[4790]: I0406 13:23:34.643879 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52874875-7ccd-405d-b522-dc3c4494b2cd-catalog-content\") pod \"52874875-7ccd-405d-b522-dc3c4494b2cd\" (UID: \"52874875-7ccd-405d-b522-dc3c4494b2cd\") " Apr 06 13:23:34 crc kubenswrapper[4790]: I0406 13:23:34.644005 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fltmt\" (UniqueName: \"kubernetes.io/projected/52874875-7ccd-405d-b522-dc3c4494b2cd-kube-api-access-fltmt\") pod \"52874875-7ccd-405d-b522-dc3c4494b2cd\" (UID: \"52874875-7ccd-405d-b522-dc3c4494b2cd\") " Apr 06 13:23:34 crc kubenswrapper[4790]: I0406 13:23:34.645163 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52874875-7ccd-405d-b522-dc3c4494b2cd-utilities" (OuterVolumeSpecName: "utilities") pod 
"52874875-7ccd-405d-b522-dc3c4494b2cd" (UID: "52874875-7ccd-405d-b522-dc3c4494b2cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:23:34 crc kubenswrapper[4790]: I0406 13:23:34.653381 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52874875-7ccd-405d-b522-dc3c4494b2cd-kube-api-access-fltmt" (OuterVolumeSpecName: "kube-api-access-fltmt") pod "52874875-7ccd-405d-b522-dc3c4494b2cd" (UID: "52874875-7ccd-405d-b522-dc3c4494b2cd"). InnerVolumeSpecName "kube-api-access-fltmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:23:34 crc kubenswrapper[4790]: I0406 13:23:34.716706 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52874875-7ccd-405d-b522-dc3c4494b2cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52874875-7ccd-405d-b522-dc3c4494b2cd" (UID: "52874875-7ccd-405d-b522-dc3c4494b2cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:23:34 crc kubenswrapper[4790]: I0406 13:23:34.746167 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52874875-7ccd-405d-b522-dc3c4494b2cd-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 13:23:34 crc kubenswrapper[4790]: I0406 13:23:34.746203 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52874875-7ccd-405d-b522-dc3c4494b2cd-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 13:23:34 crc kubenswrapper[4790]: I0406 13:23:34.746217 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fltmt\" (UniqueName: \"kubernetes.io/projected/52874875-7ccd-405d-b522-dc3c4494b2cd-kube-api-access-fltmt\") on node \"crc\" DevicePath \"\"" Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.042717 4790 generic.go:334] "Generic (PLEG): container finished" podID="52874875-7ccd-405d-b522-dc3c4494b2cd" containerID="caa8b4c8a9c7f22574cda067006028d82a3affad7b9fc3f4fbbad48d9992e651" exitCode=0 Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.042761 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zk26v" Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.042771 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zk26v" event={"ID":"52874875-7ccd-405d-b522-dc3c4494b2cd","Type":"ContainerDied","Data":"caa8b4c8a9c7f22574cda067006028d82a3affad7b9fc3f4fbbad48d9992e651"} Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.044099 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zk26v" event={"ID":"52874875-7ccd-405d-b522-dc3c4494b2cd","Type":"ContainerDied","Data":"6312abc9857d4b582d8fb596533e93afd773ef9365b917d281b982f221d9b9eb"} Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.044133 4790 scope.go:117] "RemoveContainer" containerID="caa8b4c8a9c7f22574cda067006028d82a3affad7b9fc3f4fbbad48d9992e651" Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.067209 4790 scope.go:117] "RemoveContainer" containerID="304cf70423004739ee8d01bd0d802045688671a89312adf6d519f5c15a58d1ba" Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.084104 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zk26v"] Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.095185 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zk26v"] Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.098063 4790 scope.go:117] "RemoveContainer" containerID="99cc6a1db567f4a9ac72c7f23b1dfa49fa5008f8613a3d5ff5d0e55e0c41002e" Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.138387 4790 scope.go:117] "RemoveContainer" containerID="caa8b4c8a9c7f22574cda067006028d82a3affad7b9fc3f4fbbad48d9992e651" Apr 06 13:23:35 crc kubenswrapper[4790]: E0406 13:23:35.139015 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"caa8b4c8a9c7f22574cda067006028d82a3affad7b9fc3f4fbbad48d9992e651\": container with ID starting with caa8b4c8a9c7f22574cda067006028d82a3affad7b9fc3f4fbbad48d9992e651 not found: ID does not exist" containerID="caa8b4c8a9c7f22574cda067006028d82a3affad7b9fc3f4fbbad48d9992e651" Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.139069 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa8b4c8a9c7f22574cda067006028d82a3affad7b9fc3f4fbbad48d9992e651"} err="failed to get container status \"caa8b4c8a9c7f22574cda067006028d82a3affad7b9fc3f4fbbad48d9992e651\": rpc error: code = NotFound desc = could not find container \"caa8b4c8a9c7f22574cda067006028d82a3affad7b9fc3f4fbbad48d9992e651\": container with ID starting with caa8b4c8a9c7f22574cda067006028d82a3affad7b9fc3f4fbbad48d9992e651 not found: ID does not exist" Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.139126 4790 scope.go:117] "RemoveContainer" containerID="304cf70423004739ee8d01bd0d802045688671a89312adf6d519f5c15a58d1ba" Apr 06 13:23:35 crc kubenswrapper[4790]: E0406 13:23:35.139501 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"304cf70423004739ee8d01bd0d802045688671a89312adf6d519f5c15a58d1ba\": container with ID starting with 304cf70423004739ee8d01bd0d802045688671a89312adf6d519f5c15a58d1ba not found: ID does not exist" containerID="304cf70423004739ee8d01bd0d802045688671a89312adf6d519f5c15a58d1ba" Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.139529 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"304cf70423004739ee8d01bd0d802045688671a89312adf6d519f5c15a58d1ba"} err="failed to get container status \"304cf70423004739ee8d01bd0d802045688671a89312adf6d519f5c15a58d1ba\": rpc error: code = NotFound desc = could not find container \"304cf70423004739ee8d01bd0d802045688671a89312adf6d519f5c15a58d1ba\": container with ID 
starting with 304cf70423004739ee8d01bd0d802045688671a89312adf6d519f5c15a58d1ba not found: ID does not exist" Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.139544 4790 scope.go:117] "RemoveContainer" containerID="99cc6a1db567f4a9ac72c7f23b1dfa49fa5008f8613a3d5ff5d0e55e0c41002e" Apr 06 13:23:35 crc kubenswrapper[4790]: E0406 13:23:35.140037 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99cc6a1db567f4a9ac72c7f23b1dfa49fa5008f8613a3d5ff5d0e55e0c41002e\": container with ID starting with 99cc6a1db567f4a9ac72c7f23b1dfa49fa5008f8613a3d5ff5d0e55e0c41002e not found: ID does not exist" containerID="99cc6a1db567f4a9ac72c7f23b1dfa49fa5008f8613a3d5ff5d0e55e0c41002e" Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.140061 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99cc6a1db567f4a9ac72c7f23b1dfa49fa5008f8613a3d5ff5d0e55e0c41002e"} err="failed to get container status \"99cc6a1db567f4a9ac72c7f23b1dfa49fa5008f8613a3d5ff5d0e55e0c41002e\": rpc error: code = NotFound desc = could not find container \"99cc6a1db567f4a9ac72c7f23b1dfa49fa5008f8613a3d5ff5d0e55e0c41002e\": container with ID starting with 99cc6a1db567f4a9ac72c7f23b1dfa49fa5008f8613a3d5ff5d0e55e0c41002e not found: ID does not exist" Apr 06 13:23:35 crc kubenswrapper[4790]: I0406 13:23:35.689802 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52874875-7ccd-405d-b522-dc3c4494b2cd" path="/var/lib/kubelet/pods/52874875-7ccd-405d-b522-dc3c4494b2cd/volumes" Apr 06 13:24:00 crc kubenswrapper[4790]: I0406 13:24:00.151556 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591364-t84b2"] Apr 06 13:24:00 crc kubenswrapper[4790]: E0406 13:24:00.152522 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52874875-7ccd-405d-b522-dc3c4494b2cd" containerName="extract-utilities" Apr 06 13:24:00 crc 
kubenswrapper[4790]: I0406 13:24:00.152536 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="52874875-7ccd-405d-b522-dc3c4494b2cd" containerName="extract-utilities" Apr 06 13:24:00 crc kubenswrapper[4790]: E0406 13:24:00.152558 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52874875-7ccd-405d-b522-dc3c4494b2cd" containerName="extract-content" Apr 06 13:24:00 crc kubenswrapper[4790]: I0406 13:24:00.152563 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="52874875-7ccd-405d-b522-dc3c4494b2cd" containerName="extract-content" Apr 06 13:24:00 crc kubenswrapper[4790]: E0406 13:24:00.152578 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52874875-7ccd-405d-b522-dc3c4494b2cd" containerName="registry-server" Apr 06 13:24:00 crc kubenswrapper[4790]: I0406 13:24:00.152584 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="52874875-7ccd-405d-b522-dc3c4494b2cd" containerName="registry-server" Apr 06 13:24:00 crc kubenswrapper[4790]: I0406 13:24:00.152784 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="52874875-7ccd-405d-b522-dc3c4494b2cd" containerName="registry-server" Apr 06 13:24:00 crc kubenswrapper[4790]: I0406 13:24:00.153590 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591364-t84b2" Apr 06 13:24:00 crc kubenswrapper[4790]: I0406 13:24:00.156528 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:24:00 crc kubenswrapper[4790]: I0406 13:24:00.156905 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:24:00 crc kubenswrapper[4790]: I0406 13:24:00.159125 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:24:00 crc kubenswrapper[4790]: I0406 13:24:00.172727 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591364-t84b2"] Apr 06 13:24:00 crc kubenswrapper[4790]: I0406 13:24:00.312334 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6mnz\" (UniqueName: \"kubernetes.io/projected/e753275f-0532-43c0-8342-93ab2fa65d38-kube-api-access-j6mnz\") pod \"auto-csr-approver-29591364-t84b2\" (UID: \"e753275f-0532-43c0-8342-93ab2fa65d38\") " pod="openshift-infra/auto-csr-approver-29591364-t84b2" Apr 06 13:24:00 crc kubenswrapper[4790]: I0406 13:24:00.414316 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6mnz\" (UniqueName: \"kubernetes.io/projected/e753275f-0532-43c0-8342-93ab2fa65d38-kube-api-access-j6mnz\") pod \"auto-csr-approver-29591364-t84b2\" (UID: \"e753275f-0532-43c0-8342-93ab2fa65d38\") " pod="openshift-infra/auto-csr-approver-29591364-t84b2" Apr 06 13:24:00 crc kubenswrapper[4790]: I0406 13:24:00.435362 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6mnz\" (UniqueName: \"kubernetes.io/projected/e753275f-0532-43c0-8342-93ab2fa65d38-kube-api-access-j6mnz\") pod \"auto-csr-approver-29591364-t84b2\" (UID: \"e753275f-0532-43c0-8342-93ab2fa65d38\") " 
pod="openshift-infra/auto-csr-approver-29591364-t84b2" Apr 06 13:24:00 crc kubenswrapper[4790]: I0406 13:24:00.481543 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591364-t84b2" Apr 06 13:24:00 crc kubenswrapper[4790]: I0406 13:24:00.916364 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591364-t84b2"] Apr 06 13:24:01 crc kubenswrapper[4790]: I0406 13:24:01.305280 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591364-t84b2" event={"ID":"e753275f-0532-43c0-8342-93ab2fa65d38","Type":"ContainerStarted","Data":"102b61bf9158fe6e6c6e52266cb8f0ee3370ae8023c846ed412bc25d6d71b5e2"} Apr 06 13:24:02 crc kubenswrapper[4790]: I0406 13:24:02.319833 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591364-t84b2" event={"ID":"e753275f-0532-43c0-8342-93ab2fa65d38","Type":"ContainerStarted","Data":"e6113a5b49aed3316527390d226c5284165e57af1c0ed9719e74e9d31ae20f4e"} Apr 06 13:24:02 crc kubenswrapper[4790]: I0406 13:24:02.360629 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591364-t84b2" podStartSLOduration=1.52250622 podStartE2EDuration="2.360604041s" podCreationTimestamp="2026-04-06 13:24:00 +0000 UTC" firstStartedPulling="2026-04-06 13:24:01.047027508 +0000 UTC m=+5220.034770374" lastFinishedPulling="2026-04-06 13:24:01.885125319 +0000 UTC m=+5220.872868195" observedRunningTime="2026-04-06 13:24:02.347252035 +0000 UTC m=+5221.334994901" watchObservedRunningTime="2026-04-06 13:24:02.360604041 +0000 UTC m=+5221.348346907" Apr 06 13:24:03 crc kubenswrapper[4790]: I0406 13:24:03.331097 4790 generic.go:334] "Generic (PLEG): container finished" podID="e753275f-0532-43c0-8342-93ab2fa65d38" containerID="e6113a5b49aed3316527390d226c5284165e57af1c0ed9719e74e9d31ae20f4e" exitCode=0 Apr 06 13:24:03 crc 
kubenswrapper[4790]: I0406 13:24:03.331322 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591364-t84b2" event={"ID":"e753275f-0532-43c0-8342-93ab2fa65d38","Type":"ContainerDied","Data":"e6113a5b49aed3316527390d226c5284165e57af1c0ed9719e74e9d31ae20f4e"} Apr 06 13:24:04 crc kubenswrapper[4790]: I0406 13:24:04.716622 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591364-t84b2" Apr 06 13:24:04 crc kubenswrapper[4790]: I0406 13:24:04.909977 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6mnz\" (UniqueName: \"kubernetes.io/projected/e753275f-0532-43c0-8342-93ab2fa65d38-kube-api-access-j6mnz\") pod \"e753275f-0532-43c0-8342-93ab2fa65d38\" (UID: \"e753275f-0532-43c0-8342-93ab2fa65d38\") " Apr 06 13:24:04 crc kubenswrapper[4790]: I0406 13:24:04.916066 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e753275f-0532-43c0-8342-93ab2fa65d38-kube-api-access-j6mnz" (OuterVolumeSpecName: "kube-api-access-j6mnz") pod "e753275f-0532-43c0-8342-93ab2fa65d38" (UID: "e753275f-0532-43c0-8342-93ab2fa65d38"). InnerVolumeSpecName "kube-api-access-j6mnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:24:05 crc kubenswrapper[4790]: I0406 13:24:05.011829 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6mnz\" (UniqueName: \"kubernetes.io/projected/e753275f-0532-43c0-8342-93ab2fa65d38-kube-api-access-j6mnz\") on node \"crc\" DevicePath \"\"" Apr 06 13:24:05 crc kubenswrapper[4790]: I0406 13:24:05.353557 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591364-t84b2" event={"ID":"e753275f-0532-43c0-8342-93ab2fa65d38","Type":"ContainerDied","Data":"102b61bf9158fe6e6c6e52266cb8f0ee3370ae8023c846ed412bc25d6d71b5e2"} Apr 06 13:24:05 crc kubenswrapper[4790]: I0406 13:24:05.353598 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="102b61bf9158fe6e6c6e52266cb8f0ee3370ae8023c846ed412bc25d6d71b5e2" Apr 06 13:24:05 crc kubenswrapper[4790]: I0406 13:24:05.353654 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591364-t84b2" Apr 06 13:24:05 crc kubenswrapper[4790]: I0406 13:24:05.793723 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591358-t7lc8"] Apr 06 13:24:05 crc kubenswrapper[4790]: I0406 13:24:05.805782 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591358-t7lc8"] Apr 06 13:24:07 crc kubenswrapper[4790]: I0406 13:24:07.686328 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bff2494f-96ec-4396-9d68-32db623078e8" path="/var/lib/kubelet/pods/bff2494f-96ec-4396-9d68-32db623078e8/volumes" Apr 06 13:24:18 crc kubenswrapper[4790]: I0406 13:24:18.968598 4790 scope.go:117] "RemoveContainer" containerID="76d4cf2ffef5709a80b033be7f5ff860b8106ddf6abbf8c410d2485a4ed7a710" Apr 06 13:24:24 crc kubenswrapper[4790]: I0406 13:24:24.952172 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-78w6h"] Apr 06 13:24:24 crc kubenswrapper[4790]: E0406 13:24:24.954303 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e753275f-0532-43c0-8342-93ab2fa65d38" containerName="oc" Apr 06 13:24:24 crc kubenswrapper[4790]: I0406 13:24:24.954410 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e753275f-0532-43c0-8342-93ab2fa65d38" containerName="oc" Apr 06 13:24:24 crc kubenswrapper[4790]: I0406 13:24:24.954786 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e753275f-0532-43c0-8342-93ab2fa65d38" containerName="oc" Apr 06 13:24:24 crc kubenswrapper[4790]: I0406 13:24:24.956606 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:24 crc kubenswrapper[4790]: I0406 13:24:24.964753 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-78w6h"] Apr 06 13:24:24 crc kubenswrapper[4790]: I0406 13:24:24.972582 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100b9a88-8fa5-464a-aecf-54e8820b0b1d-utilities\") pod \"redhat-marketplace-78w6h\" (UID: \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\") " pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:24 crc kubenswrapper[4790]: I0406 13:24:24.972774 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/100b9a88-8fa5-464a-aecf-54e8820b0b1d-catalog-content\") pod \"redhat-marketplace-78w6h\" (UID: \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\") " pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:24 crc kubenswrapper[4790]: I0406 13:24:24.972955 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4qj7\" (UniqueName: 
\"kubernetes.io/projected/100b9a88-8fa5-464a-aecf-54e8820b0b1d-kube-api-access-h4qj7\") pod \"redhat-marketplace-78w6h\" (UID: \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\") " pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:25 crc kubenswrapper[4790]: I0406 13:24:25.076464 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100b9a88-8fa5-464a-aecf-54e8820b0b1d-utilities\") pod \"redhat-marketplace-78w6h\" (UID: \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\") " pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:25 crc kubenswrapper[4790]: I0406 13:24:25.076548 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/100b9a88-8fa5-464a-aecf-54e8820b0b1d-catalog-content\") pod \"redhat-marketplace-78w6h\" (UID: \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\") " pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:25 crc kubenswrapper[4790]: I0406 13:24:25.076615 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4qj7\" (UniqueName: \"kubernetes.io/projected/100b9a88-8fa5-464a-aecf-54e8820b0b1d-kube-api-access-h4qj7\") pod \"redhat-marketplace-78w6h\" (UID: \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\") " pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:25 crc kubenswrapper[4790]: I0406 13:24:25.091908 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100b9a88-8fa5-464a-aecf-54e8820b0b1d-utilities\") pod \"redhat-marketplace-78w6h\" (UID: \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\") " pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:25 crc kubenswrapper[4790]: I0406 13:24:25.094350 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/100b9a88-8fa5-464a-aecf-54e8820b0b1d-catalog-content\") pod \"redhat-marketplace-78w6h\" (UID: \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\") " pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:25 crc kubenswrapper[4790]: I0406 13:24:25.127643 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4qj7\" (UniqueName: \"kubernetes.io/projected/100b9a88-8fa5-464a-aecf-54e8820b0b1d-kube-api-access-h4qj7\") pod \"redhat-marketplace-78w6h\" (UID: \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\") " pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:25 crc kubenswrapper[4790]: I0406 13:24:25.284238 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:25 crc kubenswrapper[4790]: I0406 13:24:25.763804 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-78w6h"] Apr 06 13:24:26 crc kubenswrapper[4790]: I0406 13:24:26.600468 4790 generic.go:334] "Generic (PLEG): container finished" podID="100b9a88-8fa5-464a-aecf-54e8820b0b1d" containerID="890dfa893221d0ade6dcd0d438afaebd199aced840121aed591b59faad19f5cf" exitCode=0 Apr 06 13:24:26 crc kubenswrapper[4790]: I0406 13:24:26.600563 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78w6h" event={"ID":"100b9a88-8fa5-464a-aecf-54e8820b0b1d","Type":"ContainerDied","Data":"890dfa893221d0ade6dcd0d438afaebd199aced840121aed591b59faad19f5cf"} Apr 06 13:24:26 crc kubenswrapper[4790]: I0406 13:24:26.600891 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78w6h" event={"ID":"100b9a88-8fa5-464a-aecf-54e8820b0b1d","Type":"ContainerStarted","Data":"01b8b6776b68eda45b4237d93e2cc1e81527821da24a831ad7c0df492b758060"} Apr 06 13:24:28 crc kubenswrapper[4790]: I0406 13:24:28.630364 4790 generic.go:334] "Generic (PLEG): container 
finished" podID="100b9a88-8fa5-464a-aecf-54e8820b0b1d" containerID="a89cb49921ca3c76d14d4fd093cb80487d560f90cca659981286adc0445eebd7" exitCode=0 Apr 06 13:24:28 crc kubenswrapper[4790]: I0406 13:24:28.630560 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78w6h" event={"ID":"100b9a88-8fa5-464a-aecf-54e8820b0b1d","Type":"ContainerDied","Data":"a89cb49921ca3c76d14d4fd093cb80487d560f90cca659981286adc0445eebd7"} Apr 06 13:24:29 crc kubenswrapper[4790]: I0406 13:24:29.646909 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78w6h" event={"ID":"100b9a88-8fa5-464a-aecf-54e8820b0b1d","Type":"ContainerStarted","Data":"63f9e24d46d2af8eccda6e0674ac6efa00c5f89d9409afe100b8889f2022ca60"} Apr 06 13:24:29 crc kubenswrapper[4790]: I0406 13:24:29.682707 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-78w6h" podStartSLOduration=3.279281075 podStartE2EDuration="5.682684988s" podCreationTimestamp="2026-04-06 13:24:24 +0000 UTC" firstStartedPulling="2026-04-06 13:24:26.603342142 +0000 UTC m=+5245.591085028" lastFinishedPulling="2026-04-06 13:24:29.006746075 +0000 UTC m=+5247.994488941" observedRunningTime="2026-04-06 13:24:29.666813364 +0000 UTC m=+5248.654556230" watchObservedRunningTime="2026-04-06 13:24:29.682684988 +0000 UTC m=+5248.670427864" Apr 06 13:24:35 crc kubenswrapper[4790]: I0406 13:24:35.284717 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:35 crc kubenswrapper[4790]: I0406 13:24:35.286259 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:35 crc kubenswrapper[4790]: I0406 13:24:35.330423 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 
13:24:35 crc kubenswrapper[4790]: I0406 13:24:35.774690 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:35 crc kubenswrapper[4790]: I0406 13:24:35.847043 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-78w6h"] Apr 06 13:24:37 crc kubenswrapper[4790]: I0406 13:24:37.722331 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-78w6h" podUID="100b9a88-8fa5-464a-aecf-54e8820b0b1d" containerName="registry-server" containerID="cri-o://63f9e24d46d2af8eccda6e0674ac6efa00c5f89d9409afe100b8889f2022ca60" gracePeriod=2 Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.184681 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.363319 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4qj7\" (UniqueName: \"kubernetes.io/projected/100b9a88-8fa5-464a-aecf-54e8820b0b1d-kube-api-access-h4qj7\") pod \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\" (UID: \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\") " Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.363358 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/100b9a88-8fa5-464a-aecf-54e8820b0b1d-catalog-content\") pod \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\" (UID: \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\") " Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.363386 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100b9a88-8fa5-464a-aecf-54e8820b0b1d-utilities\") pod \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\" (UID: \"100b9a88-8fa5-464a-aecf-54e8820b0b1d\") " Apr 06 
13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.364692 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/100b9a88-8fa5-464a-aecf-54e8820b0b1d-utilities" (OuterVolumeSpecName: "utilities") pod "100b9a88-8fa5-464a-aecf-54e8820b0b1d" (UID: "100b9a88-8fa5-464a-aecf-54e8820b0b1d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.391862 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/100b9a88-8fa5-464a-aecf-54e8820b0b1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "100b9a88-8fa5-464a-aecf-54e8820b0b1d" (UID: "100b9a88-8fa5-464a-aecf-54e8820b0b1d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.466815 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/100b9a88-8fa5-464a-aecf-54e8820b0b1d-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.466886 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100b9a88-8fa5-464a-aecf-54e8820b0b1d-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.736166 4790 generic.go:334] "Generic (PLEG): container finished" podID="100b9a88-8fa5-464a-aecf-54e8820b0b1d" containerID="63f9e24d46d2af8eccda6e0674ac6efa00c5f89d9409afe100b8889f2022ca60" exitCode=0 Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.736249 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-78w6h" Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.736277 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78w6h" event={"ID":"100b9a88-8fa5-464a-aecf-54e8820b0b1d","Type":"ContainerDied","Data":"63f9e24d46d2af8eccda6e0674ac6efa00c5f89d9409afe100b8889f2022ca60"} Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.736586 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-78w6h" event={"ID":"100b9a88-8fa5-464a-aecf-54e8820b0b1d","Type":"ContainerDied","Data":"01b8b6776b68eda45b4237d93e2cc1e81527821da24a831ad7c0df492b758060"} Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.736608 4790 scope.go:117] "RemoveContainer" containerID="63f9e24d46d2af8eccda6e0674ac6efa00c5f89d9409afe100b8889f2022ca60" Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.759790 4790 scope.go:117] "RemoveContainer" containerID="a89cb49921ca3c76d14d4fd093cb80487d560f90cca659981286adc0445eebd7" Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.944239 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/100b9a88-8fa5-464a-aecf-54e8820b0b1d-kube-api-access-h4qj7" (OuterVolumeSpecName: "kube-api-access-h4qj7") pod "100b9a88-8fa5-464a-aecf-54e8820b0b1d" (UID: "100b9a88-8fa5-464a-aecf-54e8820b0b1d"). InnerVolumeSpecName "kube-api-access-h4qj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.960521 4790 scope.go:117] "RemoveContainer" containerID="890dfa893221d0ade6dcd0d438afaebd199aced840121aed591b59faad19f5cf" Apr 06 13:24:38 crc kubenswrapper[4790]: I0406 13:24:38.977594 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4qj7\" (UniqueName: \"kubernetes.io/projected/100b9a88-8fa5-464a-aecf-54e8820b0b1d-kube-api-access-h4qj7\") on node \"crc\" DevicePath \"\"" Apr 06 13:24:39 crc kubenswrapper[4790]: I0406 13:24:39.058745 4790 scope.go:117] "RemoveContainer" containerID="63f9e24d46d2af8eccda6e0674ac6efa00c5f89d9409afe100b8889f2022ca60" Apr 06 13:24:39 crc kubenswrapper[4790]: E0406 13:24:39.060060 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63f9e24d46d2af8eccda6e0674ac6efa00c5f89d9409afe100b8889f2022ca60\": container with ID starting with 63f9e24d46d2af8eccda6e0674ac6efa00c5f89d9409afe100b8889f2022ca60 not found: ID does not exist" containerID="63f9e24d46d2af8eccda6e0674ac6efa00c5f89d9409afe100b8889f2022ca60" Apr 06 13:24:39 crc kubenswrapper[4790]: I0406 13:24:39.060196 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f9e24d46d2af8eccda6e0674ac6efa00c5f89d9409afe100b8889f2022ca60"} err="failed to get container status \"63f9e24d46d2af8eccda6e0674ac6efa00c5f89d9409afe100b8889f2022ca60\": rpc error: code = NotFound desc = could not find container \"63f9e24d46d2af8eccda6e0674ac6efa00c5f89d9409afe100b8889f2022ca60\": container with ID starting with 63f9e24d46d2af8eccda6e0674ac6efa00c5f89d9409afe100b8889f2022ca60 not found: ID does not exist" Apr 06 13:24:39 crc kubenswrapper[4790]: I0406 13:24:39.060303 4790 scope.go:117] "RemoveContainer" containerID="a89cb49921ca3c76d14d4fd093cb80487d560f90cca659981286adc0445eebd7" Apr 06 13:24:39 crc kubenswrapper[4790]: E0406 13:24:39.060704 
4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a89cb49921ca3c76d14d4fd093cb80487d560f90cca659981286adc0445eebd7\": container with ID starting with a89cb49921ca3c76d14d4fd093cb80487d560f90cca659981286adc0445eebd7 not found: ID does not exist" containerID="a89cb49921ca3c76d14d4fd093cb80487d560f90cca659981286adc0445eebd7" Apr 06 13:24:39 crc kubenswrapper[4790]: I0406 13:24:39.060746 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a89cb49921ca3c76d14d4fd093cb80487d560f90cca659981286adc0445eebd7"} err="failed to get container status \"a89cb49921ca3c76d14d4fd093cb80487d560f90cca659981286adc0445eebd7\": rpc error: code = NotFound desc = could not find container \"a89cb49921ca3c76d14d4fd093cb80487d560f90cca659981286adc0445eebd7\": container with ID starting with a89cb49921ca3c76d14d4fd093cb80487d560f90cca659981286adc0445eebd7 not found: ID does not exist" Apr 06 13:24:39 crc kubenswrapper[4790]: I0406 13:24:39.060774 4790 scope.go:117] "RemoveContainer" containerID="890dfa893221d0ade6dcd0d438afaebd199aced840121aed591b59faad19f5cf" Apr 06 13:24:39 crc kubenswrapper[4790]: E0406 13:24:39.061321 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"890dfa893221d0ade6dcd0d438afaebd199aced840121aed591b59faad19f5cf\": container with ID starting with 890dfa893221d0ade6dcd0d438afaebd199aced840121aed591b59faad19f5cf not found: ID does not exist" containerID="890dfa893221d0ade6dcd0d438afaebd199aced840121aed591b59faad19f5cf" Apr 06 13:24:39 crc kubenswrapper[4790]: I0406 13:24:39.061452 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"890dfa893221d0ade6dcd0d438afaebd199aced840121aed591b59faad19f5cf"} err="failed to get container status \"890dfa893221d0ade6dcd0d438afaebd199aced840121aed591b59faad19f5cf\": rpc error: code = 
NotFound desc = could not find container \"890dfa893221d0ade6dcd0d438afaebd199aced840121aed591b59faad19f5cf\": container with ID starting with 890dfa893221d0ade6dcd0d438afaebd199aced840121aed591b59faad19f5cf not found: ID does not exist" Apr 06 13:24:39 crc kubenswrapper[4790]: I0406 13:24:39.118589 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-78w6h"] Apr 06 13:24:39 crc kubenswrapper[4790]: I0406 13:24:39.132126 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-78w6h"] Apr 06 13:24:39 crc kubenswrapper[4790]: I0406 13:24:39.696711 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="100b9a88-8fa5-464a-aecf-54e8820b0b1d" path="/var/lib/kubelet/pods/100b9a88-8fa5-464a-aecf-54e8820b0b1d/volumes" Apr 06 13:24:39 crc kubenswrapper[4790]: I0406 13:24:39.753795 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:24:39 crc kubenswrapper[4790]: I0406 13:24:39.753863 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:25:09 crc kubenswrapper[4790]: I0406 13:25:09.753055 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:25:09 crc kubenswrapper[4790]: I0406 13:25:09.754431 
4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:25:39 crc kubenswrapper[4790]: I0406 13:25:39.753150 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:25:39 crc kubenswrapper[4790]: I0406 13:25:39.754784 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:25:39 crc kubenswrapper[4790]: I0406 13:25:39.754987 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 13:25:39 crc kubenswrapper[4790]: I0406 13:25:39.755935 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 13:25:39 crc kubenswrapper[4790]: I0406 13:25:39.756075 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" 
containerName="machine-config-daemon" containerID="cri-o://68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" gracePeriod=600 Apr 06 13:25:39 crc kubenswrapper[4790]: E0406 13:25:39.880167 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:25:40 crc kubenswrapper[4790]: I0406 13:25:40.378760 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" exitCode=0 Apr 06 13:25:40 crc kubenswrapper[4790]: I0406 13:25:40.378810 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200"} Apr 06 13:25:40 crc kubenswrapper[4790]: I0406 13:25:40.378882 4790 scope.go:117] "RemoveContainer" containerID="720d9e1a3241104692f4a3dbe53675afaae355d3fe7f18fc2fc569744fe32aef" Apr 06 13:25:40 crc kubenswrapper[4790]: I0406 13:25:40.379428 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:25:40 crc kubenswrapper[4790]: E0406 13:25:40.379744 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:25:53 crc kubenswrapper[4790]: I0406 13:25:53.681935 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:25:53 crc kubenswrapper[4790]: E0406 13:25:53.682677 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:26:00 crc kubenswrapper[4790]: I0406 13:26:00.158522 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591366-jmb9d"] Apr 06 13:26:00 crc kubenswrapper[4790]: E0406 13:26:00.159540 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100b9a88-8fa5-464a-aecf-54e8820b0b1d" containerName="registry-server" Apr 06 13:26:00 crc kubenswrapper[4790]: I0406 13:26:00.159557 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="100b9a88-8fa5-464a-aecf-54e8820b0b1d" containerName="registry-server" Apr 06 13:26:00 crc kubenswrapper[4790]: E0406 13:26:00.159577 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100b9a88-8fa5-464a-aecf-54e8820b0b1d" containerName="extract-content" Apr 06 13:26:00 crc kubenswrapper[4790]: I0406 13:26:00.159586 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="100b9a88-8fa5-464a-aecf-54e8820b0b1d" containerName="extract-content" Apr 06 13:26:00 crc kubenswrapper[4790]: E0406 13:26:00.159622 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100b9a88-8fa5-464a-aecf-54e8820b0b1d" containerName="extract-utilities" Apr 06 13:26:00 crc kubenswrapper[4790]: 
I0406 13:26:00.159632 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="100b9a88-8fa5-464a-aecf-54e8820b0b1d" containerName="extract-utilities" Apr 06 13:26:00 crc kubenswrapper[4790]: I0406 13:26:00.159884 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="100b9a88-8fa5-464a-aecf-54e8820b0b1d" containerName="registry-server" Apr 06 13:26:00 crc kubenswrapper[4790]: I0406 13:26:00.160544 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591366-jmb9d" Apr 06 13:26:00 crc kubenswrapper[4790]: I0406 13:26:00.163010 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:26:00 crc kubenswrapper[4790]: I0406 13:26:00.163031 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:26:00 crc kubenswrapper[4790]: I0406 13:26:00.163512 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:26:00 crc kubenswrapper[4790]: I0406 13:26:00.171435 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591366-jmb9d"] Apr 06 13:26:00 crc kubenswrapper[4790]: I0406 13:26:00.256395 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htzr7\" (UniqueName: \"kubernetes.io/projected/4b60f78b-9dba-4317-991d-9fb9f6c9c5a5-kube-api-access-htzr7\") pod \"auto-csr-approver-29591366-jmb9d\" (UID: \"4b60f78b-9dba-4317-991d-9fb9f6c9c5a5\") " pod="openshift-infra/auto-csr-approver-29591366-jmb9d" Apr 06 13:26:00 crc kubenswrapper[4790]: I0406 13:26:00.358146 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htzr7\" (UniqueName: \"kubernetes.io/projected/4b60f78b-9dba-4317-991d-9fb9f6c9c5a5-kube-api-access-htzr7\") pod 
\"auto-csr-approver-29591366-jmb9d\" (UID: \"4b60f78b-9dba-4317-991d-9fb9f6c9c5a5\") " pod="openshift-infra/auto-csr-approver-29591366-jmb9d" Apr 06 13:26:00 crc kubenswrapper[4790]: I0406 13:26:00.386756 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htzr7\" (UniqueName: \"kubernetes.io/projected/4b60f78b-9dba-4317-991d-9fb9f6c9c5a5-kube-api-access-htzr7\") pod \"auto-csr-approver-29591366-jmb9d\" (UID: \"4b60f78b-9dba-4317-991d-9fb9f6c9c5a5\") " pod="openshift-infra/auto-csr-approver-29591366-jmb9d" Apr 06 13:26:00 crc kubenswrapper[4790]: I0406 13:26:00.483080 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591366-jmb9d" Apr 06 13:26:00 crc kubenswrapper[4790]: I0406 13:26:00.941300 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591366-jmb9d"] Apr 06 13:26:01 crc kubenswrapper[4790]: I0406 13:26:01.579901 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591366-jmb9d" event={"ID":"4b60f78b-9dba-4317-991d-9fb9f6c9c5a5","Type":"ContainerStarted","Data":"d19684c9438e9e75adba1d713a1cad6d54aa03b5a315cd2ac59e5af7bf26e5cf"} Apr 06 13:26:03 crc kubenswrapper[4790]: I0406 13:26:03.602244 4790 generic.go:334] "Generic (PLEG): container finished" podID="4b60f78b-9dba-4317-991d-9fb9f6c9c5a5" containerID="56ea50d8a5ce16b250f0d82840d50c865ad2d96990bc56e260b61f1885122e03" exitCode=0 Apr 06 13:26:03 crc kubenswrapper[4790]: I0406 13:26:03.602323 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591366-jmb9d" event={"ID":"4b60f78b-9dba-4317-991d-9fb9f6c9c5a5","Type":"ContainerDied","Data":"56ea50d8a5ce16b250f0d82840d50c865ad2d96990bc56e260b61f1885122e03"} Apr 06 13:26:05 crc kubenswrapper[4790]: I0406 13:26:05.049430 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591366-jmb9d" Apr 06 13:26:05 crc kubenswrapper[4790]: I0406 13:26:05.156457 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htzr7\" (UniqueName: \"kubernetes.io/projected/4b60f78b-9dba-4317-991d-9fb9f6c9c5a5-kube-api-access-htzr7\") pod \"4b60f78b-9dba-4317-991d-9fb9f6c9c5a5\" (UID: \"4b60f78b-9dba-4317-991d-9fb9f6c9c5a5\") " Apr 06 13:26:05 crc kubenswrapper[4790]: I0406 13:26:05.174542 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b60f78b-9dba-4317-991d-9fb9f6c9c5a5-kube-api-access-htzr7" (OuterVolumeSpecName: "kube-api-access-htzr7") pod "4b60f78b-9dba-4317-991d-9fb9f6c9c5a5" (UID: "4b60f78b-9dba-4317-991d-9fb9f6c9c5a5"). InnerVolumeSpecName "kube-api-access-htzr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:26:05 crc kubenswrapper[4790]: I0406 13:26:05.259169 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htzr7\" (UniqueName: \"kubernetes.io/projected/4b60f78b-9dba-4317-991d-9fb9f6c9c5a5-kube-api-access-htzr7\") on node \"crc\" DevicePath \"\"" Apr 06 13:26:05 crc kubenswrapper[4790]: I0406 13:26:05.622059 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591366-jmb9d" event={"ID":"4b60f78b-9dba-4317-991d-9fb9f6c9c5a5","Type":"ContainerDied","Data":"d19684c9438e9e75adba1d713a1cad6d54aa03b5a315cd2ac59e5af7bf26e5cf"} Apr 06 13:26:05 crc kubenswrapper[4790]: I0406 13:26:05.622105 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d19684c9438e9e75adba1d713a1cad6d54aa03b5a315cd2ac59e5af7bf26e5cf" Apr 06 13:26:05 crc kubenswrapper[4790]: I0406 13:26:05.622439 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591366-jmb9d" Apr 06 13:26:06 crc kubenswrapper[4790]: I0406 13:26:06.123646 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591360-qs6lq"] Apr 06 13:26:06 crc kubenswrapper[4790]: I0406 13:26:06.146119 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591360-qs6lq"] Apr 06 13:26:07 crc kubenswrapper[4790]: I0406 13:26:07.676423 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:26:07 crc kubenswrapper[4790]: E0406 13:26:07.677014 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:26:07 crc kubenswrapper[4790]: I0406 13:26:07.689332 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33de3a88-1fc3-4824-a7e7-e3f7019ac216" path="/var/lib/kubelet/pods/33de3a88-1fc3-4824-a7e7-e3f7019ac216/volumes" Apr 06 13:26:19 crc kubenswrapper[4790]: I0406 13:26:19.104477 4790 scope.go:117] "RemoveContainer" containerID="fc0475e7dceaf3f821110ae00fa5833c8230550ad28b77c217c5fe0b6a93aedd" Apr 06 13:26:22 crc kubenswrapper[4790]: I0406 13:26:22.677595 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:26:22 crc kubenswrapper[4790]: E0406 13:26:22.679005 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:26:33 crc kubenswrapper[4790]: I0406 13:26:33.677432 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:26:33 crc kubenswrapper[4790]: E0406 13:26:33.678149 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:26:44 crc kubenswrapper[4790]: I0406 13:26:44.675345 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:26:44 crc kubenswrapper[4790]: E0406 13:26:44.676526 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:26:57 crc kubenswrapper[4790]: I0406 13:26:57.675964 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:26:57 crc kubenswrapper[4790]: E0406 13:26:57.677986 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:27:10 crc kubenswrapper[4790]: I0406 13:27:10.675754 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:27:10 crc kubenswrapper[4790]: E0406 13:27:10.676492 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:27:25 crc kubenswrapper[4790]: I0406 13:27:25.675840 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:27:25 crc kubenswrapper[4790]: E0406 13:27:25.676699 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:27:36 crc kubenswrapper[4790]: I0406 13:27:36.676649 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:27:36 crc kubenswrapper[4790]: E0406 13:27:36.677374 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:27:51 crc kubenswrapper[4790]: I0406 13:27:51.682473 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:27:51 crc kubenswrapper[4790]: E0406 13:27:51.683280 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:28:00 crc kubenswrapper[4790]: I0406 13:28:00.143028 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591368-7jnfp"] Apr 06 13:28:00 crc kubenswrapper[4790]: E0406 13:28:00.144094 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b60f78b-9dba-4317-991d-9fb9f6c9c5a5" containerName="oc" Apr 06 13:28:00 crc kubenswrapper[4790]: I0406 13:28:00.144170 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b60f78b-9dba-4317-991d-9fb9f6c9c5a5" containerName="oc" Apr 06 13:28:00 crc kubenswrapper[4790]: I0406 13:28:00.144368 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b60f78b-9dba-4317-991d-9fb9f6c9c5a5" containerName="oc" Apr 06 13:28:00 crc kubenswrapper[4790]: I0406 13:28:00.145096 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591368-7jnfp" Apr 06 13:28:00 crc kubenswrapper[4790]: I0406 13:28:00.147454 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:28:00 crc kubenswrapper[4790]: I0406 13:28:00.147794 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:28:00 crc kubenswrapper[4790]: I0406 13:28:00.148989 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:28:00 crc kubenswrapper[4790]: I0406 13:28:00.151385 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591368-7jnfp"] Apr 06 13:28:00 crc kubenswrapper[4790]: I0406 13:28:00.199138 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq798\" (UniqueName: \"kubernetes.io/projected/3a568713-745e-4682-9f81-c00c827bdb56-kube-api-access-pq798\") pod \"auto-csr-approver-29591368-7jnfp\" (UID: \"3a568713-745e-4682-9f81-c00c827bdb56\") " pod="openshift-infra/auto-csr-approver-29591368-7jnfp" Apr 06 13:28:00 crc kubenswrapper[4790]: I0406 13:28:00.301077 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq798\" (UniqueName: \"kubernetes.io/projected/3a568713-745e-4682-9f81-c00c827bdb56-kube-api-access-pq798\") pod \"auto-csr-approver-29591368-7jnfp\" (UID: \"3a568713-745e-4682-9f81-c00c827bdb56\") " pod="openshift-infra/auto-csr-approver-29591368-7jnfp" Apr 06 13:28:00 crc kubenswrapper[4790]: I0406 13:28:00.319377 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq798\" (UniqueName: \"kubernetes.io/projected/3a568713-745e-4682-9f81-c00c827bdb56-kube-api-access-pq798\") pod \"auto-csr-approver-29591368-7jnfp\" (UID: \"3a568713-745e-4682-9f81-c00c827bdb56\") " 
pod="openshift-infra/auto-csr-approver-29591368-7jnfp" Apr 06 13:28:00 crc kubenswrapper[4790]: I0406 13:28:00.465196 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591368-7jnfp" Apr 06 13:28:00 crc kubenswrapper[4790]: I0406 13:28:00.925121 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591368-7jnfp"] Apr 06 13:28:01 crc kubenswrapper[4790]: I0406 13:28:01.776625 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591368-7jnfp" event={"ID":"3a568713-745e-4682-9f81-c00c827bdb56","Type":"ContainerStarted","Data":"3f32d0d523f41f39efe9ce76ce3f95b2e184223ce180431915e873599c9043bc"} Apr 06 13:28:02 crc kubenswrapper[4790]: I0406 13:28:02.676455 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:28:02 crc kubenswrapper[4790]: E0406 13:28:02.677292 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:28:02 crc kubenswrapper[4790]: I0406 13:28:02.787140 4790 generic.go:334] "Generic (PLEG): container finished" podID="3a568713-745e-4682-9f81-c00c827bdb56" containerID="57fe435b53f4156ba6ae3bbffc8b50b940c5bf26a317d7e690acbcf01998df4b" exitCode=0 Apr 06 13:28:02 crc kubenswrapper[4790]: I0406 13:28:02.787242 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591368-7jnfp" event={"ID":"3a568713-745e-4682-9f81-c00c827bdb56","Type":"ContainerDied","Data":"57fe435b53f4156ba6ae3bbffc8b50b940c5bf26a317d7e690acbcf01998df4b"} 
Apr 06 13:28:04 crc kubenswrapper[4790]: I0406 13:28:04.377437 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591368-7jnfp" Apr 06 13:28:04 crc kubenswrapper[4790]: I0406 13:28:04.485479 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq798\" (UniqueName: \"kubernetes.io/projected/3a568713-745e-4682-9f81-c00c827bdb56-kube-api-access-pq798\") pod \"3a568713-745e-4682-9f81-c00c827bdb56\" (UID: \"3a568713-745e-4682-9f81-c00c827bdb56\") " Apr 06 13:28:04 crc kubenswrapper[4790]: I0406 13:28:04.493421 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a568713-745e-4682-9f81-c00c827bdb56-kube-api-access-pq798" (OuterVolumeSpecName: "kube-api-access-pq798") pod "3a568713-745e-4682-9f81-c00c827bdb56" (UID: "3a568713-745e-4682-9f81-c00c827bdb56"). InnerVolumeSpecName "kube-api-access-pq798". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:28:04 crc kubenswrapper[4790]: I0406 13:28:04.588978 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq798\" (UniqueName: \"kubernetes.io/projected/3a568713-745e-4682-9f81-c00c827bdb56-kube-api-access-pq798\") on node \"crc\" DevicePath \"\"" Apr 06 13:28:04 crc kubenswrapper[4790]: I0406 13:28:04.814364 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591368-7jnfp" event={"ID":"3a568713-745e-4682-9f81-c00c827bdb56","Type":"ContainerDied","Data":"3f32d0d523f41f39efe9ce76ce3f95b2e184223ce180431915e873599c9043bc"} Apr 06 13:28:04 crc kubenswrapper[4790]: I0406 13:28:04.814424 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591368-7jnfp" Apr 06 13:28:04 crc kubenswrapper[4790]: I0406 13:28:04.814416 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f32d0d523f41f39efe9ce76ce3f95b2e184223ce180431915e873599c9043bc" Apr 06 13:28:05 crc kubenswrapper[4790]: I0406 13:28:05.465746 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591362-8bc7c"] Apr 06 13:28:05 crc kubenswrapper[4790]: I0406 13:28:05.477152 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591362-8bc7c"] Apr 06 13:28:05 crc kubenswrapper[4790]: I0406 13:28:05.689462 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9b3d820-e63f-4c97-98df-f9f597180d92" path="/var/lib/kubelet/pods/e9b3d820-e63f-4c97-98df-f9f597180d92/volumes" Apr 06 13:28:17 crc kubenswrapper[4790]: I0406 13:28:17.675544 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:28:17 crc kubenswrapper[4790]: E0406 13:28:17.676695 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:28:19 crc kubenswrapper[4790]: I0406 13:28:19.216028 4790 scope.go:117] "RemoveContainer" containerID="e56ab918aac3c5b2a1e08e7bbc4b90bc9f2db0149b3bdd65d01608ea3bbfe146" Apr 06 13:28:28 crc kubenswrapper[4790]: I0406 13:28:28.676265 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:28:28 crc kubenswrapper[4790]: E0406 13:28:28.677076 4790 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.461628 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8vw42"] Apr 06 13:28:29 crc kubenswrapper[4790]: E0406 13:28:29.462587 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a568713-745e-4682-9f81-c00c827bdb56" containerName="oc" Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.462609 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a568713-745e-4682-9f81-c00c827bdb56" containerName="oc" Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.462926 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a568713-745e-4682-9f81-c00c827bdb56" containerName="oc" Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.465221 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.479464 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vw42"] Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.562572 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2btd\" (UniqueName: \"kubernetes.io/projected/60c9ac86-7308-459e-a377-f1fc2d7fe225-kube-api-access-v2btd\") pod \"community-operators-8vw42\" (UID: \"60c9ac86-7308-459e-a377-f1fc2d7fe225\") " pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.562992 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c9ac86-7308-459e-a377-f1fc2d7fe225-catalog-content\") pod \"community-operators-8vw42\" (UID: \"60c9ac86-7308-459e-a377-f1fc2d7fe225\") " pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.563245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c9ac86-7308-459e-a377-f1fc2d7fe225-utilities\") pod \"community-operators-8vw42\" (UID: \"60c9ac86-7308-459e-a377-f1fc2d7fe225\") " pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.665086 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2btd\" (UniqueName: \"kubernetes.io/projected/60c9ac86-7308-459e-a377-f1fc2d7fe225-kube-api-access-v2btd\") pod \"community-operators-8vw42\" (UID: \"60c9ac86-7308-459e-a377-f1fc2d7fe225\") " pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.665147 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c9ac86-7308-459e-a377-f1fc2d7fe225-catalog-content\") pod \"community-operators-8vw42\" (UID: \"60c9ac86-7308-459e-a377-f1fc2d7fe225\") " pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.665228 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c9ac86-7308-459e-a377-f1fc2d7fe225-utilities\") pod \"community-operators-8vw42\" (UID: \"60c9ac86-7308-459e-a377-f1fc2d7fe225\") " pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.665847 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c9ac86-7308-459e-a377-f1fc2d7fe225-utilities\") pod \"community-operators-8vw42\" (UID: \"60c9ac86-7308-459e-a377-f1fc2d7fe225\") " pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.666032 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c9ac86-7308-459e-a377-f1fc2d7fe225-catalog-content\") pod \"community-operators-8vw42\" (UID: \"60c9ac86-7308-459e-a377-f1fc2d7fe225\") " pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.686150 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2btd\" (UniqueName: \"kubernetes.io/projected/60c9ac86-7308-459e-a377-f1fc2d7fe225-kube-api-access-v2btd\") pod \"community-operators-8vw42\" (UID: \"60c9ac86-7308-459e-a377-f1fc2d7fe225\") " pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:29 crc kubenswrapper[4790]: I0406 13:28:29.792043 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:30 crc kubenswrapper[4790]: I0406 13:28:30.365736 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vw42"] Apr 06 13:28:31 crc kubenswrapper[4790]: I0406 13:28:31.063318 4790 generic.go:334] "Generic (PLEG): container finished" podID="60c9ac86-7308-459e-a377-f1fc2d7fe225" containerID="e03e43d39d378572ae2c69225a4471da3fb6debb388a6f718a01b1455d00bebd" exitCode=0 Apr 06 13:28:31 crc kubenswrapper[4790]: I0406 13:28:31.063358 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vw42" event={"ID":"60c9ac86-7308-459e-a377-f1fc2d7fe225","Type":"ContainerDied","Data":"e03e43d39d378572ae2c69225a4471da3fb6debb388a6f718a01b1455d00bebd"} Apr 06 13:28:31 crc kubenswrapper[4790]: I0406 13:28:31.063384 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vw42" event={"ID":"60c9ac86-7308-459e-a377-f1fc2d7fe225","Type":"ContainerStarted","Data":"160ca6a9e26ffba109214a71b82f8bf681fe0972bb97cadae026602f8782a440"} Apr 06 13:28:31 crc kubenswrapper[4790]: I0406 13:28:31.065109 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 13:28:33 crc kubenswrapper[4790]: I0406 13:28:33.084246 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vw42" event={"ID":"60c9ac86-7308-459e-a377-f1fc2d7fe225","Type":"ContainerStarted","Data":"dc48390cede4a0da21afef07d32ad0ad0539d53266397e36f9b0dded9680bdab"} Apr 06 13:28:34 crc kubenswrapper[4790]: I0406 13:28:34.096637 4790 generic.go:334] "Generic (PLEG): container finished" podID="60c9ac86-7308-459e-a377-f1fc2d7fe225" containerID="dc48390cede4a0da21afef07d32ad0ad0539d53266397e36f9b0dded9680bdab" exitCode=0 Apr 06 13:28:34 crc kubenswrapper[4790]: I0406 13:28:34.096704 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-8vw42" event={"ID":"60c9ac86-7308-459e-a377-f1fc2d7fe225","Type":"ContainerDied","Data":"dc48390cede4a0da21afef07d32ad0ad0539d53266397e36f9b0dded9680bdab"} Apr 06 13:28:35 crc kubenswrapper[4790]: I0406 13:28:35.108888 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vw42" event={"ID":"60c9ac86-7308-459e-a377-f1fc2d7fe225","Type":"ContainerStarted","Data":"28613994750fa9875a2cc1af2990b4683e5cee209d80a04115f4f6e586b48cb2"} Apr 06 13:28:35 crc kubenswrapper[4790]: I0406 13:28:35.142221 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8vw42" podStartSLOduration=2.692212581 podStartE2EDuration="6.14219306s" podCreationTimestamp="2026-04-06 13:28:29 +0000 UTC" firstStartedPulling="2026-04-06 13:28:31.064893796 +0000 UTC m=+5490.052636662" lastFinishedPulling="2026-04-06 13:28:34.514874275 +0000 UTC m=+5493.502617141" observedRunningTime="2026-04-06 13:28:35.126924992 +0000 UTC m=+5494.114667858" watchObservedRunningTime="2026-04-06 13:28:35.14219306 +0000 UTC m=+5494.129935926" Apr 06 13:28:39 crc kubenswrapper[4790]: I0406 13:28:39.793256 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:39 crc kubenswrapper[4790]: I0406 13:28:39.793783 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:39 crc kubenswrapper[4790]: I0406 13:28:39.843966 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:40 crc kubenswrapper[4790]: I0406 13:28:40.208051 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:40 crc kubenswrapper[4790]: I0406 13:28:40.258324 
4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vw42"] Apr 06 13:28:42 crc kubenswrapper[4790]: I0406 13:28:42.169569 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8vw42" podUID="60c9ac86-7308-459e-a377-f1fc2d7fe225" containerName="registry-server" containerID="cri-o://28613994750fa9875a2cc1af2990b4683e5cee209d80a04115f4f6e586b48cb2" gracePeriod=2 Apr 06 13:28:42 crc kubenswrapper[4790]: I0406 13:28:42.675220 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:28:42 crc kubenswrapper[4790]: E0406 13:28:42.675690 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:28:42 crc kubenswrapper[4790]: I0406 13:28:42.714663 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:42 crc kubenswrapper[4790]: I0406 13:28:42.864052 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2btd\" (UniqueName: \"kubernetes.io/projected/60c9ac86-7308-459e-a377-f1fc2d7fe225-kube-api-access-v2btd\") pod \"60c9ac86-7308-459e-a377-f1fc2d7fe225\" (UID: \"60c9ac86-7308-459e-a377-f1fc2d7fe225\") " Apr 06 13:28:42 crc kubenswrapper[4790]: I0406 13:28:42.864113 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c9ac86-7308-459e-a377-f1fc2d7fe225-utilities\") pod \"60c9ac86-7308-459e-a377-f1fc2d7fe225\" (UID: \"60c9ac86-7308-459e-a377-f1fc2d7fe225\") " Apr 06 13:28:42 crc kubenswrapper[4790]: I0406 13:28:42.864280 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c9ac86-7308-459e-a377-f1fc2d7fe225-catalog-content\") pod \"60c9ac86-7308-459e-a377-f1fc2d7fe225\" (UID: \"60c9ac86-7308-459e-a377-f1fc2d7fe225\") " Apr 06 13:28:42 crc kubenswrapper[4790]: I0406 13:28:42.865284 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60c9ac86-7308-459e-a377-f1fc2d7fe225-utilities" (OuterVolumeSpecName: "utilities") pod "60c9ac86-7308-459e-a377-f1fc2d7fe225" (UID: "60c9ac86-7308-459e-a377-f1fc2d7fe225"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:28:42 crc kubenswrapper[4790]: I0406 13:28:42.876079 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60c9ac86-7308-459e-a377-f1fc2d7fe225-kube-api-access-v2btd" (OuterVolumeSpecName: "kube-api-access-v2btd") pod "60c9ac86-7308-459e-a377-f1fc2d7fe225" (UID: "60c9ac86-7308-459e-a377-f1fc2d7fe225"). InnerVolumeSpecName "kube-api-access-v2btd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:28:42 crc kubenswrapper[4790]: I0406 13:28:42.914142 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60c9ac86-7308-459e-a377-f1fc2d7fe225-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60c9ac86-7308-459e-a377-f1fc2d7fe225" (UID: "60c9ac86-7308-459e-a377-f1fc2d7fe225"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:28:42 crc kubenswrapper[4790]: I0406 13:28:42.966905 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2btd\" (UniqueName: \"kubernetes.io/projected/60c9ac86-7308-459e-a377-f1fc2d7fe225-kube-api-access-v2btd\") on node \"crc\" DevicePath \"\"" Apr 06 13:28:42 crc kubenswrapper[4790]: I0406 13:28:42.966949 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c9ac86-7308-459e-a377-f1fc2d7fe225-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 13:28:42 crc kubenswrapper[4790]: I0406 13:28:42.966962 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c9ac86-7308-459e-a377-f1fc2d7fe225-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.205745 4790 generic.go:334] "Generic (PLEG): container finished" podID="60c9ac86-7308-459e-a377-f1fc2d7fe225" containerID="28613994750fa9875a2cc1af2990b4683e5cee209d80a04115f4f6e586b48cb2" exitCode=0 Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.205902 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vw42" Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.205888 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vw42" event={"ID":"60c9ac86-7308-459e-a377-f1fc2d7fe225","Type":"ContainerDied","Data":"28613994750fa9875a2cc1af2990b4683e5cee209d80a04115f4f6e586b48cb2"} Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.206208 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vw42" event={"ID":"60c9ac86-7308-459e-a377-f1fc2d7fe225","Type":"ContainerDied","Data":"160ca6a9e26ffba109214a71b82f8bf681fe0972bb97cadae026602f8782a440"} Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.206238 4790 scope.go:117] "RemoveContainer" containerID="28613994750fa9875a2cc1af2990b4683e5cee209d80a04115f4f6e586b48cb2" Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.232152 4790 scope.go:117] "RemoveContainer" containerID="dc48390cede4a0da21afef07d32ad0ad0539d53266397e36f9b0dded9680bdab" Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.251682 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vw42"] Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.271517 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8vw42"] Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.285533 4790 scope.go:117] "RemoveContainer" containerID="e03e43d39d378572ae2c69225a4471da3fb6debb388a6f718a01b1455d00bebd" Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.326997 4790 scope.go:117] "RemoveContainer" containerID="28613994750fa9875a2cc1af2990b4683e5cee209d80a04115f4f6e586b48cb2" Apr 06 13:28:43 crc kubenswrapper[4790]: E0406 13:28:43.328814 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"28613994750fa9875a2cc1af2990b4683e5cee209d80a04115f4f6e586b48cb2\": container with ID starting with 28613994750fa9875a2cc1af2990b4683e5cee209d80a04115f4f6e586b48cb2 not found: ID does not exist" containerID="28613994750fa9875a2cc1af2990b4683e5cee209d80a04115f4f6e586b48cb2" Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.328920 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28613994750fa9875a2cc1af2990b4683e5cee209d80a04115f4f6e586b48cb2"} err="failed to get container status \"28613994750fa9875a2cc1af2990b4683e5cee209d80a04115f4f6e586b48cb2\": rpc error: code = NotFound desc = could not find container \"28613994750fa9875a2cc1af2990b4683e5cee209d80a04115f4f6e586b48cb2\": container with ID starting with 28613994750fa9875a2cc1af2990b4683e5cee209d80a04115f4f6e586b48cb2 not found: ID does not exist" Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.329013 4790 scope.go:117] "RemoveContainer" containerID="dc48390cede4a0da21afef07d32ad0ad0539d53266397e36f9b0dded9680bdab" Apr 06 13:28:43 crc kubenswrapper[4790]: E0406 13:28:43.331221 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc48390cede4a0da21afef07d32ad0ad0539d53266397e36f9b0dded9680bdab\": container with ID starting with dc48390cede4a0da21afef07d32ad0ad0539d53266397e36f9b0dded9680bdab not found: ID does not exist" containerID="dc48390cede4a0da21afef07d32ad0ad0539d53266397e36f9b0dded9680bdab" Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.331347 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc48390cede4a0da21afef07d32ad0ad0539d53266397e36f9b0dded9680bdab"} err="failed to get container status \"dc48390cede4a0da21afef07d32ad0ad0539d53266397e36f9b0dded9680bdab\": rpc error: code = NotFound desc = could not find container \"dc48390cede4a0da21afef07d32ad0ad0539d53266397e36f9b0dded9680bdab\": container with ID 
starting with dc48390cede4a0da21afef07d32ad0ad0539d53266397e36f9b0dded9680bdab not found: ID does not exist" Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.331432 4790 scope.go:117] "RemoveContainer" containerID="e03e43d39d378572ae2c69225a4471da3fb6debb388a6f718a01b1455d00bebd" Apr 06 13:28:43 crc kubenswrapper[4790]: E0406 13:28:43.331938 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e03e43d39d378572ae2c69225a4471da3fb6debb388a6f718a01b1455d00bebd\": container with ID starting with e03e43d39d378572ae2c69225a4471da3fb6debb388a6f718a01b1455d00bebd not found: ID does not exist" containerID="e03e43d39d378572ae2c69225a4471da3fb6debb388a6f718a01b1455d00bebd" Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.332015 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e03e43d39d378572ae2c69225a4471da3fb6debb388a6f718a01b1455d00bebd"} err="failed to get container status \"e03e43d39d378572ae2c69225a4471da3fb6debb388a6f718a01b1455d00bebd\": rpc error: code = NotFound desc = could not find container \"e03e43d39d378572ae2c69225a4471da3fb6debb388a6f718a01b1455d00bebd\": container with ID starting with e03e43d39d378572ae2c69225a4471da3fb6debb388a6f718a01b1455d00bebd not found: ID does not exist" Apr 06 13:28:43 crc kubenswrapper[4790]: I0406 13:28:43.687129 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60c9ac86-7308-459e-a377-f1fc2d7fe225" path="/var/lib/kubelet/pods/60c9ac86-7308-459e-a377-f1fc2d7fe225/volumes" Apr 06 13:28:53 crc kubenswrapper[4790]: I0406 13:28:53.677078 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:28:53 crc kubenswrapper[4790]: E0406 13:28:53.678459 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:29:04 crc kubenswrapper[4790]: I0406 13:29:04.675278 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:29:04 crc kubenswrapper[4790]: E0406 13:29:04.676063 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:29:17 crc kubenswrapper[4790]: I0406 13:29:17.676634 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:29:17 crc kubenswrapper[4790]: E0406 13:29:17.677792 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:29:28 crc kubenswrapper[4790]: I0406 13:29:28.675877 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:29:28 crc kubenswrapper[4790]: E0406 13:29:28.676782 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:29:40 crc kubenswrapper[4790]: I0406 13:29:40.675722 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:29:40 crc kubenswrapper[4790]: E0406 13:29:40.676594 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:29:55 crc kubenswrapper[4790]: I0406 13:29:55.675647 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:29:55 crc kubenswrapper[4790]: E0406 13:29:55.676566 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.146130 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591370-9lb78"] Apr 06 13:30:00 crc kubenswrapper[4790]: E0406 13:30:00.147200 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c9ac86-7308-459e-a377-f1fc2d7fe225" 
containerName="extract-content" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.147216 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c9ac86-7308-459e-a377-f1fc2d7fe225" containerName="extract-content" Apr 06 13:30:00 crc kubenswrapper[4790]: E0406 13:30:00.147230 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c9ac86-7308-459e-a377-f1fc2d7fe225" containerName="extract-utilities" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.147236 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c9ac86-7308-459e-a377-f1fc2d7fe225" containerName="extract-utilities" Apr 06 13:30:00 crc kubenswrapper[4790]: E0406 13:30:00.147268 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c9ac86-7308-459e-a377-f1fc2d7fe225" containerName="registry-server" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.147274 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c9ac86-7308-459e-a377-f1fc2d7fe225" containerName="registry-server" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.147464 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="60c9ac86-7308-459e-a377-f1fc2d7fe225" containerName="registry-server" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.148303 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591370-9lb78" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.150675 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.150984 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.151293 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.159576 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc"] Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.164444 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.166568 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.167071 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.177647 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591370-9lb78"] Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.194514 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc"] Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.251405 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/574444b3-3265-4fdb-993e-d2630ea33b1d-config-volume\") pod \"collect-profiles-29591370-r8tfc\" (UID: \"574444b3-3265-4fdb-993e-d2630ea33b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.251488 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4z6c\" (UniqueName: \"kubernetes.io/projected/8c97babb-c360-4264-961a-015b42cf709e-kube-api-access-l4z6c\") pod \"auto-csr-approver-29591370-9lb78\" (UID: \"8c97babb-c360-4264-961a-015b42cf709e\") " pod="openshift-infra/auto-csr-approver-29591370-9lb78" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.251567 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/574444b3-3265-4fdb-993e-d2630ea33b1d-secret-volume\") pod \"collect-profiles-29591370-r8tfc\" (UID: \"574444b3-3265-4fdb-993e-d2630ea33b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.251595 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vmzj\" (UniqueName: \"kubernetes.io/projected/574444b3-3265-4fdb-993e-d2630ea33b1d-kube-api-access-6vmzj\") pod \"collect-profiles-29591370-r8tfc\" (UID: \"574444b3-3265-4fdb-993e-d2630ea33b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.353608 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4z6c\" (UniqueName: \"kubernetes.io/projected/8c97babb-c360-4264-961a-015b42cf709e-kube-api-access-l4z6c\") pod \"auto-csr-approver-29591370-9lb78\" (UID: \"8c97babb-c360-4264-961a-015b42cf709e\") " pod="openshift-infra/auto-csr-approver-29591370-9lb78" Apr 06 
13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.354222 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/574444b3-3265-4fdb-993e-d2630ea33b1d-secret-volume\") pod \"collect-profiles-29591370-r8tfc\" (UID: \"574444b3-3265-4fdb-993e-d2630ea33b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.354329 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vmzj\" (UniqueName: \"kubernetes.io/projected/574444b3-3265-4fdb-993e-d2630ea33b1d-kube-api-access-6vmzj\") pod \"collect-profiles-29591370-r8tfc\" (UID: \"574444b3-3265-4fdb-993e-d2630ea33b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.354475 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/574444b3-3265-4fdb-993e-d2630ea33b1d-config-volume\") pod \"collect-profiles-29591370-r8tfc\" (UID: \"574444b3-3265-4fdb-993e-d2630ea33b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.355332 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/574444b3-3265-4fdb-993e-d2630ea33b1d-config-volume\") pod \"collect-profiles-29591370-r8tfc\" (UID: \"574444b3-3265-4fdb-993e-d2630ea33b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.367901 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/574444b3-3265-4fdb-993e-d2630ea33b1d-secret-volume\") pod \"collect-profiles-29591370-r8tfc\" (UID: 
\"574444b3-3265-4fdb-993e-d2630ea33b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.385938 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vmzj\" (UniqueName: \"kubernetes.io/projected/574444b3-3265-4fdb-993e-d2630ea33b1d-kube-api-access-6vmzj\") pod \"collect-profiles-29591370-r8tfc\" (UID: \"574444b3-3265-4fdb-993e-d2630ea33b1d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.390478 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4z6c\" (UniqueName: \"kubernetes.io/projected/8c97babb-c360-4264-961a-015b42cf709e-kube-api-access-l4z6c\") pod \"auto-csr-approver-29591370-9lb78\" (UID: \"8c97babb-c360-4264-961a-015b42cf709e\") " pod="openshift-infra/auto-csr-approver-29591370-9lb78" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.471488 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591370-9lb78" Apr 06 13:30:00 crc kubenswrapper[4790]: I0406 13:30:00.489117 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" Apr 06 13:30:01 crc kubenswrapper[4790]: I0406 13:30:01.005693 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591370-9lb78"] Apr 06 13:30:01 crc kubenswrapper[4790]: I0406 13:30:01.015182 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc"] Apr 06 13:30:01 crc kubenswrapper[4790]: W0406 13:30:01.152415 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod574444b3_3265_4fdb_993e_d2630ea33b1d.slice/crio-161b0dfe8bd2b8f9c815582f852c2a1a576424aed5f09dbcff1f12fe3e1d0883 WatchSource:0}: Error finding container 161b0dfe8bd2b8f9c815582f852c2a1a576424aed5f09dbcff1f12fe3e1d0883: Status 404 returned error can't find the container with id 161b0dfe8bd2b8f9c815582f852c2a1a576424aed5f09dbcff1f12fe3e1d0883 Apr 06 13:30:02 crc kubenswrapper[4790]: I0406 13:30:02.009677 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591370-9lb78" event={"ID":"8c97babb-c360-4264-961a-015b42cf709e","Type":"ContainerStarted","Data":"00777586fc98acd162716404e6171852962ca1ada146defc1b8dcafe73b3c339"} Apr 06 13:30:02 crc kubenswrapper[4790]: I0406 13:30:02.011272 4790 generic.go:334] "Generic (PLEG): container finished" podID="574444b3-3265-4fdb-993e-d2630ea33b1d" containerID="4101b8a72c6a7702cef56d5242cea865c99f27a64affb1f4b721cbc1f7928fef" exitCode=0 Apr 06 13:30:02 crc kubenswrapper[4790]: I0406 13:30:02.011308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" event={"ID":"574444b3-3265-4fdb-993e-d2630ea33b1d","Type":"ContainerDied","Data":"4101b8a72c6a7702cef56d5242cea865c99f27a64affb1f4b721cbc1f7928fef"} Apr 06 13:30:02 crc kubenswrapper[4790]: I0406 13:30:02.011327 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" event={"ID":"574444b3-3265-4fdb-993e-d2630ea33b1d","Type":"ContainerStarted","Data":"161b0dfe8bd2b8f9c815582f852c2a1a576424aed5f09dbcff1f12fe3e1d0883"} Apr 06 13:30:03 crc kubenswrapper[4790]: I0406 13:30:03.022674 4790 generic.go:334] "Generic (PLEG): container finished" podID="8c97babb-c360-4264-961a-015b42cf709e" containerID="623dfdb792d4a3d02db65c8b80765322f29461a90bb6d3420936b95c22f52c69" exitCode=0 Apr 06 13:30:03 crc kubenswrapper[4790]: I0406 13:30:03.022717 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591370-9lb78" event={"ID":"8c97babb-c360-4264-961a-015b42cf709e","Type":"ContainerDied","Data":"623dfdb792d4a3d02db65c8b80765322f29461a90bb6d3420936b95c22f52c69"} Apr 06 13:30:03 crc kubenswrapper[4790]: I0406 13:30:03.403243 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" Apr 06 13:30:03 crc kubenswrapper[4790]: I0406 13:30:03.527971 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vmzj\" (UniqueName: \"kubernetes.io/projected/574444b3-3265-4fdb-993e-d2630ea33b1d-kube-api-access-6vmzj\") pod \"574444b3-3265-4fdb-993e-d2630ea33b1d\" (UID: \"574444b3-3265-4fdb-993e-d2630ea33b1d\") " Apr 06 13:30:03 crc kubenswrapper[4790]: I0406 13:30:03.528127 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/574444b3-3265-4fdb-993e-d2630ea33b1d-config-volume\") pod \"574444b3-3265-4fdb-993e-d2630ea33b1d\" (UID: \"574444b3-3265-4fdb-993e-d2630ea33b1d\") " Apr 06 13:30:03 crc kubenswrapper[4790]: I0406 13:30:03.528271 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/574444b3-3265-4fdb-993e-d2630ea33b1d-secret-volume\") pod \"574444b3-3265-4fdb-993e-d2630ea33b1d\" (UID: \"574444b3-3265-4fdb-993e-d2630ea33b1d\") " Apr 06 13:30:03 crc kubenswrapper[4790]: I0406 13:30:03.529140 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/574444b3-3265-4fdb-993e-d2630ea33b1d-config-volume" (OuterVolumeSpecName: "config-volume") pod "574444b3-3265-4fdb-993e-d2630ea33b1d" (UID: "574444b3-3265-4fdb-993e-d2630ea33b1d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 13:30:03 crc kubenswrapper[4790]: I0406 13:30:03.534688 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/574444b3-3265-4fdb-993e-d2630ea33b1d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "574444b3-3265-4fdb-993e-d2630ea33b1d" (UID: "574444b3-3265-4fdb-993e-d2630ea33b1d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 13:30:03 crc kubenswrapper[4790]: I0406 13:30:03.534980 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/574444b3-3265-4fdb-993e-d2630ea33b1d-kube-api-access-6vmzj" (OuterVolumeSpecName: "kube-api-access-6vmzj") pod "574444b3-3265-4fdb-993e-d2630ea33b1d" (UID: "574444b3-3265-4fdb-993e-d2630ea33b1d"). InnerVolumeSpecName "kube-api-access-6vmzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:30:03 crc kubenswrapper[4790]: I0406 13:30:03.631134 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/574444b3-3265-4fdb-993e-d2630ea33b1d-secret-volume\") on node \"crc\" DevicePath \"\"" Apr 06 13:30:03 crc kubenswrapper[4790]: I0406 13:30:03.631178 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vmzj\" (UniqueName: \"kubernetes.io/projected/574444b3-3265-4fdb-993e-d2630ea33b1d-kube-api-access-6vmzj\") on node \"crc\" DevicePath \"\"" Apr 06 13:30:03 crc kubenswrapper[4790]: I0406 13:30:03.631189 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/574444b3-3265-4fdb-993e-d2630ea33b1d-config-volume\") on node \"crc\" DevicePath \"\"" Apr 06 13:30:04 crc kubenswrapper[4790]: I0406 13:30:04.034610 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" event={"ID":"574444b3-3265-4fdb-993e-d2630ea33b1d","Type":"ContainerDied","Data":"161b0dfe8bd2b8f9c815582f852c2a1a576424aed5f09dbcff1f12fe3e1d0883"} Apr 06 13:30:04 crc kubenswrapper[4790]: I0406 13:30:04.034905 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="161b0dfe8bd2b8f9c815582f852c2a1a576424aed5f09dbcff1f12fe3e1d0883" Apr 06 13:30:04 crc kubenswrapper[4790]: I0406 13:30:04.034623 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591370-r8tfc" Apr 06 13:30:04 crc kubenswrapper[4790]: I0406 13:30:04.345187 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591370-9lb78" Apr 06 13:30:04 crc kubenswrapper[4790]: I0406 13:30:04.454524 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4z6c\" (UniqueName: \"kubernetes.io/projected/8c97babb-c360-4264-961a-015b42cf709e-kube-api-access-l4z6c\") pod \"8c97babb-c360-4264-961a-015b42cf709e\" (UID: \"8c97babb-c360-4264-961a-015b42cf709e\") " Apr 06 13:30:04 crc kubenswrapper[4790]: I0406 13:30:04.469272 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c97babb-c360-4264-961a-015b42cf709e-kube-api-access-l4z6c" (OuterVolumeSpecName: "kube-api-access-l4z6c") pod "8c97babb-c360-4264-961a-015b42cf709e" (UID: "8c97babb-c360-4264-961a-015b42cf709e"). InnerVolumeSpecName "kube-api-access-l4z6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:30:04 crc kubenswrapper[4790]: I0406 13:30:04.483331 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6"] Apr 06 13:30:04 crc kubenswrapper[4790]: I0406 13:30:04.495266 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591325-89jp6"] Apr 06 13:30:04 crc kubenswrapper[4790]: I0406 13:30:04.558052 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4z6c\" (UniqueName: \"kubernetes.io/projected/8c97babb-c360-4264-961a-015b42cf709e-kube-api-access-l4z6c\") on node \"crc\" DevicePath \"\"" Apr 06 13:30:05 crc kubenswrapper[4790]: I0406 13:30:05.045195 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591370-9lb78" event={"ID":"8c97babb-c360-4264-961a-015b42cf709e","Type":"ContainerDied","Data":"00777586fc98acd162716404e6171852962ca1ada146defc1b8dcafe73b3c339"} Apr 06 13:30:05 crc kubenswrapper[4790]: I0406 13:30:05.045232 4790 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="00777586fc98acd162716404e6171852962ca1ada146defc1b8dcafe73b3c339" Apr 06 13:30:05 crc kubenswrapper[4790]: I0406 13:30:05.045255 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591370-9lb78" Apr 06 13:30:05 crc kubenswrapper[4790]: I0406 13:30:05.406220 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591364-t84b2"] Apr 06 13:30:05 crc kubenswrapper[4790]: I0406 13:30:05.414353 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591364-t84b2"] Apr 06 13:30:05 crc kubenswrapper[4790]: I0406 13:30:05.687447 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="734777b7-31b8-4736-b3cc-d322f5c3a3dc" path="/var/lib/kubelet/pods/734777b7-31b8-4736-b3cc-d322f5c3a3dc/volumes" Apr 06 13:30:05 crc kubenswrapper[4790]: I0406 13:30:05.688145 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e753275f-0532-43c0-8342-93ab2fa65d38" path="/var/lib/kubelet/pods/e753275f-0532-43c0-8342-93ab2fa65d38/volumes" Apr 06 13:30:10 crc kubenswrapper[4790]: I0406 13:30:10.675292 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:30:10 crc kubenswrapper[4790]: E0406 13:30:10.676116 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.371713 4790 scope.go:117] "RemoveContainer" 
containerID="e12a4003a0575b5ffbbe7c52c45509c93fa8dfec4e12db787087603be708c1c4" Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.421366 4790 scope.go:117] "RemoveContainer" containerID="e6113a5b49aed3316527390d226c5284165e57af1c0ed9719e74e9d31ae20f4e" Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.640495 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2r8dm"] Apr 06 13:30:19 crc kubenswrapper[4790]: E0406 13:30:19.641361 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574444b3-3265-4fdb-993e-d2630ea33b1d" containerName="collect-profiles" Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.641387 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="574444b3-3265-4fdb-993e-d2630ea33b1d" containerName="collect-profiles" Apr 06 13:30:19 crc kubenswrapper[4790]: E0406 13:30:19.641426 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c97babb-c360-4264-961a-015b42cf709e" containerName="oc" Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.641434 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c97babb-c360-4264-961a-015b42cf709e" containerName="oc" Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.641662 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="574444b3-3265-4fdb-993e-d2630ea33b1d" containerName="collect-profiles" Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.641698 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c97babb-c360-4264-961a-015b42cf709e" containerName="oc" Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.643528 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.673983 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2r8dm"]
Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.763751 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwcfr\" (UniqueName: \"kubernetes.io/projected/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-kube-api-access-mwcfr\") pod \"redhat-operators-2r8dm\" (UID: \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\") " pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.763927 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-utilities\") pod \"redhat-operators-2r8dm\" (UID: \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\") " pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.763961 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-catalog-content\") pod \"redhat-operators-2r8dm\" (UID: \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\") " pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.865458 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-utilities\") pod \"redhat-operators-2r8dm\" (UID: \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\") " pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.865523 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-catalog-content\") pod \"redhat-operators-2r8dm\" (UID: \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\") " pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.865657 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwcfr\" (UniqueName: \"kubernetes.io/projected/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-kube-api-access-mwcfr\") pod \"redhat-operators-2r8dm\" (UID: \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\") " pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.866068 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-utilities\") pod \"redhat-operators-2r8dm\" (UID: \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\") " pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.866132 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-catalog-content\") pod \"redhat-operators-2r8dm\" (UID: \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\") " pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.892932 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwcfr\" (UniqueName: \"kubernetes.io/projected/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-kube-api-access-mwcfr\") pod \"redhat-operators-2r8dm\" (UID: \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\") " pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:19 crc kubenswrapper[4790]: I0406 13:30:19.978520 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:20 crc kubenswrapper[4790]: I0406 13:30:20.417989 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2r8dm"]
Apr 06 13:30:20 crc kubenswrapper[4790]: W0406 13:30:20.432988 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9d474fb_ac2a_464f_ac12_36d3ef1f9529.slice/crio-36b2ac5b90249ed4dede3a640b6acb4559ac0c2aea7bb82ea3317b6da8ead8f7 WatchSource:0}: Error finding container 36b2ac5b90249ed4dede3a640b6acb4559ac0c2aea7bb82ea3317b6da8ead8f7: Status 404 returned error can't find the container with id 36b2ac5b90249ed4dede3a640b6acb4559ac0c2aea7bb82ea3317b6da8ead8f7
Apr 06 13:30:21 crc kubenswrapper[4790]: I0406 13:30:21.227711 4790 generic.go:334] "Generic (PLEG): container finished" podID="c9d474fb-ac2a-464f-ac12-36d3ef1f9529" containerID="2f0b30c6703ff4aa2f347a670a73f503331239537a9beb6d33a73f116a4c9ae1" exitCode=0
Apr 06 13:30:21 crc kubenswrapper[4790]: I0406 13:30:21.227854 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r8dm" event={"ID":"c9d474fb-ac2a-464f-ac12-36d3ef1f9529","Type":"ContainerDied","Data":"2f0b30c6703ff4aa2f347a670a73f503331239537a9beb6d33a73f116a4c9ae1"}
Apr 06 13:30:21 crc kubenswrapper[4790]: I0406 13:30:21.228236 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r8dm" event={"ID":"c9d474fb-ac2a-464f-ac12-36d3ef1f9529","Type":"ContainerStarted","Data":"36b2ac5b90249ed4dede3a640b6acb4559ac0c2aea7bb82ea3317b6da8ead8f7"}
Apr 06 13:30:23 crc kubenswrapper[4790]: I0406 13:30:23.247622 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r8dm" event={"ID":"c9d474fb-ac2a-464f-ac12-36d3ef1f9529","Type":"ContainerStarted","Data":"a0ea2b165992cdf15f38149109f50d983d1b1dbe10fd46bb49871937dbfac665"}
Apr 06 13:30:25 crc kubenswrapper[4790]: I0406 13:30:25.675892 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200"
Apr 06 13:30:25 crc kubenswrapper[4790]: E0406 13:30:25.676627 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:30:27 crc kubenswrapper[4790]: I0406 13:30:27.288291 4790 generic.go:334] "Generic (PLEG): container finished" podID="c9d474fb-ac2a-464f-ac12-36d3ef1f9529" containerID="a0ea2b165992cdf15f38149109f50d983d1b1dbe10fd46bb49871937dbfac665" exitCode=0
Apr 06 13:30:27 crc kubenswrapper[4790]: I0406 13:30:27.288371 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r8dm" event={"ID":"c9d474fb-ac2a-464f-ac12-36d3ef1f9529","Type":"ContainerDied","Data":"a0ea2b165992cdf15f38149109f50d983d1b1dbe10fd46bb49871937dbfac665"}
Apr 06 13:30:28 crc kubenswrapper[4790]: I0406 13:30:28.310900 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r8dm" event={"ID":"c9d474fb-ac2a-464f-ac12-36d3ef1f9529","Type":"ContainerStarted","Data":"0eae90a338bf8dcb0957ba579e2ab769a3c925cefa06b5290fdbfb6caf7e381f"}
Apr 06 13:30:28 crc kubenswrapper[4790]: I0406 13:30:28.354494 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2r8dm" podStartSLOduration=2.948228319 podStartE2EDuration="9.354446008s" podCreationTimestamp="2026-04-06 13:30:19 +0000 UTC" firstStartedPulling="2026-04-06 13:30:21.232748221 +0000 UTC m=+5600.220491097" lastFinishedPulling="2026-04-06 13:30:27.63896592 +0000 UTC m=+5606.626708786" observedRunningTime="2026-04-06 13:30:28.33316865 +0000 UTC m=+5607.320911536" watchObservedRunningTime="2026-04-06 13:30:28.354446008 +0000 UTC m=+5607.342188874"
Apr 06 13:30:29 crc kubenswrapper[4790]: I0406 13:30:29.979321 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:29 crc kubenswrapper[4790]: I0406 13:30:29.979753 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:31 crc kubenswrapper[4790]: I0406 13:30:31.103796 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2r8dm" podUID="c9d474fb-ac2a-464f-ac12-36d3ef1f9529" containerName="registry-server" probeResult="failure" output=<
Apr 06 13:30:31 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s
Apr 06 13:30:31 crc kubenswrapper[4790]: >
Apr 06 13:30:36 crc kubenswrapper[4790]: I0406 13:30:36.675882 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200"
Apr 06 13:30:36 crc kubenswrapper[4790]: E0406 13:30:36.676646 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:30:40 crc kubenswrapper[4790]: I0406 13:30:40.024128 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:40 crc kubenswrapper[4790]: I0406 13:30:40.072479 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:40 crc kubenswrapper[4790]: I0406 13:30:40.283655 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2r8dm"]
Apr 06 13:30:41 crc kubenswrapper[4790]: I0406 13:30:41.429360 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2r8dm" podUID="c9d474fb-ac2a-464f-ac12-36d3ef1f9529" containerName="registry-server" containerID="cri-o://0eae90a338bf8dcb0957ba579e2ab769a3c925cefa06b5290fdbfb6caf7e381f" gracePeriod=2
Apr 06 13:30:41 crc kubenswrapper[4790]: I0406 13:30:41.897977 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:41 crc kubenswrapper[4790]: I0406 13:30:41.925936 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwcfr\" (UniqueName: \"kubernetes.io/projected/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-kube-api-access-mwcfr\") pod \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\" (UID: \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\") "
Apr 06 13:30:41 crc kubenswrapper[4790]: I0406 13:30:41.926155 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-catalog-content\") pod \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\" (UID: \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\") "
Apr 06 13:30:41 crc kubenswrapper[4790]: I0406 13:30:41.926207 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-utilities\") pod \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\" (UID: \"c9d474fb-ac2a-464f-ac12-36d3ef1f9529\") "
Apr 06 13:30:41 crc kubenswrapper[4790]: I0406 13:30:41.927509 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-utilities" (OuterVolumeSpecName: "utilities") pod "c9d474fb-ac2a-464f-ac12-36d3ef1f9529" (UID: "c9d474fb-ac2a-464f-ac12-36d3ef1f9529"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:30:41 crc kubenswrapper[4790]: I0406 13:30:41.969445 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-kube-api-access-mwcfr" (OuterVolumeSpecName: "kube-api-access-mwcfr") pod "c9d474fb-ac2a-464f-ac12-36d3ef1f9529" (UID: "c9d474fb-ac2a-464f-ac12-36d3ef1f9529"). InnerVolumeSpecName "kube-api-access-mwcfr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.028054 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwcfr\" (UniqueName: \"kubernetes.io/projected/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-kube-api-access-mwcfr\") on node \"crc\" DevicePath \"\""
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.028087 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-utilities\") on node \"crc\" DevicePath \"\""
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.073449 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9d474fb-ac2a-464f-ac12-36d3ef1f9529" (UID: "c9d474fb-ac2a-464f-ac12-36d3ef1f9529"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.132400 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d474fb-ac2a-464f-ac12-36d3ef1f9529-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.440354 4790 generic.go:334] "Generic (PLEG): container finished" podID="c9d474fb-ac2a-464f-ac12-36d3ef1f9529" containerID="0eae90a338bf8dcb0957ba579e2ab769a3c925cefa06b5290fdbfb6caf7e381f" exitCode=0
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.440431 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2r8dm"
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.440444 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r8dm" event={"ID":"c9d474fb-ac2a-464f-ac12-36d3ef1f9529","Type":"ContainerDied","Data":"0eae90a338bf8dcb0957ba579e2ab769a3c925cefa06b5290fdbfb6caf7e381f"}
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.440882 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2r8dm" event={"ID":"c9d474fb-ac2a-464f-ac12-36d3ef1f9529","Type":"ContainerDied","Data":"36b2ac5b90249ed4dede3a640b6acb4559ac0c2aea7bb82ea3317b6da8ead8f7"}
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.440910 4790 scope.go:117] "RemoveContainer" containerID="0eae90a338bf8dcb0957ba579e2ab769a3c925cefa06b5290fdbfb6caf7e381f"
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.465305 4790 scope.go:117] "RemoveContainer" containerID="a0ea2b165992cdf15f38149109f50d983d1b1dbe10fd46bb49871937dbfac665"
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.483199 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2r8dm"]
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.492460 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2r8dm"]
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.503986 4790 scope.go:117] "RemoveContainer" containerID="2f0b30c6703ff4aa2f347a670a73f503331239537a9beb6d33a73f116a4c9ae1"
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.562634 4790 scope.go:117] "RemoveContainer" containerID="0eae90a338bf8dcb0957ba579e2ab769a3c925cefa06b5290fdbfb6caf7e381f"
Apr 06 13:30:42 crc kubenswrapper[4790]: E0406 13:30:42.563531 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eae90a338bf8dcb0957ba579e2ab769a3c925cefa06b5290fdbfb6caf7e381f\": container with ID starting with 0eae90a338bf8dcb0957ba579e2ab769a3c925cefa06b5290fdbfb6caf7e381f not found: ID does not exist" containerID="0eae90a338bf8dcb0957ba579e2ab769a3c925cefa06b5290fdbfb6caf7e381f"
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.563569 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eae90a338bf8dcb0957ba579e2ab769a3c925cefa06b5290fdbfb6caf7e381f"} err="failed to get container status \"0eae90a338bf8dcb0957ba579e2ab769a3c925cefa06b5290fdbfb6caf7e381f\": rpc error: code = NotFound desc = could not find container \"0eae90a338bf8dcb0957ba579e2ab769a3c925cefa06b5290fdbfb6caf7e381f\": container with ID starting with 0eae90a338bf8dcb0957ba579e2ab769a3c925cefa06b5290fdbfb6caf7e381f not found: ID does not exist"
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.563592 4790 scope.go:117] "RemoveContainer" containerID="a0ea2b165992cdf15f38149109f50d983d1b1dbe10fd46bb49871937dbfac665"
Apr 06 13:30:42 crc kubenswrapper[4790]: E0406 13:30:42.564298 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0ea2b165992cdf15f38149109f50d983d1b1dbe10fd46bb49871937dbfac665\": container with ID starting with a0ea2b165992cdf15f38149109f50d983d1b1dbe10fd46bb49871937dbfac665 not found: ID does not exist" containerID="a0ea2b165992cdf15f38149109f50d983d1b1dbe10fd46bb49871937dbfac665"
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.564320 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0ea2b165992cdf15f38149109f50d983d1b1dbe10fd46bb49871937dbfac665"} err="failed to get container status \"a0ea2b165992cdf15f38149109f50d983d1b1dbe10fd46bb49871937dbfac665\": rpc error: code = NotFound desc = could not find container \"a0ea2b165992cdf15f38149109f50d983d1b1dbe10fd46bb49871937dbfac665\": container with ID starting with a0ea2b165992cdf15f38149109f50d983d1b1dbe10fd46bb49871937dbfac665 not found: ID does not exist"
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.564334 4790 scope.go:117] "RemoveContainer" containerID="2f0b30c6703ff4aa2f347a670a73f503331239537a9beb6d33a73f116a4c9ae1"
Apr 06 13:30:42 crc kubenswrapper[4790]: E0406 13:30:42.564627 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0b30c6703ff4aa2f347a670a73f503331239537a9beb6d33a73f116a4c9ae1\": container with ID starting with 2f0b30c6703ff4aa2f347a670a73f503331239537a9beb6d33a73f116a4c9ae1 not found: ID does not exist" containerID="2f0b30c6703ff4aa2f347a670a73f503331239537a9beb6d33a73f116a4c9ae1"
Apr 06 13:30:42 crc kubenswrapper[4790]: I0406 13:30:42.564645 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0b30c6703ff4aa2f347a670a73f503331239537a9beb6d33a73f116a4c9ae1"} err="failed to get container status \"2f0b30c6703ff4aa2f347a670a73f503331239537a9beb6d33a73f116a4c9ae1\": rpc error: code = NotFound desc = could not find container \"2f0b30c6703ff4aa2f347a670a73f503331239537a9beb6d33a73f116a4c9ae1\": container with ID starting with 2f0b30c6703ff4aa2f347a670a73f503331239537a9beb6d33a73f116a4c9ae1 not found: ID does not exist"
Apr 06 13:30:43 crc kubenswrapper[4790]: I0406 13:30:43.688895 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9d474fb-ac2a-464f-ac12-36d3ef1f9529" path="/var/lib/kubelet/pods/c9d474fb-ac2a-464f-ac12-36d3ef1f9529/volumes"
Apr 06 13:30:49 crc kubenswrapper[4790]: I0406 13:30:49.675919 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200"
Apr 06 13:30:50 crc kubenswrapper[4790]: I0406 13:30:50.523522 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"8954244d49e01bb4bfdeaf3a239df96426fbeefa509328f41d7ff8bc8b4b5460"}
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.140306 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591372-xgz22"]
Apr 06 13:32:00 crc kubenswrapper[4790]: E0406 13:32:00.141343 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d474fb-ac2a-464f-ac12-36d3ef1f9529" containerName="registry-server"
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.141359 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d474fb-ac2a-464f-ac12-36d3ef1f9529" containerName="registry-server"
Apr 06 13:32:00 crc kubenswrapper[4790]: E0406 13:32:00.141378 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d474fb-ac2a-464f-ac12-36d3ef1f9529" containerName="extract-content"
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.141386 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d474fb-ac2a-464f-ac12-36d3ef1f9529" containerName="extract-content"
Apr 06 13:32:00 crc kubenswrapper[4790]: E0406 13:32:00.141415 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d474fb-ac2a-464f-ac12-36d3ef1f9529" containerName="extract-utilities"
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.141424 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d474fb-ac2a-464f-ac12-36d3ef1f9529" containerName="extract-utilities"
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.141689 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9d474fb-ac2a-464f-ac12-36d3ef1f9529" containerName="registry-server"
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.142414 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591372-xgz22"
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.145060 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.145740 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6"
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.145746 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.149548 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591372-xgz22"]
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.230482 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4z8k\" (UniqueName: \"kubernetes.io/projected/1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72-kube-api-access-n4z8k\") pod \"auto-csr-approver-29591372-xgz22\" (UID: \"1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72\") " pod="openshift-infra/auto-csr-approver-29591372-xgz22"
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.333111 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4z8k\" (UniqueName: \"kubernetes.io/projected/1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72-kube-api-access-n4z8k\") pod \"auto-csr-approver-29591372-xgz22\" (UID: \"1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72\") " pod="openshift-infra/auto-csr-approver-29591372-xgz22"
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.370983 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4z8k\" (UniqueName: \"kubernetes.io/projected/1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72-kube-api-access-n4z8k\") pod \"auto-csr-approver-29591372-xgz22\" (UID: \"1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72\") " pod="openshift-infra/auto-csr-approver-29591372-xgz22"
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.460542 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591372-xgz22"
Apr 06 13:32:00 crc kubenswrapper[4790]: I0406 13:32:00.895210 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591372-xgz22"]
Apr 06 13:32:01 crc kubenswrapper[4790]: I0406 13:32:01.228521 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591372-xgz22" event={"ID":"1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72","Type":"ContainerStarted","Data":"9f2803a4b6d6d679b74bc7ea39f0fed4087d0533957af88ef17a15bf7878ac6c"}
Apr 06 13:32:02 crc kubenswrapper[4790]: I0406 13:32:02.237497 4790 generic.go:334] "Generic (PLEG): container finished" podID="1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72" containerID="f398cc8a4a728a6a75abc400cf9865ee65b7670542e2f84a48e63582723f6ad4" exitCode=0
Apr 06 13:32:02 crc kubenswrapper[4790]: I0406 13:32:02.237550 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591372-xgz22" event={"ID":"1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72","Type":"ContainerDied","Data":"f398cc8a4a728a6a75abc400cf9865ee65b7670542e2f84a48e63582723f6ad4"}
Apr 06 13:32:03 crc kubenswrapper[4790]: I0406 13:32:03.619615 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591372-xgz22"
Apr 06 13:32:03 crc kubenswrapper[4790]: I0406 13:32:03.701645 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4z8k\" (UniqueName: \"kubernetes.io/projected/1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72-kube-api-access-n4z8k\") pod \"1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72\" (UID: \"1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72\") "
Apr 06 13:32:03 crc kubenswrapper[4790]: I0406 13:32:03.708528 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72-kube-api-access-n4z8k" (OuterVolumeSpecName: "kube-api-access-n4z8k") pod "1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72" (UID: "1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72"). InnerVolumeSpecName "kube-api-access-n4z8k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:32:03 crc kubenswrapper[4790]: I0406 13:32:03.805722 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4z8k\" (UniqueName: \"kubernetes.io/projected/1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72-kube-api-access-n4z8k\") on node \"crc\" DevicePath \"\""
Apr 06 13:32:04 crc kubenswrapper[4790]: I0406 13:32:04.261712 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591372-xgz22" event={"ID":"1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72","Type":"ContainerDied","Data":"9f2803a4b6d6d679b74bc7ea39f0fed4087d0533957af88ef17a15bf7878ac6c"}
Apr 06 13:32:04 crc kubenswrapper[4790]: I0406 13:32:04.261748 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f2803a4b6d6d679b74bc7ea39f0fed4087d0533957af88ef17a15bf7878ac6c"
Apr 06 13:32:04 crc kubenswrapper[4790]: I0406 13:32:04.261767 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591372-xgz22"
Apr 06 13:32:04 crc kubenswrapper[4790]: I0406 13:32:04.699306 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591366-jmb9d"]
Apr 06 13:32:04 crc kubenswrapper[4790]: I0406 13:32:04.708182 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591366-jmb9d"]
Apr 06 13:32:05 crc kubenswrapper[4790]: I0406 13:32:05.687910 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b60f78b-9dba-4317-991d-9fb9f6c9c5a5" path="/var/lib/kubelet/pods/4b60f78b-9dba-4317-991d-9fb9f6c9c5a5/volumes"
Apr 06 13:32:19 crc kubenswrapper[4790]: I0406 13:32:19.565298 4790 scope.go:117] "RemoveContainer" containerID="56ea50d8a5ce16b250f0d82840d50c865ad2d96990bc56e260b61f1885122e03"
Apr 06 13:32:22 crc kubenswrapper[4790]: I0406 13:32:22.644004 4790 generic.go:334] "Generic (PLEG): container finished" podID="c58fe7b4-f5be-433f-8390-67dd8a62e81b" containerID="3b224b5eb6b48dd29ac414b0ba77ad2f55e7a13fff00ae7f3439bd27bdc0a18e" exitCode=0
Apr 06 13:32:22 crc kubenswrapper[4790]: I0406 13:32:22.644312 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c58fe7b4-f5be-433f-8390-67dd8a62e81b","Type":"ContainerDied","Data":"3b224b5eb6b48dd29ac414b0ba77ad2f55e7a13fff00ae7f3439bd27bdc0a18e"}
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.045230 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.198117 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-ssh-key\") pod \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") "
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.198174 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c58fe7b4-f5be-433f-8390-67dd8a62e81b-test-operator-ephemeral-workdir\") pod \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") "
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.198219 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c58fe7b4-f5be-433f-8390-67dd8a62e81b-openstack-config\") pod \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") "
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.198237 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-openstack-config-secret\") pod \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") "
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.198256 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-ca-certs\") pod \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") "
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.198300 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c58fe7b4-f5be-433f-8390-67dd8a62e81b-test-operator-ephemeral-temporary\") pod \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") "
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.198376 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") "
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.198465 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c58fe7b4-f5be-433f-8390-67dd8a62e81b-config-data\") pod \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") "
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.198480 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d75w\" (UniqueName: \"kubernetes.io/projected/c58fe7b4-f5be-433f-8390-67dd8a62e81b-kube-api-access-2d75w\") pod \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\" (UID: \"c58fe7b4-f5be-433f-8390-67dd8a62e81b\") "
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.199163 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c58fe7b4-f5be-433f-8390-67dd8a62e81b-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "c58fe7b4-f5be-433f-8390-67dd8a62e81b" (UID: "c58fe7b4-f5be-433f-8390-67dd8a62e81b"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.199942 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58fe7b4-f5be-433f-8390-67dd8a62e81b-config-data" (OuterVolumeSpecName: "config-data") pod "c58fe7b4-f5be-433f-8390-67dd8a62e81b" (UID: "c58fe7b4-f5be-433f-8390-67dd8a62e81b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.203885 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "test-operator-logs") pod "c58fe7b4-f5be-433f-8390-67dd8a62e81b" (UID: "c58fe7b4-f5be-433f-8390-67dd8a62e81b"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.207129 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58fe7b4-f5be-433f-8390-67dd8a62e81b-kube-api-access-2d75w" (OuterVolumeSpecName: "kube-api-access-2d75w") pod "c58fe7b4-f5be-433f-8390-67dd8a62e81b" (UID: "c58fe7b4-f5be-433f-8390-67dd8a62e81b"). InnerVolumeSpecName "kube-api-access-2d75w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.207921 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c58fe7b4-f5be-433f-8390-67dd8a62e81b-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "c58fe7b4-f5be-433f-8390-67dd8a62e81b" (UID: "c58fe7b4-f5be-433f-8390-67dd8a62e81b"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.230580 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c58fe7b4-f5be-433f-8390-67dd8a62e81b" (UID: "c58fe7b4-f5be-433f-8390-67dd8a62e81b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.233968 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "c58fe7b4-f5be-433f-8390-67dd8a62e81b" (UID: "c58fe7b4-f5be-433f-8390-67dd8a62e81b"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.256965 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58fe7b4-f5be-433f-8390-67dd8a62e81b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c58fe7b4-f5be-433f-8390-67dd8a62e81b" (UID: "c58fe7b4-f5be-433f-8390-67dd8a62e81b"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.257030 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c58fe7b4-f5be-433f-8390-67dd8a62e81b" (UID: "c58fe7b4-f5be-433f-8390-67dd8a62e81b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.300349 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.300381 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c58fe7b4-f5be-433f-8390-67dd8a62e81b-config-data\") on node \"crc\" DevicePath \"\""
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.300392 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d75w\" (UniqueName: \"kubernetes.io/projected/c58fe7b4-f5be-433f-8390-67dd8a62e81b-kube-api-access-2d75w\") on node \"crc\" DevicePath \"\""
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.300402 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-ssh-key\") on node \"crc\" DevicePath \"\""
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.300411 4790 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c58fe7b4-f5be-433f-8390-67dd8a62e81b-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.300419 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c58fe7b4-f5be-433f-8390-67dd8a62e81b-openstack-config\") on node \"crc\" DevicePath \"\""
Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.300428 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Apr 06 13:32:24 crc kubenswrapper[4790]:
I0406 13:32:24.300436 4790 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c58fe7b4-f5be-433f-8390-67dd8a62e81b-ca-certs\") on node \"crc\" DevicePath \"\"" Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.300444 4790 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c58fe7b4-f5be-433f-8390-67dd8a62e81b-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.332064 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.402613 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.668913 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c58fe7b4-f5be-433f-8390-67dd8a62e81b","Type":"ContainerDied","Data":"ba408e81d04031aeaf4386063a5af55e690500a4f14ce64eaa6ec7a3b34d9d26"} Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.669004 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba408e81d04031aeaf4386063a5af55e690500a4f14ce64eaa6ec7a3b34d9d26" Apr 06 13:32:24 crc kubenswrapper[4790]: I0406 13:32:24.668943 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.180875 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Apr 06 13:32:36 crc kubenswrapper[4790]: E0406 13:32:36.182004 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72" containerName="oc" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.182025 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72" containerName="oc" Apr 06 13:32:36 crc kubenswrapper[4790]: E0406 13:32:36.182058 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58fe7b4-f5be-433f-8390-67dd8a62e81b" containerName="tempest-tests-tempest-tests-runner" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.182066 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58fe7b4-f5be-433f-8390-67dd8a62e81b" containerName="tempest-tests-tempest-tests-runner" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.182320 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58fe7b4-f5be-433f-8390-67dd8a62e81b" containerName="tempest-tests-tempest-tests-runner" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.182336 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72" containerName="oc" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.183212 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.185382 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-72kbb" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.190958 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.368154 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fe61003b-b427-4b3b-8af3-a4f9e0cf8605\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.368226 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2knsk\" (UniqueName: \"kubernetes.io/projected/fe61003b-b427-4b3b-8af3-a4f9e0cf8605-kube-api-access-2knsk\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fe61003b-b427-4b3b-8af3-a4f9e0cf8605\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.470488 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fe61003b-b427-4b3b-8af3-a4f9e0cf8605\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.470562 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2knsk\" (UniqueName: 
\"kubernetes.io/projected/fe61003b-b427-4b3b-8af3-a4f9e0cf8605-kube-api-access-2knsk\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fe61003b-b427-4b3b-8af3-a4f9e0cf8605\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.471128 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fe61003b-b427-4b3b-8af3-a4f9e0cf8605\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.492035 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2knsk\" (UniqueName: \"kubernetes.io/projected/fe61003b-b427-4b3b-8af3-a4f9e0cf8605-kube-api-access-2knsk\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fe61003b-b427-4b3b-8af3-a4f9e0cf8605\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.501259 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fe61003b-b427-4b3b-8af3-a4f9e0cf8605\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.511093 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.771721 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Apr 06 13:32:36 crc kubenswrapper[4790]: I0406 13:32:36.818709 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"fe61003b-b427-4b3b-8af3-a4f9e0cf8605","Type":"ContainerStarted","Data":"eed54aea6474bee73d4e1b6476eeddd8ce26e1439d217111a2cfd10b49c3b091"} Apr 06 13:32:38 crc kubenswrapper[4790]: I0406 13:32:38.840199 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"fe61003b-b427-4b3b-8af3-a4f9e0cf8605","Type":"ContainerStarted","Data":"ec7905bae62720bafb1a65d96eb39eddbd2272d18b554df0ac0a6feef1d5823d"} Apr 06 13:32:38 crc kubenswrapper[4790]: I0406 13:32:38.858467 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.139471928 podStartE2EDuration="2.858449799s" podCreationTimestamp="2026-04-06 13:32:36 +0000 UTC" firstStartedPulling="2026-04-06 13:32:36.788712805 +0000 UTC m=+5735.776455681" lastFinishedPulling="2026-04-06 13:32:38.507690686 +0000 UTC m=+5737.495433552" observedRunningTime="2026-04-06 13:32:38.853820166 +0000 UTC m=+5737.841563032" watchObservedRunningTime="2026-04-06 13:32:38.858449799 +0000 UTC m=+5737.846192655" Apr 06 13:33:06 crc kubenswrapper[4790]: I0406 13:33:06.452370 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zks8r/must-gather-78bwh"] Apr 06 13:33:06 crc kubenswrapper[4790]: I0406 13:33:06.464184 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zks8r/must-gather-78bwh" Apr 06 13:33:06 crc kubenswrapper[4790]: I0406 13:33:06.470345 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zks8r"/"openshift-service-ca.crt" Apr 06 13:33:06 crc kubenswrapper[4790]: I0406 13:33:06.470621 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zks8r"/"kube-root-ca.crt" Apr 06 13:33:06 crc kubenswrapper[4790]: I0406 13:33:06.470777 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zks8r"/"default-dockercfg-ks8tw" Apr 06 13:33:06 crc kubenswrapper[4790]: I0406 13:33:06.470781 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zks8r/must-gather-78bwh"] Apr 06 13:33:06 crc kubenswrapper[4790]: I0406 13:33:06.593156 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkzjf\" (UniqueName: \"kubernetes.io/projected/3a74e336-4367-4e9a-ac78-f69716c33ae0-kube-api-access-qkzjf\") pod \"must-gather-78bwh\" (UID: \"3a74e336-4367-4e9a-ac78-f69716c33ae0\") " pod="openshift-must-gather-zks8r/must-gather-78bwh" Apr 06 13:33:06 crc kubenswrapper[4790]: I0406 13:33:06.593463 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a74e336-4367-4e9a-ac78-f69716c33ae0-must-gather-output\") pod \"must-gather-78bwh\" (UID: \"3a74e336-4367-4e9a-ac78-f69716c33ae0\") " pod="openshift-must-gather-zks8r/must-gather-78bwh" Apr 06 13:33:06 crc kubenswrapper[4790]: I0406 13:33:06.695388 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkzjf\" (UniqueName: \"kubernetes.io/projected/3a74e336-4367-4e9a-ac78-f69716c33ae0-kube-api-access-qkzjf\") pod \"must-gather-78bwh\" (UID: \"3a74e336-4367-4e9a-ac78-f69716c33ae0\") " 
pod="openshift-must-gather-zks8r/must-gather-78bwh" Apr 06 13:33:06 crc kubenswrapper[4790]: I0406 13:33:06.695504 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a74e336-4367-4e9a-ac78-f69716c33ae0-must-gather-output\") pod \"must-gather-78bwh\" (UID: \"3a74e336-4367-4e9a-ac78-f69716c33ae0\") " pod="openshift-must-gather-zks8r/must-gather-78bwh" Apr 06 13:33:06 crc kubenswrapper[4790]: I0406 13:33:06.696022 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a74e336-4367-4e9a-ac78-f69716c33ae0-must-gather-output\") pod \"must-gather-78bwh\" (UID: \"3a74e336-4367-4e9a-ac78-f69716c33ae0\") " pod="openshift-must-gather-zks8r/must-gather-78bwh" Apr 06 13:33:06 crc kubenswrapper[4790]: I0406 13:33:06.718239 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkzjf\" (UniqueName: \"kubernetes.io/projected/3a74e336-4367-4e9a-ac78-f69716c33ae0-kube-api-access-qkzjf\") pod \"must-gather-78bwh\" (UID: \"3a74e336-4367-4e9a-ac78-f69716c33ae0\") " pod="openshift-must-gather-zks8r/must-gather-78bwh" Apr 06 13:33:06 crc kubenswrapper[4790]: I0406 13:33:06.784448 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zks8r/must-gather-78bwh" Apr 06 13:33:07 crc kubenswrapper[4790]: I0406 13:33:07.369734 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zks8r/must-gather-78bwh"] Apr 06 13:33:08 crc kubenswrapper[4790]: I0406 13:33:08.188445 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zks8r/must-gather-78bwh" event={"ID":"3a74e336-4367-4e9a-ac78-f69716c33ae0","Type":"ContainerStarted","Data":"2bdbef5bcb89a58c00a8b27b1899ff50e82f3b126124f98c37205fa6f5635d52"} Apr 06 13:33:09 crc kubenswrapper[4790]: I0406 13:33:09.753929 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:33:09 crc kubenswrapper[4790]: I0406 13:33:09.754341 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:33:14 crc kubenswrapper[4790]: I0406 13:33:14.270574 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zks8r/must-gather-78bwh" event={"ID":"3a74e336-4367-4e9a-ac78-f69716c33ae0","Type":"ContainerStarted","Data":"9420b74553a8fb668a561594fba02e0384d054b198843336c5adcf838ff617f8"} Apr 06 13:33:14 crc kubenswrapper[4790]: I0406 13:33:14.271088 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zks8r/must-gather-78bwh" event={"ID":"3a74e336-4367-4e9a-ac78-f69716c33ae0","Type":"ContainerStarted","Data":"0746c60c7eff69d7541a55c09ab3960b393d8fa7a9b1dcc2e06f992c712fb7dd"} Apr 06 13:33:14 crc 
kubenswrapper[4790]: I0406 13:33:14.324963 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zks8r/must-gather-78bwh" podStartSLOduration=2.438613694 podStartE2EDuration="8.32493838s" podCreationTimestamp="2026-04-06 13:33:06 +0000 UTC" firstStartedPulling="2026-04-06 13:33:07.37397066 +0000 UTC m=+5766.361713526" lastFinishedPulling="2026-04-06 13:33:13.260295346 +0000 UTC m=+5772.248038212" observedRunningTime="2026-04-06 13:33:14.317646445 +0000 UTC m=+5773.305389311" watchObservedRunningTime="2026-04-06 13:33:14.32493838 +0000 UTC m=+5773.312681246" Apr 06 13:33:17 crc kubenswrapper[4790]: I0406 13:33:17.735343 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zks8r/crc-debug-gtdss"] Apr 06 13:33:17 crc kubenswrapper[4790]: I0406 13:33:17.737669 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zks8r/crc-debug-gtdss" Apr 06 13:33:17 crc kubenswrapper[4790]: I0406 13:33:17.830434 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckn7j\" (UniqueName: \"kubernetes.io/projected/e2cc5ee6-1b20-486d-87bd-5727224adf2f-kube-api-access-ckn7j\") pod \"crc-debug-gtdss\" (UID: \"e2cc5ee6-1b20-486d-87bd-5727224adf2f\") " pod="openshift-must-gather-zks8r/crc-debug-gtdss" Apr 06 13:33:17 crc kubenswrapper[4790]: I0406 13:33:17.830918 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2cc5ee6-1b20-486d-87bd-5727224adf2f-host\") pod \"crc-debug-gtdss\" (UID: \"e2cc5ee6-1b20-486d-87bd-5727224adf2f\") " pod="openshift-must-gather-zks8r/crc-debug-gtdss" Apr 06 13:33:17 crc kubenswrapper[4790]: I0406 13:33:17.933328 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckn7j\" (UniqueName: 
\"kubernetes.io/projected/e2cc5ee6-1b20-486d-87bd-5727224adf2f-kube-api-access-ckn7j\") pod \"crc-debug-gtdss\" (UID: \"e2cc5ee6-1b20-486d-87bd-5727224adf2f\") " pod="openshift-must-gather-zks8r/crc-debug-gtdss" Apr 06 13:33:17 crc kubenswrapper[4790]: I0406 13:33:17.933477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2cc5ee6-1b20-486d-87bd-5727224adf2f-host\") pod \"crc-debug-gtdss\" (UID: \"e2cc5ee6-1b20-486d-87bd-5727224adf2f\") " pod="openshift-must-gather-zks8r/crc-debug-gtdss" Apr 06 13:33:17 crc kubenswrapper[4790]: I0406 13:33:17.933636 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2cc5ee6-1b20-486d-87bd-5727224adf2f-host\") pod \"crc-debug-gtdss\" (UID: \"e2cc5ee6-1b20-486d-87bd-5727224adf2f\") " pod="openshift-must-gather-zks8r/crc-debug-gtdss" Apr 06 13:33:17 crc kubenswrapper[4790]: I0406 13:33:17.964630 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckn7j\" (UniqueName: \"kubernetes.io/projected/e2cc5ee6-1b20-486d-87bd-5727224adf2f-kube-api-access-ckn7j\") pod \"crc-debug-gtdss\" (UID: \"e2cc5ee6-1b20-486d-87bd-5727224adf2f\") " pod="openshift-must-gather-zks8r/crc-debug-gtdss" Apr 06 13:33:18 crc kubenswrapper[4790]: I0406 13:33:18.066225 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zks8r/crc-debug-gtdss" Apr 06 13:33:18 crc kubenswrapper[4790]: W0406 13:33:18.097433 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2cc5ee6_1b20_486d_87bd_5727224adf2f.slice/crio-75f11f785a6025c935afed335525e250a64a2a029d82553402013a91d7c2a7fc WatchSource:0}: Error finding container 75f11f785a6025c935afed335525e250a64a2a029d82553402013a91d7c2a7fc: Status 404 returned error can't find the container with id 75f11f785a6025c935afed335525e250a64a2a029d82553402013a91d7c2a7fc Apr 06 13:33:18 crc kubenswrapper[4790]: I0406 13:33:18.305840 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zks8r/crc-debug-gtdss" event={"ID":"e2cc5ee6-1b20-486d-87bd-5727224adf2f","Type":"ContainerStarted","Data":"75f11f785a6025c935afed335525e250a64a2a029d82553402013a91d7c2a7fc"} Apr 06 13:33:29 crc kubenswrapper[4790]: I0406 13:33:29.425662 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zks8r/crc-debug-gtdss" event={"ID":"e2cc5ee6-1b20-486d-87bd-5727224adf2f","Type":"ContainerStarted","Data":"07cddb31488b04e7875619ad5b2ebf5f67fd1da3b2750a89b7e8ed50a94444f5"} Apr 06 13:33:29 crc kubenswrapper[4790]: I0406 13:33:29.443678 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zks8r/crc-debug-gtdss" podStartSLOduration=2.014986183 podStartE2EDuration="12.443661828s" podCreationTimestamp="2026-04-06 13:33:17 +0000 UTC" firstStartedPulling="2026-04-06 13:33:18.099282831 +0000 UTC m=+5777.087025697" lastFinishedPulling="2026-04-06 13:33:28.527958476 +0000 UTC m=+5787.515701342" observedRunningTime="2026-04-06 13:33:29.439488756 +0000 UTC m=+5788.427231622" watchObservedRunningTime="2026-04-06 13:33:29.443661828 +0000 UTC m=+5788.431404694" Apr 06 13:33:39 crc kubenswrapper[4790]: I0406 13:33:39.753416 4790 patch_prober.go:28] interesting 
pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:33:39 crc kubenswrapper[4790]: I0406 13:33:39.755149 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:34:00 crc kubenswrapper[4790]: I0406 13:34:00.144546 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591374-25knm"] Apr 06 13:34:00 crc kubenswrapper[4790]: I0406 13:34:00.147248 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591374-25knm" Apr 06 13:34:00 crc kubenswrapper[4790]: I0406 13:34:00.149587 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:34:00 crc kubenswrapper[4790]: I0406 13:34:00.149682 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:34:00 crc kubenswrapper[4790]: I0406 13:34:00.149871 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:34:00 crc kubenswrapper[4790]: I0406 13:34:00.168316 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591374-25knm"] Apr 06 13:34:00 crc kubenswrapper[4790]: I0406 13:34:00.283624 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52knk\" (UniqueName: 
\"kubernetes.io/projected/8916c697-9830-4c0d-9faf-5d5f9195d5b1-kube-api-access-52knk\") pod \"auto-csr-approver-29591374-25knm\" (UID: \"8916c697-9830-4c0d-9faf-5d5f9195d5b1\") " pod="openshift-infra/auto-csr-approver-29591374-25knm" Apr 06 13:34:00 crc kubenswrapper[4790]: I0406 13:34:00.385444 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52knk\" (UniqueName: \"kubernetes.io/projected/8916c697-9830-4c0d-9faf-5d5f9195d5b1-kube-api-access-52knk\") pod \"auto-csr-approver-29591374-25knm\" (UID: \"8916c697-9830-4c0d-9faf-5d5f9195d5b1\") " pod="openshift-infra/auto-csr-approver-29591374-25knm" Apr 06 13:34:00 crc kubenswrapper[4790]: I0406 13:34:00.410223 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52knk\" (UniqueName: \"kubernetes.io/projected/8916c697-9830-4c0d-9faf-5d5f9195d5b1-kube-api-access-52knk\") pod \"auto-csr-approver-29591374-25knm\" (UID: \"8916c697-9830-4c0d-9faf-5d5f9195d5b1\") " pod="openshift-infra/auto-csr-approver-29591374-25knm" Apr 06 13:34:00 crc kubenswrapper[4790]: I0406 13:34:00.479709 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591374-25knm" Apr 06 13:34:01 crc kubenswrapper[4790]: I0406 13:34:01.025875 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591374-25knm"] Apr 06 13:34:01 crc kubenswrapper[4790]: I0406 13:34:01.031999 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 13:34:01 crc kubenswrapper[4790]: I0406 13:34:01.756679 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591374-25knm" event={"ID":"8916c697-9830-4c0d-9faf-5d5f9195d5b1","Type":"ContainerStarted","Data":"6957173cbe3fea46bde9b50d0918f4136d65c20581d22407e63317e300d7e36f"} Apr 06 13:34:02 crc kubenswrapper[4790]: I0406 13:34:02.770480 4790 generic.go:334] "Generic (PLEG): container finished" podID="8916c697-9830-4c0d-9faf-5d5f9195d5b1" containerID="95e9b1685219b458c1bb9d787a1aa88cda6a3816fa63a3fe26cdb21750f367d8" exitCode=0 Apr 06 13:34:02 crc kubenswrapper[4790]: I0406 13:34:02.770549 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591374-25knm" event={"ID":"8916c697-9830-4c0d-9faf-5d5f9195d5b1","Type":"ContainerDied","Data":"95e9b1685219b458c1bb9d787a1aa88cda6a3816fa63a3fe26cdb21750f367d8"} Apr 06 13:34:04 crc kubenswrapper[4790]: I0406 13:34:04.165994 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591374-25knm" Apr 06 13:34:04 crc kubenswrapper[4790]: I0406 13:34:04.171649 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52knk\" (UniqueName: \"kubernetes.io/projected/8916c697-9830-4c0d-9faf-5d5f9195d5b1-kube-api-access-52knk\") pod \"8916c697-9830-4c0d-9faf-5d5f9195d5b1\" (UID: \"8916c697-9830-4c0d-9faf-5d5f9195d5b1\") " Apr 06 13:34:04 crc kubenswrapper[4790]: I0406 13:34:04.189976 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8916c697-9830-4c0d-9faf-5d5f9195d5b1-kube-api-access-52knk" (OuterVolumeSpecName: "kube-api-access-52knk") pod "8916c697-9830-4c0d-9faf-5d5f9195d5b1" (UID: "8916c697-9830-4c0d-9faf-5d5f9195d5b1"). InnerVolumeSpecName "kube-api-access-52knk". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:34:04 crc kubenswrapper[4790]: I0406 13:34:04.274340 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52knk\" (UniqueName: \"kubernetes.io/projected/8916c697-9830-4c0d-9faf-5d5f9195d5b1-kube-api-access-52knk\") on node \"crc\" DevicePath \"\"" Apr 06 13:34:04 crc kubenswrapper[4790]: I0406 13:34:04.808576 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591374-25knm" event={"ID":"8916c697-9830-4c0d-9faf-5d5f9195d5b1","Type":"ContainerDied","Data":"6957173cbe3fea46bde9b50d0918f4136d65c20581d22407e63317e300d7e36f"} Apr 06 13:34:04 crc kubenswrapper[4790]: I0406 13:34:04.808638 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6957173cbe3fea46bde9b50d0918f4136d65c20581d22407e63317e300d7e36f" Apr 06 13:34:04 crc kubenswrapper[4790]: I0406 13:34:04.808725 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591374-25knm" Apr 06 13:34:05 crc kubenswrapper[4790]: I0406 13:34:05.240488 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591368-7jnfp"] Apr 06 13:34:05 crc kubenswrapper[4790]: I0406 13:34:05.252617 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591368-7jnfp"] Apr 06 13:34:05 crc kubenswrapper[4790]: I0406 13:34:05.688037 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a568713-745e-4682-9f81-c00c827bdb56" path="/var/lib/kubelet/pods/3a568713-745e-4682-9f81-c00c827bdb56/volumes" Apr 06 13:34:09 crc kubenswrapper[4790]: I0406 13:34:09.753127 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:34:09 crc kubenswrapper[4790]: I0406 13:34:09.753468 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:34:09 crc kubenswrapper[4790]: I0406 13:34:09.753511 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 13:34:09 crc kubenswrapper[4790]: I0406 13:34:09.754244 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8954244d49e01bb4bfdeaf3a239df96426fbeefa509328f41d7ff8bc8b4b5460"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 13:34:09 crc kubenswrapper[4790]: I0406 13:34:09.754287 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://8954244d49e01bb4bfdeaf3a239df96426fbeefa509328f41d7ff8bc8b4b5460" gracePeriod=600 Apr 06 13:34:10 crc kubenswrapper[4790]: I0406 13:34:10.863465 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="8954244d49e01bb4bfdeaf3a239df96426fbeefa509328f41d7ff8bc8b4b5460" exitCode=0 Apr 06 13:34:10 crc kubenswrapper[4790]: I0406 13:34:10.863995 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"8954244d49e01bb4bfdeaf3a239df96426fbeefa509328f41d7ff8bc8b4b5460"} Apr 06 13:34:10 crc kubenswrapper[4790]: I0406 13:34:10.865166 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"} Apr 06 13:34:10 crc kubenswrapper[4790]: I0406 13:34:10.865198 4790 scope.go:117] "RemoveContainer" containerID="68bf0cc180a41cdb44bd66d8864b44cd6dc90809ec6d75d9c54dc1f64162a200" Apr 06 13:34:12 crc kubenswrapper[4790]: I0406 13:34:12.888440 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2cc5ee6-1b20-486d-87bd-5727224adf2f" containerID="07cddb31488b04e7875619ad5b2ebf5f67fd1da3b2750a89b7e8ed50a94444f5" exitCode=0 Apr 06 13:34:12 crc kubenswrapper[4790]: I0406 13:34:12.888969 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-zks8r/crc-debug-gtdss" event={"ID":"e2cc5ee6-1b20-486d-87bd-5727224adf2f","Type":"ContainerDied","Data":"07cddb31488b04e7875619ad5b2ebf5f67fd1da3b2750a89b7e8ed50a94444f5"} Apr 06 13:34:14 crc kubenswrapper[4790]: I0406 13:34:14.022375 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zks8r/crc-debug-gtdss" Apr 06 13:34:14 crc kubenswrapper[4790]: I0406 13:34:14.060086 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zks8r/crc-debug-gtdss"] Apr 06 13:34:14 crc kubenswrapper[4790]: I0406 13:34:14.069630 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zks8r/crc-debug-gtdss"] Apr 06 13:34:14 crc kubenswrapper[4790]: I0406 13:34:14.178763 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckn7j\" (UniqueName: \"kubernetes.io/projected/e2cc5ee6-1b20-486d-87bd-5727224adf2f-kube-api-access-ckn7j\") pod \"e2cc5ee6-1b20-486d-87bd-5727224adf2f\" (UID: \"e2cc5ee6-1b20-486d-87bd-5727224adf2f\") " Apr 06 13:34:14 crc kubenswrapper[4790]: I0406 13:34:14.179393 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2cc5ee6-1b20-486d-87bd-5727224adf2f-host\") pod \"e2cc5ee6-1b20-486d-87bd-5727224adf2f\" (UID: \"e2cc5ee6-1b20-486d-87bd-5727224adf2f\") " Apr 06 13:34:14 crc kubenswrapper[4790]: I0406 13:34:14.179583 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2cc5ee6-1b20-486d-87bd-5727224adf2f-host" (OuterVolumeSpecName: "host") pod "e2cc5ee6-1b20-486d-87bd-5727224adf2f" (UID: "e2cc5ee6-1b20-486d-87bd-5727224adf2f"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 13:34:14 crc kubenswrapper[4790]: I0406 13:34:14.180572 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2cc5ee6-1b20-486d-87bd-5727224adf2f-host\") on node \"crc\" DevicePath \"\"" Apr 06 13:34:14 crc kubenswrapper[4790]: I0406 13:34:14.200072 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2cc5ee6-1b20-486d-87bd-5727224adf2f-kube-api-access-ckn7j" (OuterVolumeSpecName: "kube-api-access-ckn7j") pod "e2cc5ee6-1b20-486d-87bd-5727224adf2f" (UID: "e2cc5ee6-1b20-486d-87bd-5727224adf2f"). InnerVolumeSpecName "kube-api-access-ckn7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:34:14 crc kubenswrapper[4790]: I0406 13:34:14.282812 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckn7j\" (UniqueName: \"kubernetes.io/projected/e2cc5ee6-1b20-486d-87bd-5727224adf2f-kube-api-access-ckn7j\") on node \"crc\" DevicePath \"\"" Apr 06 13:34:14 crc kubenswrapper[4790]: I0406 13:34:14.910344 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75f11f785a6025c935afed335525e250a64a2a029d82553402013a91d7c2a7fc" Apr 06 13:34:14 crc kubenswrapper[4790]: I0406 13:34:14.910413 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zks8r/crc-debug-gtdss" Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.236021 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zks8r/crc-debug-z7hsk"] Apr 06 13:34:15 crc kubenswrapper[4790]: E0406 13:34:15.236457 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8916c697-9830-4c0d-9faf-5d5f9195d5b1" containerName="oc" Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.236471 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8916c697-9830-4c0d-9faf-5d5f9195d5b1" containerName="oc" Apr 06 13:34:15 crc kubenswrapper[4790]: E0406 13:34:15.236496 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cc5ee6-1b20-486d-87bd-5727224adf2f" containerName="container-00" Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.236503 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cc5ee6-1b20-486d-87bd-5727224adf2f" containerName="container-00" Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.236709 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8916c697-9830-4c0d-9faf-5d5f9195d5b1" containerName="oc" Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.236726 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2cc5ee6-1b20-486d-87bd-5727224adf2f" containerName="container-00" Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.237390 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zks8r/crc-debug-z7hsk" Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.406516 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs4rc\" (UniqueName: \"kubernetes.io/projected/adb07307-bf6c-4736-bfb1-d4d21a20815d-kube-api-access-vs4rc\") pod \"crc-debug-z7hsk\" (UID: \"adb07307-bf6c-4736-bfb1-d4d21a20815d\") " pod="openshift-must-gather-zks8r/crc-debug-z7hsk" Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.406560 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adb07307-bf6c-4736-bfb1-d4d21a20815d-host\") pod \"crc-debug-z7hsk\" (UID: \"adb07307-bf6c-4736-bfb1-d4d21a20815d\") " pod="openshift-must-gather-zks8r/crc-debug-z7hsk" Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.508248 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs4rc\" (UniqueName: \"kubernetes.io/projected/adb07307-bf6c-4736-bfb1-d4d21a20815d-kube-api-access-vs4rc\") pod \"crc-debug-z7hsk\" (UID: \"adb07307-bf6c-4736-bfb1-d4d21a20815d\") " pod="openshift-must-gather-zks8r/crc-debug-z7hsk" Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.508598 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adb07307-bf6c-4736-bfb1-d4d21a20815d-host\") pod \"crc-debug-z7hsk\" (UID: \"adb07307-bf6c-4736-bfb1-d4d21a20815d\") " pod="openshift-must-gather-zks8r/crc-debug-z7hsk" Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.508705 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adb07307-bf6c-4736-bfb1-d4d21a20815d-host\") pod \"crc-debug-z7hsk\" (UID: \"adb07307-bf6c-4736-bfb1-d4d21a20815d\") " pod="openshift-must-gather-zks8r/crc-debug-z7hsk" Apr 06 13:34:15 crc 
kubenswrapper[4790]: I0406 13:34:15.529254 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs4rc\" (UniqueName: \"kubernetes.io/projected/adb07307-bf6c-4736-bfb1-d4d21a20815d-kube-api-access-vs4rc\") pod \"crc-debug-z7hsk\" (UID: \"adb07307-bf6c-4736-bfb1-d4d21a20815d\") " pod="openshift-must-gather-zks8r/crc-debug-z7hsk" Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.555087 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zks8r/crc-debug-z7hsk" Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.701129 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2cc5ee6-1b20-486d-87bd-5727224adf2f" path="/var/lib/kubelet/pods/e2cc5ee6-1b20-486d-87bd-5727224adf2f/volumes" Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.920507 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zks8r/crc-debug-z7hsk" event={"ID":"adb07307-bf6c-4736-bfb1-d4d21a20815d","Type":"ContainerStarted","Data":"c2ebaf01f6c99e530c4383ca9297587ce3794532cc9cfac0346bb0276e1a0700"} Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.920865 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zks8r/crc-debug-z7hsk" event={"ID":"adb07307-bf6c-4736-bfb1-d4d21a20815d","Type":"ContainerStarted","Data":"3a3d9474e1e6eeb08ead77d0ab5e574413983009062764d9cc09f03623b7efb1"} Apr 06 13:34:15 crc kubenswrapper[4790]: I0406 13:34:15.944750 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zks8r/crc-debug-z7hsk" podStartSLOduration=0.944729273 podStartE2EDuration="944.729273ms" podCreationTimestamp="2026-04-06 13:34:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 13:34:15.937044707 +0000 UTC m=+5834.924787573" watchObservedRunningTime="2026-04-06 13:34:15.944729273 +0000 
UTC m=+5834.932472139" Apr 06 13:34:16 crc kubenswrapper[4790]: I0406 13:34:16.938443 4790 generic.go:334] "Generic (PLEG): container finished" podID="adb07307-bf6c-4736-bfb1-d4d21a20815d" containerID="c2ebaf01f6c99e530c4383ca9297587ce3794532cc9cfac0346bb0276e1a0700" exitCode=0 Apr 06 13:34:16 crc kubenswrapper[4790]: I0406 13:34:16.938681 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zks8r/crc-debug-z7hsk" event={"ID":"adb07307-bf6c-4736-bfb1-d4d21a20815d","Type":"ContainerDied","Data":"c2ebaf01f6c99e530c4383ca9297587ce3794532cc9cfac0346bb0276e1a0700"} Apr 06 13:34:18 crc kubenswrapper[4790]: I0406 13:34:18.090235 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zks8r/crc-debug-z7hsk" Apr 06 13:34:18 crc kubenswrapper[4790]: I0406 13:34:18.268646 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adb07307-bf6c-4736-bfb1-d4d21a20815d-host\") pod \"adb07307-bf6c-4736-bfb1-d4d21a20815d\" (UID: \"adb07307-bf6c-4736-bfb1-d4d21a20815d\") " Apr 06 13:34:18 crc kubenswrapper[4790]: I0406 13:34:18.268706 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adb07307-bf6c-4736-bfb1-d4d21a20815d-host" (OuterVolumeSpecName: "host") pod "adb07307-bf6c-4736-bfb1-d4d21a20815d" (UID: "adb07307-bf6c-4736-bfb1-d4d21a20815d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 13:34:18 crc kubenswrapper[4790]: I0406 13:34:18.268854 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs4rc\" (UniqueName: \"kubernetes.io/projected/adb07307-bf6c-4736-bfb1-d4d21a20815d-kube-api-access-vs4rc\") pod \"adb07307-bf6c-4736-bfb1-d4d21a20815d\" (UID: \"adb07307-bf6c-4736-bfb1-d4d21a20815d\") " Apr 06 13:34:18 crc kubenswrapper[4790]: I0406 13:34:18.269722 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adb07307-bf6c-4736-bfb1-d4d21a20815d-host\") on node \"crc\" DevicePath \"\"" Apr 06 13:34:18 crc kubenswrapper[4790]: I0406 13:34:18.277486 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb07307-bf6c-4736-bfb1-d4d21a20815d-kube-api-access-vs4rc" (OuterVolumeSpecName: "kube-api-access-vs4rc") pod "adb07307-bf6c-4736-bfb1-d4d21a20815d" (UID: "adb07307-bf6c-4736-bfb1-d4d21a20815d"). InnerVolumeSpecName "kube-api-access-vs4rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:34:18 crc kubenswrapper[4790]: I0406 13:34:18.389046 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs4rc\" (UniqueName: \"kubernetes.io/projected/adb07307-bf6c-4736-bfb1-d4d21a20815d-kube-api-access-vs4rc\") on node \"crc\" DevicePath \"\"" Apr 06 13:34:18 crc kubenswrapper[4790]: I0406 13:34:18.725792 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zks8r/crc-debug-z7hsk"] Apr 06 13:34:18 crc kubenswrapper[4790]: I0406 13:34:18.735710 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zks8r/crc-debug-z7hsk"] Apr 06 13:34:18 crc kubenswrapper[4790]: I0406 13:34:18.961415 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a3d9474e1e6eeb08ead77d0ab5e574413983009062764d9cc09f03623b7efb1" Apr 06 13:34:18 crc kubenswrapper[4790]: I0406 13:34:18.961490 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zks8r/crc-debug-z7hsk" Apr 06 13:34:19 crc kubenswrapper[4790]: I0406 13:34:19.678603 4790 scope.go:117] "RemoveContainer" containerID="57fe435b53f4156ba6ae3bbffc8b50b940c5bf26a317d7e690acbcf01998df4b" Apr 06 13:34:19 crc kubenswrapper[4790]: I0406 13:34:19.697008 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adb07307-bf6c-4736-bfb1-d4d21a20815d" path="/var/lib/kubelet/pods/adb07307-bf6c-4736-bfb1-d4d21a20815d/volumes" Apr 06 13:34:19 crc kubenswrapper[4790]: I0406 13:34:19.940863 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zks8r/crc-debug-8nj5n"] Apr 06 13:34:19 crc kubenswrapper[4790]: E0406 13:34:19.941280 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb07307-bf6c-4736-bfb1-d4d21a20815d" containerName="container-00" Apr 06 13:34:19 crc kubenswrapper[4790]: I0406 13:34:19.941300 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb07307-bf6c-4736-bfb1-d4d21a20815d" containerName="container-00" Apr 06 13:34:19 crc kubenswrapper[4790]: I0406 13:34:19.941548 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="adb07307-bf6c-4736-bfb1-d4d21a20815d" containerName="container-00" Apr 06 13:34:19 crc kubenswrapper[4790]: I0406 13:34:19.942515 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zks8r/crc-debug-8nj5n" Apr 06 13:34:20 crc kubenswrapper[4790]: I0406 13:34:20.123663 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4008298d-2537-40af-ae5d-48111632a8c8-host\") pod \"crc-debug-8nj5n\" (UID: \"4008298d-2537-40af-ae5d-48111632a8c8\") " pod="openshift-must-gather-zks8r/crc-debug-8nj5n" Apr 06 13:34:20 crc kubenswrapper[4790]: I0406 13:34:20.124192 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kr2j\" (UniqueName: \"kubernetes.io/projected/4008298d-2537-40af-ae5d-48111632a8c8-kube-api-access-8kr2j\") pod \"crc-debug-8nj5n\" (UID: \"4008298d-2537-40af-ae5d-48111632a8c8\") " pod="openshift-must-gather-zks8r/crc-debug-8nj5n" Apr 06 13:34:20 crc kubenswrapper[4790]: I0406 13:34:20.226380 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kr2j\" (UniqueName: \"kubernetes.io/projected/4008298d-2537-40af-ae5d-48111632a8c8-kube-api-access-8kr2j\") pod \"crc-debug-8nj5n\" (UID: \"4008298d-2537-40af-ae5d-48111632a8c8\") " pod="openshift-must-gather-zks8r/crc-debug-8nj5n" Apr 06 13:34:20 crc kubenswrapper[4790]: I0406 13:34:20.226820 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4008298d-2537-40af-ae5d-48111632a8c8-host\") pod \"crc-debug-8nj5n\" (UID: \"4008298d-2537-40af-ae5d-48111632a8c8\") " pod="openshift-must-gather-zks8r/crc-debug-8nj5n" Apr 06 13:34:20 crc kubenswrapper[4790]: I0406 13:34:20.226987 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4008298d-2537-40af-ae5d-48111632a8c8-host\") pod \"crc-debug-8nj5n\" (UID: \"4008298d-2537-40af-ae5d-48111632a8c8\") " pod="openshift-must-gather-zks8r/crc-debug-8nj5n" Apr 06 13:34:20 crc 
kubenswrapper[4790]: I0406 13:34:20.263183 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kr2j\" (UniqueName: \"kubernetes.io/projected/4008298d-2537-40af-ae5d-48111632a8c8-kube-api-access-8kr2j\") pod \"crc-debug-8nj5n\" (UID: \"4008298d-2537-40af-ae5d-48111632a8c8\") " pod="openshift-must-gather-zks8r/crc-debug-8nj5n" Apr 06 13:34:20 crc kubenswrapper[4790]: I0406 13:34:20.556985 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zks8r/crc-debug-8nj5n" Apr 06 13:34:20 crc kubenswrapper[4790]: I0406 13:34:20.980802 4790 generic.go:334] "Generic (PLEG): container finished" podID="4008298d-2537-40af-ae5d-48111632a8c8" containerID="08ae41c10e84810ff8e1ad38a3af33e2dbd8627252beeffdb3e85661204f1f71" exitCode=0 Apr 06 13:34:20 crc kubenswrapper[4790]: I0406 13:34:20.980872 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zks8r/crc-debug-8nj5n" event={"ID":"4008298d-2537-40af-ae5d-48111632a8c8","Type":"ContainerDied","Data":"08ae41c10e84810ff8e1ad38a3af33e2dbd8627252beeffdb3e85661204f1f71"} Apr 06 13:34:20 crc kubenswrapper[4790]: I0406 13:34:20.981170 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zks8r/crc-debug-8nj5n" event={"ID":"4008298d-2537-40af-ae5d-48111632a8c8","Type":"ContainerStarted","Data":"26548c696670161f420447ee99d5e5d5c8e9ce13d63bd21cdc423cc89cc7caea"} Apr 06 13:34:21 crc kubenswrapper[4790]: I0406 13:34:21.028181 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zks8r/crc-debug-8nj5n"] Apr 06 13:34:21 crc kubenswrapper[4790]: I0406 13:34:21.039193 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zks8r/crc-debug-8nj5n"] Apr 06 13:34:22 crc kubenswrapper[4790]: I0406 13:34:22.107088 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zks8r/crc-debug-8nj5n" Apr 06 13:34:22 crc kubenswrapper[4790]: I0406 13:34:22.273796 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kr2j\" (UniqueName: \"kubernetes.io/projected/4008298d-2537-40af-ae5d-48111632a8c8-kube-api-access-8kr2j\") pod \"4008298d-2537-40af-ae5d-48111632a8c8\" (UID: \"4008298d-2537-40af-ae5d-48111632a8c8\") " Apr 06 13:34:22 crc kubenswrapper[4790]: I0406 13:34:22.274200 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4008298d-2537-40af-ae5d-48111632a8c8-host\") pod \"4008298d-2537-40af-ae5d-48111632a8c8\" (UID: \"4008298d-2537-40af-ae5d-48111632a8c8\") " Apr 06 13:34:22 crc kubenswrapper[4790]: I0406 13:34:22.274277 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4008298d-2537-40af-ae5d-48111632a8c8-host" (OuterVolumeSpecName: "host") pod "4008298d-2537-40af-ae5d-48111632a8c8" (UID: "4008298d-2537-40af-ae5d-48111632a8c8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 13:34:22 crc kubenswrapper[4790]: I0406 13:34:22.275399 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4008298d-2537-40af-ae5d-48111632a8c8-host\") on node \"crc\" DevicePath \"\"" Apr 06 13:34:22 crc kubenswrapper[4790]: I0406 13:34:22.279880 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4008298d-2537-40af-ae5d-48111632a8c8-kube-api-access-8kr2j" (OuterVolumeSpecName: "kube-api-access-8kr2j") pod "4008298d-2537-40af-ae5d-48111632a8c8" (UID: "4008298d-2537-40af-ae5d-48111632a8c8"). InnerVolumeSpecName "kube-api-access-8kr2j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:34:22 crc kubenswrapper[4790]: I0406 13:34:22.376501 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kr2j\" (UniqueName: \"kubernetes.io/projected/4008298d-2537-40af-ae5d-48111632a8c8-kube-api-access-8kr2j\") on node \"crc\" DevicePath \"\"" Apr 06 13:34:23 crc kubenswrapper[4790]: I0406 13:34:23.000754 4790 scope.go:117] "RemoveContainer" containerID="08ae41c10e84810ff8e1ad38a3af33e2dbd8627252beeffdb3e85661204f1f71" Apr 06 13:34:23 crc kubenswrapper[4790]: I0406 13:34:23.000908 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zks8r/crc-debug-8nj5n" Apr 06 13:34:23 crc kubenswrapper[4790]: I0406 13:34:23.704320 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4008298d-2537-40af-ae5d-48111632a8c8" path="/var/lib/kubelet/pods/4008298d-2537-40af-ae5d-48111632a8c8/volumes" Apr 06 13:34:49 crc kubenswrapper[4790]: I0406 13:34:49.762301 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d5qfp"] Apr 06 13:34:49 crc kubenswrapper[4790]: E0406 13:34:49.763431 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4008298d-2537-40af-ae5d-48111632a8c8" containerName="container-00" Apr 06 13:34:49 crc kubenswrapper[4790]: I0406 13:34:49.763448 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4008298d-2537-40af-ae5d-48111632a8c8" containerName="container-00" Apr 06 13:34:49 crc kubenswrapper[4790]: I0406 13:34:49.763705 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4008298d-2537-40af-ae5d-48111632a8c8" containerName="container-00" Apr 06 13:34:49 crc kubenswrapper[4790]: I0406 13:34:49.765593 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:34:49 crc kubenswrapper[4790]: I0406 13:34:49.778654 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d5qfp"] Apr 06 13:34:49 crc kubenswrapper[4790]: I0406 13:34:49.934699 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4158ed-b160-4908-9ee0-a97b76d70717-utilities\") pod \"certified-operators-d5qfp\" (UID: \"3f4158ed-b160-4908-9ee0-a97b76d70717\") " pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:34:49 crc kubenswrapper[4790]: I0406 13:34:49.934783 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4158ed-b160-4908-9ee0-a97b76d70717-catalog-content\") pod \"certified-operators-d5qfp\" (UID: \"3f4158ed-b160-4908-9ee0-a97b76d70717\") " pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:34:49 crc kubenswrapper[4790]: I0406 13:34:49.935288 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr7sl\" (UniqueName: \"kubernetes.io/projected/3f4158ed-b160-4908-9ee0-a97b76d70717-kube-api-access-jr7sl\") pod \"certified-operators-d5qfp\" (UID: \"3f4158ed-b160-4908-9ee0-a97b76d70717\") " pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:34:50 crc kubenswrapper[4790]: I0406 13:34:50.036552 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr7sl\" (UniqueName: \"kubernetes.io/projected/3f4158ed-b160-4908-9ee0-a97b76d70717-kube-api-access-jr7sl\") pod \"certified-operators-d5qfp\" (UID: \"3f4158ed-b160-4908-9ee0-a97b76d70717\") " pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:34:50 crc kubenswrapper[4790]: I0406 13:34:50.036666 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4158ed-b160-4908-9ee0-a97b76d70717-utilities\") pod \"certified-operators-d5qfp\" (UID: \"3f4158ed-b160-4908-9ee0-a97b76d70717\") " pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:34:50 crc kubenswrapper[4790]: I0406 13:34:50.036699 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4158ed-b160-4908-9ee0-a97b76d70717-catalog-content\") pod \"certified-operators-d5qfp\" (UID: \"3f4158ed-b160-4908-9ee0-a97b76d70717\") " pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:34:50 crc kubenswrapper[4790]: I0406 13:34:50.037267 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4158ed-b160-4908-9ee0-a97b76d70717-utilities\") pod \"certified-operators-d5qfp\" (UID: \"3f4158ed-b160-4908-9ee0-a97b76d70717\") " pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:34:50 crc kubenswrapper[4790]: I0406 13:34:50.037363 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4158ed-b160-4908-9ee0-a97b76d70717-catalog-content\") pod \"certified-operators-d5qfp\" (UID: \"3f4158ed-b160-4908-9ee0-a97b76d70717\") " pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:34:50 crc kubenswrapper[4790]: I0406 13:34:50.076191 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr7sl\" (UniqueName: \"kubernetes.io/projected/3f4158ed-b160-4908-9ee0-a97b76d70717-kube-api-access-jr7sl\") pod \"certified-operators-d5qfp\" (UID: \"3f4158ed-b160-4908-9ee0-a97b76d70717\") " pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:34:50 crc kubenswrapper[4790]: I0406 13:34:50.103128 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:34:50 crc kubenswrapper[4790]: I0406 13:34:50.637331 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d5qfp"] Apr 06 13:34:51 crc kubenswrapper[4790]: I0406 13:34:51.284464 4790 generic.go:334] "Generic (PLEG): container finished" podID="3f4158ed-b160-4908-9ee0-a97b76d70717" containerID="67295bff27aef3eb382794d59334cf98e5332701bea7d56138597a89a4afeb90" exitCode=0 Apr 06 13:34:51 crc kubenswrapper[4790]: I0406 13:34:51.284572 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5qfp" event={"ID":"3f4158ed-b160-4908-9ee0-a97b76d70717","Type":"ContainerDied","Data":"67295bff27aef3eb382794d59334cf98e5332701bea7d56138597a89a4afeb90"} Apr 06 13:34:51 crc kubenswrapper[4790]: I0406 13:34:51.284891 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5qfp" event={"ID":"3f4158ed-b160-4908-9ee0-a97b76d70717","Type":"ContainerStarted","Data":"2557d3e3cf078c3103aa3c6be370dfae0a80113680feb0c39324ae604b562928"} Apr 06 13:34:53 crc kubenswrapper[4790]: I0406 13:34:53.156808 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-57899578c6-rh848_e2ee43de-a608-4710-a58d-60d49845cb7c/barbican-api/0.log" Apr 06 13:34:53 crc kubenswrapper[4790]: I0406 13:34:53.272178 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-57899578c6-rh848_e2ee43de-a608-4710-a58d-60d49845cb7c/barbican-api-log/0.log" Apr 06 13:34:53 crc kubenswrapper[4790]: I0406 13:34:53.334693 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5qfp" event={"ID":"3f4158ed-b160-4908-9ee0-a97b76d70717","Type":"ContainerStarted","Data":"c17d6e1550ca55321463971727e15aae5e1ef2d50ba442b879d5c4e249ed0f7a"} Apr 06 13:34:53 crc kubenswrapper[4790]: I0406 13:34:53.358323 4790 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cbdd57f54-rd6dk_d431242a-f2f0-4780-85d0-9f2cfc8573ac/barbican-keystone-listener/0.log" Apr 06 13:34:53 crc kubenswrapper[4790]: I0406 13:34:53.458706 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cbdd57f54-rd6dk_d431242a-f2f0-4780-85d0-9f2cfc8573ac/barbican-keystone-listener-log/0.log" Apr 06 13:34:53 crc kubenswrapper[4790]: I0406 13:34:53.575051 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d568b5b57-5x8cp_2f7309c8-fdde-4e0e-9efa-ece286501ec5/barbican-worker/0.log" Apr 06 13:34:53 crc kubenswrapper[4790]: I0406 13:34:53.604837 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d568b5b57-5x8cp_2f7309c8-fdde-4e0e-9efa-ece286501ec5/barbican-worker-log/0.log" Apr 06 13:34:53 crc kubenswrapper[4790]: I0406 13:34:53.931501 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6da6352b-82da-47b7-afe2-44baa8d546d3/ceilometer-central-agent/0.log" Apr 06 13:34:53 crc kubenswrapper[4790]: I0406 13:34:53.998460 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n_34fd07c3-5b1c-440c-a0d0-3d9423f40cc8/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:34:54 crc kubenswrapper[4790]: I0406 13:34:54.056701 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6da6352b-82da-47b7-afe2-44baa8d546d3/ceilometer-notification-agent/0.log" Apr 06 13:34:54 crc kubenswrapper[4790]: I0406 13:34:54.171516 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6da6352b-82da-47b7-afe2-44baa8d546d3/sg-core/0.log" Apr 06 13:34:54 crc kubenswrapper[4790]: I0406 13:34:54.215061 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_6da6352b-82da-47b7-afe2-44baa8d546d3/proxy-httpd/0.log" Apr 06 13:34:54 crc kubenswrapper[4790]: I0406 13:34:54.362111 4790 generic.go:334] "Generic (PLEG): container finished" podID="3f4158ed-b160-4908-9ee0-a97b76d70717" containerID="c17d6e1550ca55321463971727e15aae5e1ef2d50ba442b879d5c4e249ed0f7a" exitCode=0 Apr 06 13:34:54 crc kubenswrapper[4790]: I0406 13:34:54.362147 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5qfp" event={"ID":"3f4158ed-b160-4908-9ee0-a97b76d70717","Type":"ContainerDied","Data":"c17d6e1550ca55321463971727e15aae5e1ef2d50ba442b879d5c4e249ed0f7a"} Apr 06 13:34:54 crc kubenswrapper[4790]: I0406 13:34:54.473110 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6acbb602-fcaa-448f-bc7a-49a2ac2bb979/cinder-api-log/0.log" Apr 06 13:34:54 crc kubenswrapper[4790]: I0406 13:34:54.840554 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_cd529dba-04e1-45bf-9a0a-69fd93502cd9/probe/0.log" Apr 06 13:34:55 crc kubenswrapper[4790]: I0406 13:34:55.117531 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_207e3a4b-b763-47f7-b2f7-b25c8c929af5/cinder-scheduler/0.log" Apr 06 13:34:55 crc kubenswrapper[4790]: I0406 13:34:55.380933 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5qfp" event={"ID":"3f4158ed-b160-4908-9ee0-a97b76d70717","Type":"ContainerStarted","Data":"e694d9a8c9f10affde8fab22e5475ec5eb0355a6b44cb881ab15e5d20973ffd9"} Apr 06 13:34:55 crc kubenswrapper[4790]: I0406 13:34:55.385599 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_207e3a4b-b763-47f7-b2f7-b25c8c929af5/probe/0.log" Apr 06 13:34:55 crc kubenswrapper[4790]: I0406 13:34:55.416943 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-d5qfp" podStartSLOduration=2.976846408 podStartE2EDuration="6.416923917s" podCreationTimestamp="2026-04-06 13:34:49 +0000 UTC" firstStartedPulling="2026-04-06 13:34:51.287058812 +0000 UTC m=+5870.274801718" lastFinishedPulling="2026-04-06 13:34:54.727136371 +0000 UTC m=+5873.714879227" observedRunningTime="2026-04-06 13:34:55.408979164 +0000 UTC m=+5874.396722020" watchObservedRunningTime="2026-04-06 13:34:55.416923917 +0000 UTC m=+5874.404666783" Apr 06 13:34:55 crc kubenswrapper[4790]: I0406 13:34:55.747251 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_cd529dba-04e1-45bf-9a0a-69fd93502cd9/cinder-backup/0.log" Apr 06 13:34:56 crc kubenswrapper[4790]: I0406 13:34:56.076272 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_f77070f6-164c-4bec-aafa-6126ca005702/probe/0.log" Apr 06 13:34:56 crc kubenswrapper[4790]: I0406 13:34:56.140919 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6acbb602-fcaa-448f-bc7a-49a2ac2bb979/cinder-api/0.log" Apr 06 13:34:56 crc kubenswrapper[4790]: I0406 13:34:56.346931 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_da98578a-8aaa-403a-8f8e-4c7115cfa2cb/probe/0.log" Apr 06 13:34:56 crc kubenswrapper[4790]: I0406 13:34:56.387537 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_f77070f6-164c-4bec-aafa-6126ca005702/cinder-volume/0.log" Apr 06 13:34:56 crc kubenswrapper[4790]: I0406 13:34:56.876560 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp_4e61f8b4-0263-4224-a5e0-b34740fbca06/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:34:57 crc kubenswrapper[4790]: I0406 13:34:57.077371 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-85ff4b9c47-92mqr_13316504-6091-438b-8386-eb10fd6c7ce4/init/0.log" Apr 06 13:34:57 crc kubenswrapper[4790]: I0406 13:34:57.082969 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_da98578a-8aaa-403a-8f8e-4c7115cfa2cb/cinder-volume/0.log" Apr 06 13:34:57 crc kubenswrapper[4790]: I0406 13:34:57.092400 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-fglxr_f694367f-e4c0-49b1-99f2-f22624011595/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:34:57 crc kubenswrapper[4790]: I0406 13:34:57.271606 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85ff4b9c47-92mqr_13316504-6091-438b-8386-eb10fd6c7ce4/init/0.log" Apr 06 13:34:57 crc kubenswrapper[4790]: I0406 13:34:57.552537 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt_f73a2e40-f5e3-4e0e-9244-c076b36e911e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:34:57 crc kubenswrapper[4790]: I0406 13:34:57.569729 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85ff4b9c47-92mqr_13316504-6091-438b-8386-eb10fd6c7ce4/dnsmasq-dns/0.log" Apr 06 13:34:57 crc kubenswrapper[4790]: I0406 13:34:57.637274 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e3c46af-c7be-417d-9a92-454f74da7a82/glance-httpd/0.log" Apr 06 13:34:57 crc kubenswrapper[4790]: I0406 13:34:57.758031 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e3c46af-c7be-417d-9a92-454f74da7a82/glance-log/0.log" Apr 06 13:34:57 crc kubenswrapper[4790]: I0406 13:34:57.819673 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_7f86cbe3-945a-4c2a-8986-aa0443e28b95/glance-httpd/0.log" Apr 06 13:34:57 crc kubenswrapper[4790]: I0406 13:34:57.860284 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7f86cbe3-945a-4c2a-8986-aa0443e28b95/glance-log/0.log" Apr 06 13:34:58 crc kubenswrapper[4790]: I0406 13:34:58.109909 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9_eb888063-8b8f-43f5-ba22-d7fc374a1bbf/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:34:58 crc kubenswrapper[4790]: I0406 13:34:58.494710 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29591341-jdpjh_7657bf4f-84d6-4cc0-97da-ac70e2aa07de/keystone-cron/0.log" Apr 06 13:34:58 crc kubenswrapper[4790]: I0406 13:34:58.700123 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_d656612b-aaad-4d40-bc05-3aae06b509f3/kube-state-metrics/0.log" Apr 06 13:34:58 crc kubenswrapper[4790]: I0406 13:34:58.724246 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-fph2d_500a529e-70f2-4749-8364-d6a6230b0030/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:34:58 crc kubenswrapper[4790]: I0406 13:34:58.968246 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-b9b7c8b58-lzkxg_b215a7cd-f428-4fbd-adbc-307b6c905894/keystone-api/0.log" Apr 06 13:34:59 crc kubenswrapper[4790]: I0406 13:34:59.488262 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-bbf455d7c-c2ssf_cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df/neutron-httpd/0.log" Apr 06 13:34:59 crc kubenswrapper[4790]: I0406 13:34:59.662518 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-bbf455d7c-c2ssf_cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df/neutron-api/0.log" Apr 06 13:34:59 crc kubenswrapper[4790]: I0406 13:34:59.887640 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4_9a366552-b5b9-4a9c-92a2-8b63981f5520/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:35:00 crc kubenswrapper[4790]: I0406 13:35:00.103227 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:35:00 crc kubenswrapper[4790]: I0406 13:35:00.104036 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:35:00 crc kubenswrapper[4790]: I0406 13:35:00.146102 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_586bc227-b2c5-4ead-88f4-fe18c5c28d41/setup-container/0.log" Apr 06 13:35:00 crc kubenswrapper[4790]: I0406 13:35:00.180912 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-48qfz_75d5d9f7-8482-4fb7-a536-55656709bec2/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:35:00 crc kubenswrapper[4790]: I0406 13:35:00.848777 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vmt29"] Apr 06 13:35:00 crc kubenswrapper[4790]: I0406 13:35:00.850884 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:00 crc kubenswrapper[4790]: I0406 13:35:00.853605 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_586bc227-b2c5-4ead-88f4-fe18c5c28d41/rabbitmq/0.log" Apr 06 13:35:00 crc kubenswrapper[4790]: I0406 13:35:00.871666 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmt29"] Apr 06 13:35:00 crc kubenswrapper[4790]: I0406 13:35:00.904045 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_586bc227-b2c5-4ead-88f4-fe18c5c28d41/setup-container/0.log" Apr 06 13:35:00 crc kubenswrapper[4790]: I0406 13:35:00.995355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvlzk\" (UniqueName: \"kubernetes.io/projected/3f445fe0-fb00-4a52-83a4-db9c7d660655-kube-api-access-cvlzk\") pod \"redhat-marketplace-vmt29\" (UID: \"3f445fe0-fb00-4a52-83a4-db9c7d660655\") " pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:00 crc kubenswrapper[4790]: I0406 13:35:00.995411 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f445fe0-fb00-4a52-83a4-db9c7d660655-utilities\") pod \"redhat-marketplace-vmt29\" (UID: \"3f445fe0-fb00-4a52-83a4-db9c7d660655\") " pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:00 crc kubenswrapper[4790]: I0406 13:35:00.995488 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f445fe0-fb00-4a52-83a4-db9c7d660655-catalog-content\") pod \"redhat-marketplace-vmt29\" (UID: \"3f445fe0-fb00-4a52-83a4-db9c7d660655\") " pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:01 crc kubenswrapper[4790]: I0406 
13:35:01.097033 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlzk\" (UniqueName: \"kubernetes.io/projected/3f445fe0-fb00-4a52-83a4-db9c7d660655-kube-api-access-cvlzk\") pod \"redhat-marketplace-vmt29\" (UID: \"3f445fe0-fb00-4a52-83a4-db9c7d660655\") " pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:01 crc kubenswrapper[4790]: I0406 13:35:01.097080 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f445fe0-fb00-4a52-83a4-db9c7d660655-utilities\") pod \"redhat-marketplace-vmt29\" (UID: \"3f445fe0-fb00-4a52-83a4-db9c7d660655\") " pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:01 crc kubenswrapper[4790]: I0406 13:35:01.097146 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f445fe0-fb00-4a52-83a4-db9c7d660655-catalog-content\") pod \"redhat-marketplace-vmt29\" (UID: \"3f445fe0-fb00-4a52-83a4-db9c7d660655\") " pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:01 crc kubenswrapper[4790]: I0406 13:35:01.097685 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f445fe0-fb00-4a52-83a4-db9c7d660655-catalog-content\") pod \"redhat-marketplace-vmt29\" (UID: \"3f445fe0-fb00-4a52-83a4-db9c7d660655\") " pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:01 crc kubenswrapper[4790]: I0406 13:35:01.098212 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f445fe0-fb00-4a52-83a4-db9c7d660655-utilities\") pod \"redhat-marketplace-vmt29\" (UID: \"3f445fe0-fb00-4a52-83a4-db9c7d660655\") " pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:01 crc kubenswrapper[4790]: I0406 13:35:01.135961 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cvlzk\" (UniqueName: \"kubernetes.io/projected/3f445fe0-fb00-4a52-83a4-db9c7d660655-kube-api-access-cvlzk\") pod \"redhat-marketplace-vmt29\" (UID: \"3f445fe0-fb00-4a52-83a4-db9c7d660655\") " pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:01 crc kubenswrapper[4790]: I0406 13:35:01.160789 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-d5qfp" podUID="3f4158ed-b160-4908-9ee0-a97b76d70717" containerName="registry-server" probeResult="failure" output=< Apr 06 13:35:01 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 13:35:01 crc kubenswrapper[4790]: > Apr 06 13:35:01 crc kubenswrapper[4790]: I0406 13:35:01.192820 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:01 crc kubenswrapper[4790]: I0406 13:35:01.765252 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmt29"] Apr 06 13:35:01 crc kubenswrapper[4790]: I0406 13:35:01.836254 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_001ce2db-6829-4dcc-bf3a-b19134cd3484/nova-cell0-conductor-conductor/0.log" Apr 06 13:35:02 crc kubenswrapper[4790]: I0406 13:35:02.199874 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_96ad1416-d1af-42b2-8fae-68574044a5e6/nova-cell1-conductor-conductor/0.log" Apr 06 13:35:02 crc kubenswrapper[4790]: I0406 13:35:02.479248 4790 generic.go:334] "Generic (PLEG): container finished" podID="3f445fe0-fb00-4a52-83a4-db9c7d660655" containerID="14207648e22b733bc5717673ac833d2d324153eb422af7495e2ea4e83e6f2a14" exitCode=0 Apr 06 13:35:02 crc kubenswrapper[4790]: I0406 13:35:02.479301 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmt29" 
event={"ID":"3f445fe0-fb00-4a52-83a4-db9c7d660655","Type":"ContainerDied","Data":"14207648e22b733bc5717673ac833d2d324153eb422af7495e2ea4e83e6f2a14"} Apr 06 13:35:02 crc kubenswrapper[4790]: I0406 13:35:02.479331 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmt29" event={"ID":"3f445fe0-fb00-4a52-83a4-db9c7d660655","Type":"ContainerStarted","Data":"21b31abded3fa37a5a1a9df2fbe8c140cf727f28f488bfb8ef524427d827503a"} Apr 06 13:35:02 crc kubenswrapper[4790]: I0406 13:35:02.525835 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8c07d38c-a3ad-48d8-948c-7659351eade5/nova-cell1-novncproxy-novncproxy/0.log" Apr 06 13:35:02 crc kubenswrapper[4790]: I0406 13:35:02.888863 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2cd7a2b3-4c64-4d09-9865-cd55277fd369/nova-api-log/0.log" Apr 06 13:35:03 crc kubenswrapper[4790]: I0406 13:35:03.047165 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8fa33d66-ad99-4650-bc60-a97e16cbd064/nova-metadata-log/0.log" Apr 06 13:35:03 crc kubenswrapper[4790]: I0406 13:35:03.469189 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2cd7a2b3-4c64-4d09-9865-cd55277fd369/nova-api-api/0.log" Apr 06 13:35:03 crc kubenswrapper[4790]: I0406 13:35:03.942647 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a73b707d-e57e-4f4c-a253-38e55128a1b2/nova-scheduler-scheduler/0.log" Apr 06 13:35:03 crc kubenswrapper[4790]: I0406 13:35:03.995548 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_69e97903-5aa8-4523-ae3c-3f10b031ad20/mysql-bootstrap/0.log" Apr 06 13:35:04 crc kubenswrapper[4790]: I0406 13:35:04.062035 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8fa33d66-ad99-4650-bc60-a97e16cbd064/nova-metadata-metadata/0.log" 
Apr 06 13:35:04 crc kubenswrapper[4790]: I0406 13:35:04.066841 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-w9z4v_9be120eb-568b-4ab4-af61-b92818e7e6ad/nova-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:35:04 crc kubenswrapper[4790]: I0406 13:35:04.141352 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_69e97903-5aa8-4523-ae3c-3f10b031ad20/mysql-bootstrap/0.log" Apr 06 13:35:04 crc kubenswrapper[4790]: I0406 13:35:04.229217 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_69e97903-5aa8-4523-ae3c-3f10b031ad20/galera/0.log" Apr 06 13:35:04 crc kubenswrapper[4790]: I0406 13:35:04.379288 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6d6dc6ce-5627-454a-af1c-7a20bed8bfc4/mysql-bootstrap/0.log" Apr 06 13:35:04 crc kubenswrapper[4790]: I0406 13:35:04.506974 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmt29" event={"ID":"3f445fe0-fb00-4a52-83a4-db9c7d660655","Type":"ContainerStarted","Data":"9f112d2b975729b726725728114de3cd53b16c8cb096e26c43eb6d1aa69f6b49"} Apr 06 13:35:04 crc kubenswrapper[4790]: I0406 13:35:04.565238 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6d6dc6ce-5627-454a-af1c-7a20bed8bfc4/galera/0.log" Apr 06 13:35:04 crc kubenswrapper[4790]: I0406 13:35:04.589982 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6d6dc6ce-5627-454a-af1c-7a20bed8bfc4/mysql-bootstrap/0.log" Apr 06 13:35:04 crc kubenswrapper[4790]: I0406 13:35:04.595616 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_1eea2e2a-d9c2-46e2-96a9-827fcf5a075f/openstackclient/0.log" Apr 06 13:35:04 crc kubenswrapper[4790]: I0406 13:35:04.829784 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-lw6ch_77333973-0908-43d8-8105-0c3b3e5cdecb/openstack-network-exporter/0.log" Apr 06 13:35:04 crc kubenswrapper[4790]: I0406 13:35:04.905708 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4xbzr_3c164024-c78c-461b-90fc-afbac0b3a682/ovsdb-server-init/0.log" Apr 06 13:35:05 crc kubenswrapper[4790]: I0406 13:35:05.176012 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4xbzr_3c164024-c78c-461b-90fc-afbac0b3a682/ovsdb-server/0.log" Apr 06 13:35:05 crc kubenswrapper[4790]: I0406 13:35:05.190376 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4xbzr_3c164024-c78c-461b-90fc-afbac0b3a682/ovsdb-server-init/0.log" Apr 06 13:35:05 crc kubenswrapper[4790]: I0406 13:35:05.479787 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-pr4b9_51949d72-301c-4426-8397-273f6b2ecabd/ovn-controller/0.log" Apr 06 13:35:05 crc kubenswrapper[4790]: I0406 13:35:05.520978 4790 generic.go:334] "Generic (PLEG): container finished" podID="3f445fe0-fb00-4a52-83a4-db9c7d660655" containerID="9f112d2b975729b726725728114de3cd53b16c8cb096e26c43eb6d1aa69f6b49" exitCode=0 Apr 06 13:35:05 crc kubenswrapper[4790]: I0406 13:35:05.521032 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmt29" event={"ID":"3f445fe0-fb00-4a52-83a4-db9c7d660655","Type":"ContainerDied","Data":"9f112d2b975729b726725728114de3cd53b16c8cb096e26c43eb6d1aa69f6b49"} Apr 06 13:35:05 crc kubenswrapper[4790]: I0406 13:35:05.690332 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4xbzr_3c164024-c78c-461b-90fc-afbac0b3a682/ovs-vswitchd/0.log" Apr 06 13:35:05 crc kubenswrapper[4790]: I0406 13:35:05.787965 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-nbzm5_7809a0fa-81df-4e08-8a8f-84e070582795/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:35:05 crc kubenswrapper[4790]: I0406 13:35:05.869806 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c66b653e-0e7d-44cb-82e1-1e2ee6a04b15/openstack-network-exporter/0.log" Apr 06 13:35:05 crc kubenswrapper[4790]: I0406 13:35:05.896205 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c66b653e-0e7d-44cb-82e1-1e2ee6a04b15/ovn-northd/0.log" Apr 06 13:35:06 crc kubenswrapper[4790]: I0406 13:35:06.053404 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6192fb44-8c5c-4bee-a190-cb14bce3fa94/openstack-network-exporter/0.log" Apr 06 13:35:06 crc kubenswrapper[4790]: I0406 13:35:06.245202 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6192fb44-8c5c-4bee-a190-cb14bce3fa94/ovsdbserver-nb/0.log" Apr 06 13:35:06 crc kubenswrapper[4790]: I0406 13:35:06.270058 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b7fb6737-ce1d-42b7-96e4-f1ea27883d05/openstack-network-exporter/0.log" Apr 06 13:35:06 crc kubenswrapper[4790]: I0406 13:35:06.348386 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b7fb6737-ce1d-42b7-96e4-f1ea27883d05/ovsdbserver-sb/0.log" Apr 06 13:35:06 crc kubenswrapper[4790]: I0406 13:35:06.538017 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmt29" event={"ID":"3f445fe0-fb00-4a52-83a4-db9c7d660655","Type":"ContainerStarted","Data":"fc9feea275de1505bababb0c2413b57b27b159bd7c6bb17a09ebb8cc846052ba"} Apr 06 13:35:06 crc kubenswrapper[4790]: I0406 13:35:06.566122 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vmt29" 
podStartSLOduration=2.894432648 podStartE2EDuration="6.566098246s" podCreationTimestamp="2026-04-06 13:35:00 +0000 UTC" firstStartedPulling="2026-04-06 13:35:02.48178938 +0000 UTC m=+5881.469532246" lastFinishedPulling="2026-04-06 13:35:06.153454978 +0000 UTC m=+5885.141197844" observedRunningTime="2026-04-06 13:35:06.559405407 +0000 UTC m=+5885.547148293" watchObservedRunningTime="2026-04-06 13:35:06.566098246 +0000 UTC m=+5885.553841112" Apr 06 13:35:06 crc kubenswrapper[4790]: I0406 13:35:06.859584 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7bffc46f4d-tqbdl_54d90c86-6e3b-49d3-a50f-eefe94ef8d6d/placement-api/0.log" Apr 06 13:35:06 crc kubenswrapper[4790]: I0406 13:35:06.887143 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7bffc46f4d-tqbdl_54d90c86-6e3b-49d3-a50f-eefe94ef8d6d/placement-log/0.log" Apr 06 13:35:06 crc kubenswrapper[4790]: I0406 13:35:06.892012 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fc194d2b-4a4a-4745-8225-7d44efe056ef/init-config-reloader/0.log" Apr 06 13:35:07 crc kubenswrapper[4790]: I0406 13:35:07.135642 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fc194d2b-4a4a-4745-8225-7d44efe056ef/init-config-reloader/0.log" Apr 06 13:35:07 crc kubenswrapper[4790]: I0406 13:35:07.153884 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fc194d2b-4a4a-4745-8225-7d44efe056ef/config-reloader/0.log" Apr 06 13:35:07 crc kubenswrapper[4790]: I0406 13:35:07.179232 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fc194d2b-4a4a-4745-8225-7d44efe056ef/thanos-sidecar/0.log" Apr 06 13:35:07 crc kubenswrapper[4790]: I0406 13:35:07.304211 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_fc194d2b-4a4a-4745-8225-7d44efe056ef/prometheus/0.log" Apr 06 13:35:07 crc kubenswrapper[4790]: I0406 13:35:07.448558 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ecd623d8-83f3-46b0-b566-b9801c44dfc8/setup-container/0.log" Apr 06 13:35:07 crc kubenswrapper[4790]: I0406 13:35:07.691183 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ecd623d8-83f3-46b0-b566-b9801c44dfc8/setup-container/0.log" Apr 06 13:35:07 crc kubenswrapper[4790]: I0406 13:35:07.771900 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d5c35395-30bf-42d7-89e4-d306b4e4cc37/setup-container/0.log" Apr 06 13:35:07 crc kubenswrapper[4790]: I0406 13:35:07.786633 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ecd623d8-83f3-46b0-b566-b9801c44dfc8/rabbitmq/0.log" Apr 06 13:35:08 crc kubenswrapper[4790]: I0406 13:35:08.006995 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d5c35395-30bf-42d7-89e4-d306b4e4cc37/setup-container/0.log" Apr 06 13:35:08 crc kubenswrapper[4790]: I0406 13:35:08.030815 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s_822b5a1d-5bf4-4e66-87fa-20a47f8cd280/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:35:08 crc kubenswrapper[4790]: I0406 13:35:08.139562 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d5c35395-30bf-42d7-89e4-d306b4e4cc37/rabbitmq/0.log" Apr 06 13:35:08 crc kubenswrapper[4790]: I0406 13:35:08.290060 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-kwcth_c0c26165-8e10-4607-9cce-f36ec74bdc85/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:35:08 crc kubenswrapper[4790]: 
I0406 13:35:08.528810 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9_e671a5e8-99cd-4a96-a26f-93ff0eb8980c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:35:08 crc kubenswrapper[4790]: I0406 13:35:08.717516 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-pmnvp_2cea3a85-493c-4732-9954-6a690708c4d1/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:35:08 crc kubenswrapper[4790]: I0406 13:35:08.795766 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-2w9cm_28b0126b-8513-425d-8079-b68b9cb73bdc/ssh-known-hosts-edpm-deployment/0.log" Apr 06 13:35:09 crc kubenswrapper[4790]: I0406 13:35:09.008117 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7695db8cdc-vs5bx_7922b939-a1e8-4c85-8eb0-fe3529f6469c/proxy-server/0.log" Apr 06 13:35:09 crc kubenswrapper[4790]: I0406 13:35:09.169325 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-x6pwr_8288b902-f791-4dce-b1c0-2afa8796712b/swift-ring-rebalance/0.log" Apr 06 13:35:09 crc kubenswrapper[4790]: I0406 13:35:09.204344 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7695db8cdc-vs5bx_7922b939-a1e8-4c85-8eb0-fe3529f6469c/proxy-httpd/0.log" Apr 06 13:35:09 crc kubenswrapper[4790]: I0406 13:35:09.286260 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/account-auditor/0.log" Apr 06 13:35:09 crc kubenswrapper[4790]: I0406 13:35:09.456609 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/account-reaper/0.log" Apr 06 13:35:09 crc kubenswrapper[4790]: I0406 13:35:09.500487 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/account-replicator/0.log" Apr 06 13:35:09 crc kubenswrapper[4790]: I0406 13:35:09.594355 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/container-auditor/0.log" Apr 06 13:35:09 crc kubenswrapper[4790]: I0406 13:35:09.604117 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/account-server/0.log" Apr 06 13:35:09 crc kubenswrapper[4790]: I0406 13:35:09.795746 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/container-replicator/0.log" Apr 06 13:35:09 crc kubenswrapper[4790]: I0406 13:35:09.871751 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/container-updater/0.log" Apr 06 13:35:10 crc kubenswrapper[4790]: I0406 13:35:10.059717 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/object-replicator/0.log" Apr 06 13:35:10 crc kubenswrapper[4790]: I0406 13:35:10.131811 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/container-server/0.log" Apr 06 13:35:10 crc kubenswrapper[4790]: I0406 13:35:10.131925 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/object-expirer/0.log" Apr 06 13:35:10 crc kubenswrapper[4790]: I0406 13:35:10.162250 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:35:10 crc kubenswrapper[4790]: I0406 13:35:10.162581 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/object-auditor/0.log" Apr 06 13:35:10 crc kubenswrapper[4790]: I0406 13:35:10.261040 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:35:10 crc kubenswrapper[4790]: I0406 13:35:10.424451 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d5qfp"] Apr 06 13:35:10 crc kubenswrapper[4790]: I0406 13:35:10.528216 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/object-updater/0.log" Apr 06 13:35:10 crc kubenswrapper[4790]: I0406 13:35:10.565471 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/object-server/0.log" Apr 06 13:35:10 crc kubenswrapper[4790]: I0406 13:35:10.646080 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/rsync/0.log" Apr 06 13:35:10 crc kubenswrapper[4790]: I0406 13:35:10.691174 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/swift-recon-cron/0.log" Apr 06 13:35:10 crc kubenswrapper[4790]: I0406 13:35:10.950135 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_c58fe7b4-f5be-433f-8390-67dd8a62e81b/tempest-tests-tempest-tests-runner/0.log" Apr 06 13:35:11 crc kubenswrapper[4790]: I0406 13:35:11.155674 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_fe61003b-b427-4b3b-8af3-a4f9e0cf8605/test-operator-logs-container/0.log" Apr 06 13:35:11 crc kubenswrapper[4790]: I0406 13:35:11.193180 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:11 crc kubenswrapper[4790]: I0406 13:35:11.193767 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:11 crc kubenswrapper[4790]: I0406 13:35:11.243875 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:11 crc kubenswrapper[4790]: I0406 13:35:11.315659 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-pxptm_8e75f387-926a-41f4-8367-8c68d2637c04/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:35:11 crc kubenswrapper[4790]: I0406 13:35:11.410874 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-clrlg_b1da0dd0-f14d-4b72-8308-a256f237732f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:35:11 crc kubenswrapper[4790]: I0406 13:35:11.583091 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d5qfp" podUID="3f4158ed-b160-4908-9ee0-a97b76d70717" containerName="registry-server" containerID="cri-o://e694d9a8c9f10affde8fab22e5475ec5eb0355a6b44cb881ab15e5d20973ffd9" gracePeriod=2 Apr 06 13:35:11 crc kubenswrapper[4790]: I0406 13:35:11.637963 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.096681 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.253033 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr7sl\" (UniqueName: \"kubernetes.io/projected/3f4158ed-b160-4908-9ee0-a97b76d70717-kube-api-access-jr7sl\") pod \"3f4158ed-b160-4908-9ee0-a97b76d70717\" (UID: \"3f4158ed-b160-4908-9ee0-a97b76d70717\") " Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.253115 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4158ed-b160-4908-9ee0-a97b76d70717-utilities\") pod \"3f4158ed-b160-4908-9ee0-a97b76d70717\" (UID: \"3f4158ed-b160-4908-9ee0-a97b76d70717\") " Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.253248 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4158ed-b160-4908-9ee0-a97b76d70717-catalog-content\") pod \"3f4158ed-b160-4908-9ee0-a97b76d70717\" (UID: \"3f4158ed-b160-4908-9ee0-a97b76d70717\") " Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.257582 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f4158ed-b160-4908-9ee0-a97b76d70717-utilities" (OuterVolumeSpecName: "utilities") pod "3f4158ed-b160-4908-9ee0-a97b76d70717" (UID: "3f4158ed-b160-4908-9ee0-a97b76d70717"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.292992 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4158ed-b160-4908-9ee0-a97b76d70717-kube-api-access-jr7sl" (OuterVolumeSpecName: "kube-api-access-jr7sl") pod "3f4158ed-b160-4908-9ee0-a97b76d70717" (UID: "3f4158ed-b160-4908-9ee0-a97b76d70717"). InnerVolumeSpecName "kube-api-access-jr7sl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.319883 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f4158ed-b160-4908-9ee0-a97b76d70717-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f4158ed-b160-4908-9ee0-a97b76d70717" (UID: "3f4158ed-b160-4908-9ee0-a97b76d70717"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.355357 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4158ed-b160-4908-9ee0-a97b76d70717-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.355386 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr7sl\" (UniqueName: \"kubernetes.io/projected/3f4158ed-b160-4908-9ee0-a97b76d70717-kube-api-access-jr7sl\") on node \"crc\" DevicePath \"\"" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.355397 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4158ed-b160-4908-9ee0-a97b76d70717-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.426523 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_bed4cbed-be09-43cd-938a-e4a1fe5fe399/watcher-applier/0.log" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.595935 4790 generic.go:334] "Generic (PLEG): container finished" podID="3f4158ed-b160-4908-9ee0-a97b76d70717" containerID="e694d9a8c9f10affde8fab22e5475ec5eb0355a6b44cb881ab15e5d20973ffd9" exitCode=0 Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.596121 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5qfp" 
event={"ID":"3f4158ed-b160-4908-9ee0-a97b76d70717","Type":"ContainerDied","Data":"e694d9a8c9f10affde8fab22e5475ec5eb0355a6b44cb881ab15e5d20973ffd9"} Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.596157 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5qfp" event={"ID":"3f4158ed-b160-4908-9ee0-a97b76d70717","Type":"ContainerDied","Data":"2557d3e3cf078c3103aa3c6be370dfae0a80113680feb0c39324ae604b562928"} Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.596182 4790 scope.go:117] "RemoveContainer" containerID="e694d9a8c9f10affde8fab22e5475ec5eb0355a6b44cb881ab15e5d20973ffd9" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.596201 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d5qfp" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.620820 4790 scope.go:117] "RemoveContainer" containerID="c17d6e1550ca55321463971727e15aae5e1ef2d50ba442b879d5c4e249ed0f7a" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.655420 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d5qfp"] Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.657853 4790 scope.go:117] "RemoveContainer" containerID="67295bff27aef3eb382794d59334cf98e5332701bea7d56138597a89a4afeb90" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.669301 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d5qfp"] Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.713072 4790 scope.go:117] "RemoveContainer" containerID="e694d9a8c9f10affde8fab22e5475ec5eb0355a6b44cb881ab15e5d20973ffd9" Apr 06 13:35:12 crc kubenswrapper[4790]: E0406 13:35:12.716306 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e694d9a8c9f10affde8fab22e5475ec5eb0355a6b44cb881ab15e5d20973ffd9\": container 
with ID starting with e694d9a8c9f10affde8fab22e5475ec5eb0355a6b44cb881ab15e5d20973ffd9 not found: ID does not exist" containerID="e694d9a8c9f10affde8fab22e5475ec5eb0355a6b44cb881ab15e5d20973ffd9" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.716364 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e694d9a8c9f10affde8fab22e5475ec5eb0355a6b44cb881ab15e5d20973ffd9"} err="failed to get container status \"e694d9a8c9f10affde8fab22e5475ec5eb0355a6b44cb881ab15e5d20973ffd9\": rpc error: code = NotFound desc = could not find container \"e694d9a8c9f10affde8fab22e5475ec5eb0355a6b44cb881ab15e5d20973ffd9\": container with ID starting with e694d9a8c9f10affde8fab22e5475ec5eb0355a6b44cb881ab15e5d20973ffd9 not found: ID does not exist" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.716407 4790 scope.go:117] "RemoveContainer" containerID="c17d6e1550ca55321463971727e15aae5e1ef2d50ba442b879d5c4e249ed0f7a" Apr 06 13:35:12 crc kubenswrapper[4790]: E0406 13:35:12.716876 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c17d6e1550ca55321463971727e15aae5e1ef2d50ba442b879d5c4e249ed0f7a\": container with ID starting with c17d6e1550ca55321463971727e15aae5e1ef2d50ba442b879d5c4e249ed0f7a not found: ID does not exist" containerID="c17d6e1550ca55321463971727e15aae5e1ef2d50ba442b879d5c4e249ed0f7a" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.716908 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17d6e1550ca55321463971727e15aae5e1ef2d50ba442b879d5c4e249ed0f7a"} err="failed to get container status \"c17d6e1550ca55321463971727e15aae5e1ef2d50ba442b879d5c4e249ed0f7a\": rpc error: code = NotFound desc = could not find container \"c17d6e1550ca55321463971727e15aae5e1ef2d50ba442b879d5c4e249ed0f7a\": container with ID starting with c17d6e1550ca55321463971727e15aae5e1ef2d50ba442b879d5c4e249ed0f7a not 
found: ID does not exist" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.716928 4790 scope.go:117] "RemoveContainer" containerID="67295bff27aef3eb382794d59334cf98e5332701bea7d56138597a89a4afeb90" Apr 06 13:35:12 crc kubenswrapper[4790]: E0406 13:35:12.717402 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67295bff27aef3eb382794d59334cf98e5332701bea7d56138597a89a4afeb90\": container with ID starting with 67295bff27aef3eb382794d59334cf98e5332701bea7d56138597a89a4afeb90 not found: ID does not exist" containerID="67295bff27aef3eb382794d59334cf98e5332701bea7d56138597a89a4afeb90" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.717433 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67295bff27aef3eb382794d59334cf98e5332701bea7d56138597a89a4afeb90"} err="failed to get container status \"67295bff27aef3eb382794d59334cf98e5332701bea7d56138597a89a4afeb90\": rpc error: code = NotFound desc = could not find container \"67295bff27aef3eb382794d59334cf98e5332701bea7d56138597a89a4afeb90\": container with ID starting with 67295bff27aef3eb382794d59334cf98e5332701bea7d56138597a89a4afeb90 not found: ID does not exist" Apr 06 13:35:12 crc kubenswrapper[4790]: I0406 13:35:12.808181 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmt29"] Apr 06 13:35:13 crc kubenswrapper[4790]: I0406 13:35:13.308636 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_b614b0dd-0285-4907-8e74-051e3ef0b3a1/watcher-api-log/0.log" Apr 06 13:35:13 crc kubenswrapper[4790]: I0406 13:35:13.688818 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f4158ed-b160-4908-9ee0-a97b76d70717" path="/var/lib/kubelet/pods/3f4158ed-b160-4908-9ee0-a97b76d70717/volumes" Apr 06 13:35:14 crc kubenswrapper[4790]: I0406 13:35:14.616527 4790 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vmt29" podUID="3f445fe0-fb00-4a52-83a4-db9c7d660655" containerName="registry-server" containerID="cri-o://fc9feea275de1505bababb0c2413b57b27b159bd7c6bb17a09ebb8cc846052ba" gracePeriod=2 Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.201314 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.304054 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvlzk\" (UniqueName: \"kubernetes.io/projected/3f445fe0-fb00-4a52-83a4-db9c7d660655-kube-api-access-cvlzk\") pod \"3f445fe0-fb00-4a52-83a4-db9c7d660655\" (UID: \"3f445fe0-fb00-4a52-83a4-db9c7d660655\") " Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.304247 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f445fe0-fb00-4a52-83a4-db9c7d660655-utilities\") pod \"3f445fe0-fb00-4a52-83a4-db9c7d660655\" (UID: \"3f445fe0-fb00-4a52-83a4-db9c7d660655\") " Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.304275 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f445fe0-fb00-4a52-83a4-db9c7d660655-catalog-content\") pod \"3f445fe0-fb00-4a52-83a4-db9c7d660655\" (UID: \"3f445fe0-fb00-4a52-83a4-db9c7d660655\") " Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.308215 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f445fe0-fb00-4a52-83a4-db9c7d660655-utilities" (OuterVolumeSpecName: "utilities") pod "3f445fe0-fb00-4a52-83a4-db9c7d660655" (UID: "3f445fe0-fb00-4a52-83a4-db9c7d660655"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.333009 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f445fe0-fb00-4a52-83a4-db9c7d660655-kube-api-access-cvlzk" (OuterVolumeSpecName: "kube-api-access-cvlzk") pod "3f445fe0-fb00-4a52-83a4-db9c7d660655" (UID: "3f445fe0-fb00-4a52-83a4-db9c7d660655"). InnerVolumeSpecName "kube-api-access-cvlzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.354878 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f445fe0-fb00-4a52-83a4-db9c7d660655-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f445fe0-fb00-4a52-83a4-db9c7d660655" (UID: "3f445fe0-fb00-4a52-83a4-db9c7d660655"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.406359 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvlzk\" (UniqueName: \"kubernetes.io/projected/3f445fe0-fb00-4a52-83a4-db9c7d660655-kube-api-access-cvlzk\") on node \"crc\" DevicePath \"\"" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.406400 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f445fe0-fb00-4a52-83a4-db9c7d660655-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.406413 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f445fe0-fb00-4a52-83a4-db9c7d660655-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.627693 4790 generic.go:334] "Generic (PLEG): container finished" podID="3f445fe0-fb00-4a52-83a4-db9c7d660655" 
containerID="fc9feea275de1505bababb0c2413b57b27b159bd7c6bb17a09ebb8cc846052ba" exitCode=0 Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.627765 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmt29" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.627805 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmt29" event={"ID":"3f445fe0-fb00-4a52-83a4-db9c7d660655","Type":"ContainerDied","Data":"fc9feea275de1505bababb0c2413b57b27b159bd7c6bb17a09ebb8cc846052ba"} Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.628151 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmt29" event={"ID":"3f445fe0-fb00-4a52-83a4-db9c7d660655","Type":"ContainerDied","Data":"21b31abded3fa37a5a1a9df2fbe8c140cf727f28f488bfb8ef524427d827503a"} Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.628172 4790 scope.go:117] "RemoveContainer" containerID="fc9feea275de1505bababb0c2413b57b27b159bd7c6bb17a09ebb8cc846052ba" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.665970 4790 scope.go:117] "RemoveContainer" containerID="9f112d2b975729b726725728114de3cd53b16c8cb096e26c43eb6d1aa69f6b49" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.683966 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmt29"] Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.708582 4790 scope.go:117] "RemoveContainer" containerID="14207648e22b733bc5717673ac833d2d324153eb422af7495e2ea4e83e6f2a14" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.739088 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmt29"] Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.752610 4790 scope.go:117] "RemoveContainer" containerID="fc9feea275de1505bababb0c2413b57b27b159bd7c6bb17a09ebb8cc846052ba" Apr 06 
13:35:15 crc kubenswrapper[4790]: E0406 13:35:15.753086 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc9feea275de1505bababb0c2413b57b27b159bd7c6bb17a09ebb8cc846052ba\": container with ID starting with fc9feea275de1505bababb0c2413b57b27b159bd7c6bb17a09ebb8cc846052ba not found: ID does not exist" containerID="fc9feea275de1505bababb0c2413b57b27b159bd7c6bb17a09ebb8cc846052ba" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.753127 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9feea275de1505bababb0c2413b57b27b159bd7c6bb17a09ebb8cc846052ba"} err="failed to get container status \"fc9feea275de1505bababb0c2413b57b27b159bd7c6bb17a09ebb8cc846052ba\": rpc error: code = NotFound desc = could not find container \"fc9feea275de1505bababb0c2413b57b27b159bd7c6bb17a09ebb8cc846052ba\": container with ID starting with fc9feea275de1505bababb0c2413b57b27b159bd7c6bb17a09ebb8cc846052ba not found: ID does not exist" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.753152 4790 scope.go:117] "RemoveContainer" containerID="9f112d2b975729b726725728114de3cd53b16c8cb096e26c43eb6d1aa69f6b49" Apr 06 13:35:15 crc kubenswrapper[4790]: E0406 13:35:15.755099 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f112d2b975729b726725728114de3cd53b16c8cb096e26c43eb6d1aa69f6b49\": container with ID starting with 9f112d2b975729b726725728114de3cd53b16c8cb096e26c43eb6d1aa69f6b49 not found: ID does not exist" containerID="9f112d2b975729b726725728114de3cd53b16c8cb096e26c43eb6d1aa69f6b49" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.755140 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f112d2b975729b726725728114de3cd53b16c8cb096e26c43eb6d1aa69f6b49"} err="failed to get container status 
\"9f112d2b975729b726725728114de3cd53b16c8cb096e26c43eb6d1aa69f6b49\": rpc error: code = NotFound desc = could not find container \"9f112d2b975729b726725728114de3cd53b16c8cb096e26c43eb6d1aa69f6b49\": container with ID starting with 9f112d2b975729b726725728114de3cd53b16c8cb096e26c43eb6d1aa69f6b49 not found: ID does not exist" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.755165 4790 scope.go:117] "RemoveContainer" containerID="14207648e22b733bc5717673ac833d2d324153eb422af7495e2ea4e83e6f2a14" Apr 06 13:35:15 crc kubenswrapper[4790]: E0406 13:35:15.755996 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14207648e22b733bc5717673ac833d2d324153eb422af7495e2ea4e83e6f2a14\": container with ID starting with 14207648e22b733bc5717673ac833d2d324153eb422af7495e2ea4e83e6f2a14 not found: ID does not exist" containerID="14207648e22b733bc5717673ac833d2d324153eb422af7495e2ea4e83e6f2a14" Apr 06 13:35:15 crc kubenswrapper[4790]: I0406 13:35:15.756040 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14207648e22b733bc5717673ac833d2d324153eb422af7495e2ea4e83e6f2a14"} err="failed to get container status \"14207648e22b733bc5717673ac833d2d324153eb422af7495e2ea4e83e6f2a14\": rpc error: code = NotFound desc = could not find container \"14207648e22b733bc5717673ac833d2d324153eb422af7495e2ea4e83e6f2a14\": container with ID starting with 14207648e22b733bc5717673ac833d2d324153eb422af7495e2ea4e83e6f2a14 not found: ID does not exist" Apr 06 13:35:16 crc kubenswrapper[4790]: I0406 13:35:16.504130 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4/watcher-decision-engine/0.log" Apr 06 13:35:17 crc kubenswrapper[4790]: I0406 13:35:17.559662 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-api-0_b614b0dd-0285-4907-8e74-051e3ef0b3a1/watcher-api/0.log" Apr 06 13:35:17 crc kubenswrapper[4790]: I0406 13:35:17.688182 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f445fe0-fb00-4a52-83a4-db9c7d660655" path="/var/lib/kubelet/pods/3f445fe0-fb00-4a52-83a4-db9c7d660655/volumes" Apr 06 13:35:25 crc kubenswrapper[4790]: I0406 13:35:25.552991 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ace98862-e7bc-4eb8-93ae-b38dcbd98a55/memcached/0.log" Apr 06 13:35:42 crc kubenswrapper[4790]: I0406 13:35:42.600100 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5bcc684c66-wv5tw_95b18d6e-ec5a-45e7-89c0-0f4618e4eb97/manager/0.log" Apr 06 13:35:42 crc kubenswrapper[4790]: I0406 13:35:42.714976 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm_7a436f45-6a0e-4102-8377-fa7299c0b3e8/util/0.log" Apr 06 13:35:42 crc kubenswrapper[4790]: I0406 13:35:42.922879 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm_7a436f45-6a0e-4102-8377-fa7299c0b3e8/util/0.log" Apr 06 13:35:42 crc kubenswrapper[4790]: I0406 13:35:42.950957 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm_7a436f45-6a0e-4102-8377-fa7299c0b3e8/pull/0.log" Apr 06 13:35:42 crc kubenswrapper[4790]: I0406 13:35:42.951303 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm_7a436f45-6a0e-4102-8377-fa7299c0b3e8/pull/0.log" Apr 06 13:35:43 crc kubenswrapper[4790]: I0406 13:35:43.305363 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm_7a436f45-6a0e-4102-8377-fa7299c0b3e8/extract/0.log" Apr 06 13:35:43 crc kubenswrapper[4790]: I0406 13:35:43.338615 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm_7a436f45-6a0e-4102-8377-fa7299c0b3e8/util/0.log" Apr 06 13:35:43 crc kubenswrapper[4790]: I0406 13:35:43.363413 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm_7a436f45-6a0e-4102-8377-fa7299c0b3e8/pull/0.log" Apr 06 13:35:43 crc kubenswrapper[4790]: I0406 13:35:43.556032 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-78674bbc6b-48jqq_5878c1d4-78cf-447f-b442-f7a9aa1aee99/manager/0.log" Apr 06 13:35:43 crc kubenswrapper[4790]: I0406 13:35:43.572739 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58689c6fff-tm8b2_fb176575-b24c-4da4-a0f7-c5117d2c2ed7/manager/0.log" Apr 06 13:35:43 crc kubenswrapper[4790]: I0406 13:35:43.842282 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b5d8f8697-hwvv8_a8b6d51d-4671-471f-94c6-b0a4b2c4a27d/manager/0.log" Apr 06 13:35:43 crc kubenswrapper[4790]: I0406 13:35:43.864247 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8566787df9-l8dhs_16027ea9-802c-43ef-80ac-e2f66a2cc36b/manager/0.log" Apr 06 13:35:44 crc kubenswrapper[4790]: I0406 13:35:44.021023 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6c5d8948dc-288vm_c2958357-3518-4e74-8326-cfe8cf23334f/manager/0.log" Apr 06 13:35:44 crc kubenswrapper[4790]: I0406 13:35:44.241935 
4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6b9c989bb6-z8t6s_75ce52d5-3320-40e1-8d64-42d12e2fa4c8/manager/0.log" Apr 06 13:35:44 crc kubenswrapper[4790]: I0406 13:35:44.449638 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-88ccbfc66-9pp57_9430e8f1-17ec-4eff-8d9c-d54553956f8d/manager/0.log" Apr 06 13:35:44 crc kubenswrapper[4790]: I0406 13:35:44.535894 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-dbf8bb784-6vkf8_30132a58-4c7d-4761-b73b-6d0ee27ea74e/manager/0.log" Apr 06 13:35:44 crc kubenswrapper[4790]: I0406 13:35:44.652109 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5fddf8d98f-qrxjw_ef128b3d-ea70-46cc-8928-0a557b6fbf5d/manager/0.log" Apr 06 13:35:44 crc kubenswrapper[4790]: I0406 13:35:44.732684 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-765cb856bd-7vfjz_b3d53b02-ceb5-46c3-820d-8a3b5c65f9e8/manager/0.log" Apr 06 13:35:44 crc kubenswrapper[4790]: I0406 13:35:44.953199 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-9bdbb8fd8-r64xk_82a102f6-1fb6-4f30-8f0e-d3c4352b187e/manager/0.log" Apr 06 13:35:45 crc kubenswrapper[4790]: I0406 13:35:45.106756 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64744474b-lxkq7_2f59b367-b3b9-467b-b190-5492ec84d98c/manager/0.log" Apr 06 13:35:45 crc kubenswrapper[4790]: I0406 13:35:45.170717 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7594f57946-dlxrd_6d9a1f3c-f00e-498c-ae3f-3af6c407d051/manager/0.log" Apr 06 13:35:45 crc kubenswrapper[4790]: I0406 
13:35:45.294797 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw_f9c22535-24c4-416f-98ef-fcd0299921c4/manager/0.log" Apr 06 13:35:45 crc kubenswrapper[4790]: I0406 13:35:45.514723 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-95748b946-k6fbn_a7adf6fa-28d9-4655-b720-606ab4b91117/operator/0.log" Apr 06 13:35:45 crc kubenswrapper[4790]: I0406 13:35:45.789180 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-s94ds_cd1445b3-61f1-4be0-aaf8-bee9a755cb7e/registry-server/0.log" Apr 06 13:35:45 crc kubenswrapper[4790]: I0406 13:35:45.951282 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-565fbbfdc9-msh7n_30dc85a8-d293-4324-b4af-f3b7731a5060/manager/0.log" Apr 06 13:35:46 crc kubenswrapper[4790]: I0406 13:35:46.095548 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-559d8fdb6b-9t6v2_96d39030-fa50-4568-a068-af079a592dc0/manager/0.log" Apr 06 13:35:46 crc kubenswrapper[4790]: I0406 13:35:46.306910 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-d46gz_bb09d720-75af-43a7-90dd-e497d2933183/operator/0.log" Apr 06 13:35:46 crc kubenswrapper[4790]: I0406 13:35:46.487125 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5c4dd9cdf6-kjg4j_c9e8a19f-ad0a-45ed-a45f-240d2e5d187b/manager/0.log" Apr 06 13:35:46 crc kubenswrapper[4790]: I0406 13:35:46.859918 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56bf57d759-dq7bm_d4596119-7d30-4307-8815-4355fc5ee6eb/manager/0.log" Apr 06 13:35:46 crc kubenswrapper[4790]: 
I0406 13:35:46.958463 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5d8c8cd5bb-5w2d2_61665396-3382-4fa4-8d6a-706f47b2c5b0/manager/0.log" Apr 06 13:35:47 crc kubenswrapper[4790]: I0406 13:35:47.000703 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-569745d4d8-ddglf_7a8fad23-b18e-4933-af57-3e06aee00225/manager/0.log" Apr 06 13:35:47 crc kubenswrapper[4790]: I0406 13:35:47.151453 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75df5978c-vvf85_56eefd7b-c275-40a3-8772-03ffc350736e/manager/0.log" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.138996 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591376-q22pj"] Apr 06 13:36:00 crc kubenswrapper[4790]: E0406 13:36:00.140092 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f445fe0-fb00-4a52-83a4-db9c7d660655" containerName="extract-content" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.140112 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f445fe0-fb00-4a52-83a4-db9c7d660655" containerName="extract-content" Apr 06 13:36:00 crc kubenswrapper[4790]: E0406 13:36:00.140141 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f445fe0-fb00-4a52-83a4-db9c7d660655" containerName="extract-utilities" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.140151 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f445fe0-fb00-4a52-83a4-db9c7d660655" containerName="extract-utilities" Apr 06 13:36:00 crc kubenswrapper[4790]: E0406 13:36:00.140176 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4158ed-b160-4908-9ee0-a97b76d70717" containerName="registry-server" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.140184 4790 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3f4158ed-b160-4908-9ee0-a97b76d70717" containerName="registry-server" Apr 06 13:36:00 crc kubenswrapper[4790]: E0406 13:36:00.140207 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4158ed-b160-4908-9ee0-a97b76d70717" containerName="extract-content" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.140215 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4158ed-b160-4908-9ee0-a97b76d70717" containerName="extract-content" Apr 06 13:36:00 crc kubenswrapper[4790]: E0406 13:36:00.140228 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f445fe0-fb00-4a52-83a4-db9c7d660655" containerName="registry-server" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.140235 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f445fe0-fb00-4a52-83a4-db9c7d660655" containerName="registry-server" Apr 06 13:36:00 crc kubenswrapper[4790]: E0406 13:36:00.140251 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4158ed-b160-4908-9ee0-a97b76d70717" containerName="extract-utilities" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.140259 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4158ed-b160-4908-9ee0-a97b76d70717" containerName="extract-utilities" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.140503 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4158ed-b160-4908-9ee0-a97b76d70717" containerName="registry-server" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.140518 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f445fe0-fb00-4a52-83a4-db9c7d660655" containerName="registry-server" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.141399 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591376-q22pj" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.144362 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.144864 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.145253 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.152140 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591376-q22pj"] Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.215957 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5gt2\" (UniqueName: \"kubernetes.io/projected/78b1aab1-61f0-40d2-bb49-c9bb1eea5e57-kube-api-access-c5gt2\") pod \"auto-csr-approver-29591376-q22pj\" (UID: \"78b1aab1-61f0-40d2-bb49-c9bb1eea5e57\") " pod="openshift-infra/auto-csr-approver-29591376-q22pj" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.317771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5gt2\" (UniqueName: \"kubernetes.io/projected/78b1aab1-61f0-40d2-bb49-c9bb1eea5e57-kube-api-access-c5gt2\") pod \"auto-csr-approver-29591376-q22pj\" (UID: \"78b1aab1-61f0-40d2-bb49-c9bb1eea5e57\") " pod="openshift-infra/auto-csr-approver-29591376-q22pj" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.343566 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5gt2\" (UniqueName: \"kubernetes.io/projected/78b1aab1-61f0-40d2-bb49-c9bb1eea5e57-kube-api-access-c5gt2\") pod \"auto-csr-approver-29591376-q22pj\" (UID: \"78b1aab1-61f0-40d2-bb49-c9bb1eea5e57\") " 
pod="openshift-infra/auto-csr-approver-29591376-q22pj" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.457749 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591376-q22pj" Apr 06 13:36:00 crc kubenswrapper[4790]: I0406 13:36:00.941170 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591376-q22pj"] Apr 06 13:36:01 crc kubenswrapper[4790]: I0406 13:36:01.074797 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591376-q22pj" event={"ID":"78b1aab1-61f0-40d2-bb49-c9bb1eea5e57","Type":"ContainerStarted","Data":"c653e363c0844286a41402584e33155f16bce03dd5424be115a78f33b9f24f55"} Apr 06 13:36:03 crc kubenswrapper[4790]: I0406 13:36:03.095546 4790 generic.go:334] "Generic (PLEG): container finished" podID="78b1aab1-61f0-40d2-bb49-c9bb1eea5e57" containerID="9a1e93103f5b59a1cd5ecc606e7b5a17aa8449daa462d53bdb2f54ae8e227fdb" exitCode=0 Apr 06 13:36:03 crc kubenswrapper[4790]: I0406 13:36:03.095612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591376-q22pj" event={"ID":"78b1aab1-61f0-40d2-bb49-c9bb1eea5e57","Type":"ContainerDied","Data":"9a1e93103f5b59a1cd5ecc606e7b5a17aa8449daa462d53bdb2f54ae8e227fdb"} Apr 06 13:36:04 crc kubenswrapper[4790]: I0406 13:36:04.521440 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591376-q22pj" Apr 06 13:36:04 crc kubenswrapper[4790]: I0406 13:36:04.603794 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5gt2\" (UniqueName: \"kubernetes.io/projected/78b1aab1-61f0-40d2-bb49-c9bb1eea5e57-kube-api-access-c5gt2\") pod \"78b1aab1-61f0-40d2-bb49-c9bb1eea5e57\" (UID: \"78b1aab1-61f0-40d2-bb49-c9bb1eea5e57\") " Apr 06 13:36:04 crc kubenswrapper[4790]: I0406 13:36:04.613443 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b1aab1-61f0-40d2-bb49-c9bb1eea5e57-kube-api-access-c5gt2" (OuterVolumeSpecName: "kube-api-access-c5gt2") pod "78b1aab1-61f0-40d2-bb49-c9bb1eea5e57" (UID: "78b1aab1-61f0-40d2-bb49-c9bb1eea5e57"). InnerVolumeSpecName "kube-api-access-c5gt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:36:04 crc kubenswrapper[4790]: I0406 13:36:04.706316 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5gt2\" (UniqueName: \"kubernetes.io/projected/78b1aab1-61f0-40d2-bb49-c9bb1eea5e57-kube-api-access-c5gt2\") on node \"crc\" DevicePath \"\"" Apr 06 13:36:05 crc kubenswrapper[4790]: I0406 13:36:05.119434 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591376-q22pj" event={"ID":"78b1aab1-61f0-40d2-bb49-c9bb1eea5e57","Type":"ContainerDied","Data":"c653e363c0844286a41402584e33155f16bce03dd5424be115a78f33b9f24f55"} Apr 06 13:36:05 crc kubenswrapper[4790]: I0406 13:36:05.119481 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c653e363c0844286a41402584e33155f16bce03dd5424be115a78f33b9f24f55" Apr 06 13:36:05 crc kubenswrapper[4790]: I0406 13:36:05.119541 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591376-q22pj" Apr 06 13:36:05 crc kubenswrapper[4790]: I0406 13:36:05.593485 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591370-9lb78"] Apr 06 13:36:05 crc kubenswrapper[4790]: I0406 13:36:05.607799 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591370-9lb78"] Apr 06 13:36:05 crc kubenswrapper[4790]: I0406 13:36:05.686186 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c97babb-c360-4264-961a-015b42cf709e" path="/var/lib/kubelet/pods/8c97babb-c360-4264-961a-015b42cf709e/volumes" Apr 06 13:36:07 crc kubenswrapper[4790]: I0406 13:36:07.426484 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-x74dz_35f420a5-ec2f-4d37-94ea-af000df33824/control-plane-machine-set-operator/0.log" Apr 06 13:36:07 crc kubenswrapper[4790]: I0406 13:36:07.637701 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fr5xt_5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e/machine-api-operator/0.log" Apr 06 13:36:07 crc kubenswrapper[4790]: I0406 13:36:07.654117 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fr5xt_5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e/kube-rbac-proxy/0.log" Apr 06 13:36:19 crc kubenswrapper[4790]: I0406 13:36:19.854877 4790 scope.go:117] "RemoveContainer" containerID="623dfdb792d4a3d02db65c8b80765322f29461a90bb6d3420936b95c22f52c69" Apr 06 13:36:20 crc kubenswrapper[4790]: I0406 13:36:20.990757 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-jmqjq_7b2dea07-951f-4a31-ae96-5465449fbae8/cert-manager-controller/0.log" Apr 06 13:36:21 crc kubenswrapper[4790]: I0406 13:36:21.033897 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-858654f9db-jmqjq_7b2dea07-951f-4a31-ae96-5465449fbae8/cert-manager-controller/1.log" Apr 06 13:36:21 crc kubenswrapper[4790]: I0406 13:36:21.237204 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-n5t6m_ec02624d-3d5a-423d-818a-1422646a42a9/cert-manager-cainjector/1.log" Apr 06 13:36:21 crc kubenswrapper[4790]: I0406 13:36:21.295552 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-n5t6m_ec02624d-3d5a-423d-818a-1422646a42a9/cert-manager-cainjector/0.log" Apr 06 13:36:21 crc kubenswrapper[4790]: I0406 13:36:21.394234 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-tm4jb_0aeb284c-cf28-4dc2-aa1c-d43f80e4fba7/cert-manager-webhook/0.log" Apr 06 13:36:34 crc kubenswrapper[4790]: I0406 13:36:34.063239 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7b5ddc4dc7-hx2t5_34686f03-565c-4d7b-a0d5-f3b2d93e77dd/nmstate-console-plugin/0.log" Apr 06 13:36:34 crc kubenswrapper[4790]: I0406 13:36:34.214010 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-rdg86_fb4a8135-7355-406b-a851-dce0109face5/nmstate-handler/0.log" Apr 06 13:36:34 crc kubenswrapper[4790]: I0406 13:36:34.274207 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-khm89_baf52e27-af4b-4863-9e09-a3f11f497db9/kube-rbac-proxy/0.log" Apr 06 13:36:34 crc kubenswrapper[4790]: I0406 13:36:34.324559 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-khm89_baf52e27-af4b-4863-9e09-a3f11f497db9/nmstate-metrics/0.log" Apr 06 13:36:34 crc kubenswrapper[4790]: I0406 13:36:34.449464 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-6b8c6447b-srf9k_8852468a-2987-493e-bdd3-1a0e0b3b0721/nmstate-operator/0.log" Apr 06 13:36:34 crc kubenswrapper[4790]: I0406 13:36:34.538076 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-5vn4v_8691bf62-154a-4c5b-8d00-066f07c030fa/nmstate-webhook/0.log" Apr 06 13:36:39 crc kubenswrapper[4790]: I0406 13:36:39.753959 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:36:39 crc kubenswrapper[4790]: I0406 13:36:39.754584 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:36:47 crc kubenswrapper[4790]: I0406 13:36:47.480222 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-86dff4bf76-bm9zd_ae41d528-6f4b-45e5-84f2-5d9eae998759/prometheus-operator/0.log" Apr 06 13:36:47 crc kubenswrapper[4790]: I0406 13:36:47.735553 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-795cd6b797-pw54j_06f8ee69-3814-40ed-8ed2-5913509658de/prometheus-operator-admission-webhook/0.log" Apr 06 13:36:47 crc kubenswrapper[4790]: I0406 13:36:47.737599 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw_ea4918c9-2a05-4c75-9d68-662e0a0fc175/prometheus-operator-admission-webhook/0.log" Apr 06 13:36:47 crc kubenswrapper[4790]: I0406 
13:36:47.889061 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-dd944d769-7fscl_d990eb66-396e-4b05-acab-eaa30a6fbd34/operator/0.log" Apr 06 13:36:47 crc kubenswrapper[4790]: I0406 13:36:47.956023 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-74445bf4b8-spbpr_0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686/perses-operator/0.log" Apr 06 13:37:00 crc kubenswrapper[4790]: I0406 13:37:00.893907 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bb64cd5d7-lfmpj_fb497468-6169-47da-879b-96e49435e345/kube-rbac-proxy/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.005463 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bb64cd5d7-lfmpj_fb497468-6169-47da-879b-96e49435e345/controller/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.134605 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-frr-files/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.248999 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-frr-files/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.296586 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-reloader/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.304522 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-metrics/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.393895 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-reloader/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 
13:37:01.559764 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-frr-files/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.566257 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-reloader/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.587397 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-metrics/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.608914 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-metrics/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.737520 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-frr-files/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.776964 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-reloader/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.804574 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-metrics/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.846114 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/controller/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.971694 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/frr-metrics/0.log" Apr 06 13:37:01 crc kubenswrapper[4790]: I0406 13:37:01.983196 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/kube-rbac-proxy/0.log" Apr 06 13:37:02 crc kubenswrapper[4790]: I0406 13:37:02.093456 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/kube-rbac-proxy-frr/0.log" Apr 06 13:37:02 crc kubenswrapper[4790]: I0406 13:37:02.207496 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/reloader/0.log" Apr 06 13:37:02 crc kubenswrapper[4790]: I0406 13:37:02.315584 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-j6z5c_7dd0b643-0c2a-467b-9267-caaba887289b/frr-k8s-webhook-server/0.log" Apr 06 13:37:02 crc kubenswrapper[4790]: I0406 13:37:02.424781 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-cc74d7bc4-98drt_575f4b30-b909-40a3-aeee-d31e6b9238d5/manager/0.log" Apr 06 13:37:02 crc kubenswrapper[4790]: I0406 13:37:02.552246 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-54f9f8cc7-p9gpk_d56b74e9-8981-420d-81d5-4b1a20286d52/webhook-server/0.log" Apr 06 13:37:02 crc kubenswrapper[4790]: I0406 13:37:02.743892 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zlz5n_5ca71ea5-d4c2-497d-945e-ac51b1fbf618/kube-rbac-proxy/0.log" Apr 06 13:37:03 crc kubenswrapper[4790]: I0406 13:37:03.292277 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zlz5n_5ca71ea5-d4c2-497d-945e-ac51b1fbf618/speaker/0.log" Apr 06 13:37:04 crc kubenswrapper[4790]: I0406 13:37:04.083622 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/frr/0.log" Apr 06 13:37:09 crc kubenswrapper[4790]: I0406 13:37:09.753876 4790 patch_prober.go:28] 
interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:37:09 crc kubenswrapper[4790]: I0406 13:37:09.754521 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:37:15 crc kubenswrapper[4790]: I0406 13:37:15.669820 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj_89f6afcc-5eee-48c2-88fe-2bf924bd9a0c/util/0.log" Apr 06 13:37:15 crc kubenswrapper[4790]: I0406 13:37:15.807977 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj_89f6afcc-5eee-48c2-88fe-2bf924bd9a0c/util/0.log" Apr 06 13:37:15 crc kubenswrapper[4790]: I0406 13:37:15.860469 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj_89f6afcc-5eee-48c2-88fe-2bf924bd9a0c/pull/0.log" Apr 06 13:37:15 crc kubenswrapper[4790]: I0406 13:37:15.865671 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj_89f6afcc-5eee-48c2-88fe-2bf924bd9a0c/pull/0.log" Apr 06 13:37:16 crc kubenswrapper[4790]: I0406 13:37:16.052498 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj_89f6afcc-5eee-48c2-88fe-2bf924bd9a0c/util/0.log" Apr 06 13:37:16 crc 
kubenswrapper[4790]: I0406 13:37:16.081162 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj_89f6afcc-5eee-48c2-88fe-2bf924bd9a0c/pull/0.log" Apr 06 13:37:16 crc kubenswrapper[4790]: I0406 13:37:16.102713 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj_89f6afcc-5eee-48c2-88fe-2bf924bd9a0c/extract/0.log" Apr 06 13:37:16 crc kubenswrapper[4790]: I0406 13:37:16.310744 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6_8a7a0dc8-d9f5-4844-bd9a-5377990f86c4/util/0.log" Apr 06 13:37:16 crc kubenswrapper[4790]: I0406 13:37:16.452186 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6_8a7a0dc8-d9f5-4844-bd9a-5377990f86c4/util/0.log" Apr 06 13:37:16 crc kubenswrapper[4790]: I0406 13:37:16.460225 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6_8a7a0dc8-d9f5-4844-bd9a-5377990f86c4/pull/0.log" Apr 06 13:37:16 crc kubenswrapper[4790]: I0406 13:37:16.498771 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6_8a7a0dc8-d9f5-4844-bd9a-5377990f86c4/pull/0.log" Apr 06 13:37:16 crc kubenswrapper[4790]: I0406 13:37:16.636883 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6_8a7a0dc8-d9f5-4844-bd9a-5377990f86c4/util/0.log" Apr 06 13:37:16 crc kubenswrapper[4790]: I0406 13:37:16.651876 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6_8a7a0dc8-d9f5-4844-bd9a-5377990f86c4/extract/0.log" Apr 06 13:37:16 crc kubenswrapper[4790]: I0406 13:37:16.656475 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6_8a7a0dc8-d9f5-4844-bd9a-5377990f86c4/pull/0.log" Apr 06 13:37:16 crc kubenswrapper[4790]: I0406 13:37:16.820997 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8_4b74456e-c45b-4efb-9a0f-952b5663e994/util/0.log" Apr 06 13:37:16 crc kubenswrapper[4790]: I0406 13:37:16.985611 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8_4b74456e-c45b-4efb-9a0f-952b5663e994/pull/0.log" Apr 06 13:37:17 crc kubenswrapper[4790]: I0406 13:37:17.007846 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8_4b74456e-c45b-4efb-9a0f-952b5663e994/util/0.log" Apr 06 13:37:17 crc kubenswrapper[4790]: I0406 13:37:17.042395 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8_4b74456e-c45b-4efb-9a0f-952b5663e994/pull/0.log" Apr 06 13:37:17 crc kubenswrapper[4790]: I0406 13:37:17.137152 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8_4b74456e-c45b-4efb-9a0f-952b5663e994/pull/0.log" Apr 06 13:37:17 crc kubenswrapper[4790]: I0406 13:37:17.162256 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8_4b74456e-c45b-4efb-9a0f-952b5663e994/util/0.log" Apr 06 
13:37:17 crc kubenswrapper[4790]: I0406 13:37:17.229584 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8_4b74456e-c45b-4efb-9a0f-952b5663e994/extract/0.log" Apr 06 13:37:17 crc kubenswrapper[4790]: I0406 13:37:17.306691 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kkc7k_31734947-0b62-4b08-a3f7-1547b401f159/extract-utilities/0.log" Apr 06 13:37:17 crc kubenswrapper[4790]: I0406 13:37:17.449663 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kkc7k_31734947-0b62-4b08-a3f7-1547b401f159/extract-utilities/0.log" Apr 06 13:37:17 crc kubenswrapper[4790]: I0406 13:37:17.482006 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kkc7k_31734947-0b62-4b08-a3f7-1547b401f159/extract-content/0.log" Apr 06 13:37:17 crc kubenswrapper[4790]: I0406 13:37:17.498341 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kkc7k_31734947-0b62-4b08-a3f7-1547b401f159/extract-content/0.log" Apr 06 13:37:17 crc kubenswrapper[4790]: I0406 13:37:17.642557 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kkc7k_31734947-0b62-4b08-a3f7-1547b401f159/extract-content/0.log" Apr 06 13:37:17 crc kubenswrapper[4790]: I0406 13:37:17.696325 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kkc7k_31734947-0b62-4b08-a3f7-1547b401f159/extract-utilities/0.log" Apr 06 13:37:17 crc kubenswrapper[4790]: I0406 13:37:17.885071 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxf27_7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8/extract-utilities/0.log" Apr 06 13:37:18 crc kubenswrapper[4790]: I0406 13:37:18.117020 4790 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxf27_7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8/extract-utilities/0.log" Apr 06 13:37:18 crc kubenswrapper[4790]: I0406 13:37:18.130789 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxf27_7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8/extract-content/0.log" Apr 06 13:37:18 crc kubenswrapper[4790]: I0406 13:37:18.215360 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxf27_7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8/extract-content/0.log" Apr 06 13:37:18 crc kubenswrapper[4790]: I0406 13:37:18.367461 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxf27_7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8/extract-content/0.log" Apr 06 13:37:18 crc kubenswrapper[4790]: I0406 13:37:18.378155 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxf27_7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8/extract-utilities/0.log" Apr 06 13:37:18 crc kubenswrapper[4790]: I0406 13:37:18.574311 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/util/0.log" Apr 06 13:37:18 crc kubenswrapper[4790]: I0406 13:37:18.605331 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kkc7k_31734947-0b62-4b08-a3f7-1547b401f159/registry-server/0.log" Apr 06 13:37:18 crc kubenswrapper[4790]: I0406 13:37:18.920376 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/util/0.log" Apr 06 13:37:18 crc kubenswrapper[4790]: I0406 13:37:18.966511 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/pull/0.log" Apr 06 13:37:19 crc kubenswrapper[4790]: I0406 13:37:19.000087 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/pull/0.log" Apr 06 13:37:19 crc kubenswrapper[4790]: I0406 13:37:19.145026 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/util/0.log" Apr 06 13:37:19 crc kubenswrapper[4790]: I0406 13:37:19.193280 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/pull/0.log" Apr 06 13:37:19 crc kubenswrapper[4790]: I0406 13:37:19.298812 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/extract/0.log" Apr 06 13:37:19 crc kubenswrapper[4790]: I0406 13:37:19.375993 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxf27_7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8/registry-server/0.log" Apr 06 13:37:19 crc kubenswrapper[4790]: I0406 13:37:19.454215 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_4855f76c-a247-4c86-846c-ad5ecd18c434/util/0.log" Apr 06 13:37:19 crc kubenswrapper[4790]: I0406 13:37:19.629750 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_4855f76c-a247-4c86-846c-ad5ecd18c434/util/0.log" Apr 06 13:37:19 crc 
kubenswrapper[4790]: I0406 13:37:19.650719 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_4855f76c-a247-4c86-846c-ad5ecd18c434/pull/0.log" Apr 06 13:37:19 crc kubenswrapper[4790]: I0406 13:37:19.659994 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_4855f76c-a247-4c86-846c-ad5ecd18c434/pull/0.log" Apr 06 13:37:19 crc kubenswrapper[4790]: I0406 13:37:19.778664 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_4855f76c-a247-4c86-846c-ad5ecd18c434/util/0.log" Apr 06 13:37:19 crc kubenswrapper[4790]: I0406 13:37:19.832385 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_4855f76c-a247-4c86-846c-ad5ecd18c434/pull/0.log" Apr 06 13:37:19 crc kubenswrapper[4790]: I0406 13:37:19.865377 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_4855f76c-a247-4c86-846c-ad5ecd18c434/extract/0.log" Apr 06 13:37:19 crc kubenswrapper[4790]: I0406 13:37:19.947286 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-vm6jd_bcbcee2e-3daf-4238-ac27-16f663c8b184/marketplace-operator/0.log" Apr 06 13:37:20 crc kubenswrapper[4790]: I0406 13:37:20.001289 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trpr6_1524c101-74ea-4a4a-b54f-c2f9201725e1/extract-utilities/0.log" Apr 06 13:37:20 crc kubenswrapper[4790]: I0406 13:37:20.180692 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-trpr6_1524c101-74ea-4a4a-b54f-c2f9201725e1/extract-content/0.log" Apr 06 13:37:20 crc kubenswrapper[4790]: I0406 13:37:20.181170 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trpr6_1524c101-74ea-4a4a-b54f-c2f9201725e1/extract-utilities/0.log" Apr 06 13:37:20 crc kubenswrapper[4790]: I0406 13:37:20.187355 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trpr6_1524c101-74ea-4a4a-b54f-c2f9201725e1/extract-content/0.log" Apr 06 13:37:20 crc kubenswrapper[4790]: I0406 13:37:20.362567 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trpr6_1524c101-74ea-4a4a-b54f-c2f9201725e1/extract-utilities/0.log" Apr 06 13:37:20 crc kubenswrapper[4790]: I0406 13:37:20.438221 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trpr6_1524c101-74ea-4a4a-b54f-c2f9201725e1/extract-content/0.log" Apr 06 13:37:20 crc kubenswrapper[4790]: I0406 13:37:20.440653 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z2nx9_0024beb8-cee9-427c-8267-657119a613c5/extract-utilities/0.log" Apr 06 13:37:20 crc kubenswrapper[4790]: I0406 13:37:20.644497 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z2nx9_0024beb8-cee9-427c-8267-657119a613c5/extract-utilities/0.log" Apr 06 13:37:20 crc kubenswrapper[4790]: I0406 13:37:20.696365 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z2nx9_0024beb8-cee9-427c-8267-657119a613c5/extract-content/0.log" Apr 06 13:37:20 crc kubenswrapper[4790]: I0406 13:37:20.705774 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-trpr6_1524c101-74ea-4a4a-b54f-c2f9201725e1/registry-server/0.log" Apr 06 13:37:20 crc kubenswrapper[4790]: I0406 13:37:20.749898 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z2nx9_0024beb8-cee9-427c-8267-657119a613c5/extract-content/0.log" Apr 06 13:37:20 crc kubenswrapper[4790]: I0406 13:37:20.907998 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z2nx9_0024beb8-cee9-427c-8267-657119a613c5/extract-content/0.log" Apr 06 13:37:20 crc kubenswrapper[4790]: I0406 13:37:20.940684 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z2nx9_0024beb8-cee9-427c-8267-657119a613c5/extract-utilities/0.log" Apr 06 13:37:21 crc kubenswrapper[4790]: I0406 13:37:21.550679 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z2nx9_0024beb8-cee9-427c-8267-657119a613c5/registry-server/0.log" Apr 06 13:37:32 crc kubenswrapper[4790]: I0406 13:37:32.679339 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw_ea4918c9-2a05-4c75-9d68-662e0a0fc175/prometheus-operator-admission-webhook/0.log" Apr 06 13:37:32 crc kubenswrapper[4790]: I0406 13:37:32.772018 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-86dff4bf76-bm9zd_ae41d528-6f4b-45e5-84f2-5d9eae998759/prometheus-operator/0.log" Apr 06 13:37:32 crc kubenswrapper[4790]: I0406 13:37:32.774098 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-795cd6b797-pw54j_06f8ee69-3814-40ed-8ed2-5913509658de/prometheus-operator-admission-webhook/0.log" Apr 06 13:37:32 crc kubenswrapper[4790]: I0406 13:37:32.919864 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-dd944d769-7fscl_d990eb66-396e-4b05-acab-eaa30a6fbd34/operator/0.log" Apr 06 13:37:32 crc kubenswrapper[4790]: I0406 13:37:32.956596 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-74445bf4b8-spbpr_0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686/perses-operator/0.log" Apr 06 13:37:39 crc kubenswrapper[4790]: I0406 13:37:39.752968 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:37:39 crc kubenswrapper[4790]: I0406 13:37:39.754633 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:37:39 crc kubenswrapper[4790]: I0406 13:37:39.754775 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 13:37:39 crc kubenswrapper[4790]: I0406 13:37:39.755781 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 13:37:39 crc kubenswrapper[4790]: I0406 13:37:39.755984 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" 
podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149" gracePeriod=600 Apr 06 13:37:39 crc kubenswrapper[4790]: E0406 13:37:39.904141 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:37:40 crc kubenswrapper[4790]: I0406 13:37:40.061849 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149" exitCode=0 Apr 06 13:37:40 crc kubenswrapper[4790]: I0406 13:37:40.061873 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"} Apr 06 13:37:40 crc kubenswrapper[4790]: I0406 13:37:40.062334 4790 scope.go:117] "RemoveContainer" containerID="8954244d49e01bb4bfdeaf3a239df96426fbeefa509328f41d7ff8bc8b4b5460" Apr 06 13:37:40 crc kubenswrapper[4790]: I0406 13:37:40.063055 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149" Apr 06 13:37:40 crc kubenswrapper[4790]: E0406 13:37:40.063338 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:37:52 crc kubenswrapper[4790]: I0406 13:37:52.675856 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149" Apr 06 13:37:52 crc kubenswrapper[4790]: E0406 13:37:52.676643 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:38:00 crc kubenswrapper[4790]: I0406 13:38:00.154496 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591378-qdwd6"] Apr 06 13:38:00 crc kubenswrapper[4790]: E0406 13:38:00.155446 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b1aab1-61f0-40d2-bb49-c9bb1eea5e57" containerName="oc" Apr 06 13:38:00 crc kubenswrapper[4790]: I0406 13:38:00.155461 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b1aab1-61f0-40d2-bb49-c9bb1eea5e57" containerName="oc" Apr 06 13:38:00 crc kubenswrapper[4790]: I0406 13:38:00.155743 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b1aab1-61f0-40d2-bb49-c9bb1eea5e57" containerName="oc" Apr 06 13:38:00 crc kubenswrapper[4790]: I0406 13:38:00.156580 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591378-qdwd6" Apr 06 13:38:00 crc kubenswrapper[4790]: I0406 13:38:00.163669 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:38:00 crc kubenswrapper[4790]: I0406 13:38:00.164070 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:38:00 crc kubenswrapper[4790]: I0406 13:38:00.164281 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:38:00 crc kubenswrapper[4790]: I0406 13:38:00.168469 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591378-qdwd6"] Apr 06 13:38:00 crc kubenswrapper[4790]: I0406 13:38:00.223110 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sw7s\" (UniqueName: \"kubernetes.io/projected/99a82819-1618-4cd0-940a-39666ce35350-kube-api-access-5sw7s\") pod \"auto-csr-approver-29591378-qdwd6\" (UID: \"99a82819-1618-4cd0-940a-39666ce35350\") " pod="openshift-infra/auto-csr-approver-29591378-qdwd6" Apr 06 13:38:00 crc kubenswrapper[4790]: I0406 13:38:00.325065 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sw7s\" (UniqueName: \"kubernetes.io/projected/99a82819-1618-4cd0-940a-39666ce35350-kube-api-access-5sw7s\") pod \"auto-csr-approver-29591378-qdwd6\" (UID: \"99a82819-1618-4cd0-940a-39666ce35350\") " pod="openshift-infra/auto-csr-approver-29591378-qdwd6" Apr 06 13:38:00 crc kubenswrapper[4790]: I0406 13:38:00.352039 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sw7s\" (UniqueName: \"kubernetes.io/projected/99a82819-1618-4cd0-940a-39666ce35350-kube-api-access-5sw7s\") pod \"auto-csr-approver-29591378-qdwd6\" (UID: \"99a82819-1618-4cd0-940a-39666ce35350\") " 
pod="openshift-infra/auto-csr-approver-29591378-qdwd6" Apr 06 13:38:00 crc kubenswrapper[4790]: I0406 13:38:00.482704 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591378-qdwd6" Apr 06 13:38:00 crc kubenswrapper[4790]: I0406 13:38:00.957791 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591378-qdwd6"] Apr 06 13:38:01 crc kubenswrapper[4790]: I0406 13:38:01.310440 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591378-qdwd6" event={"ID":"99a82819-1618-4cd0-940a-39666ce35350","Type":"ContainerStarted","Data":"c6512dea86e2ee2bbfe5f20c4b5d9ce3b6b3cbaf54720152214dd2d19468abe1"} Apr 06 13:38:02 crc kubenswrapper[4790]: I0406 13:38:02.320233 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591378-qdwd6" event={"ID":"99a82819-1618-4cd0-940a-39666ce35350","Type":"ContainerStarted","Data":"90e2de3c744c434551fe298b6c4eda6c10785983079ec1a8d6d81f21b7a12a8c"} Apr 06 13:38:02 crc kubenswrapper[4790]: I0406 13:38:02.343919 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591378-qdwd6" podStartSLOduration=1.4571952449999999 podStartE2EDuration="2.343895323s" podCreationTimestamp="2026-04-06 13:38:00 +0000 UTC" firstStartedPulling="2026-04-06 13:38:00.964646321 +0000 UTC m=+6059.952389187" lastFinishedPulling="2026-04-06 13:38:01.851346399 +0000 UTC m=+6060.839089265" observedRunningTime="2026-04-06 13:38:02.337913183 +0000 UTC m=+6061.325656049" watchObservedRunningTime="2026-04-06 13:38:02.343895323 +0000 UTC m=+6061.331638199" Apr 06 13:38:03 crc kubenswrapper[4790]: I0406 13:38:03.330917 4790 generic.go:334] "Generic (PLEG): container finished" podID="99a82819-1618-4cd0-940a-39666ce35350" containerID="90e2de3c744c434551fe298b6c4eda6c10785983079ec1a8d6d81f21b7a12a8c" exitCode=0 Apr 06 13:38:03 crc 
kubenswrapper[4790]: I0406 13:38:03.331246 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591378-qdwd6" event={"ID":"99a82819-1618-4cd0-940a-39666ce35350","Type":"ContainerDied","Data":"90e2de3c744c434551fe298b6c4eda6c10785983079ec1a8d6d81f21b7a12a8c"} Apr 06 13:38:04 crc kubenswrapper[4790]: I0406 13:38:04.718686 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591378-qdwd6" Apr 06 13:38:04 crc kubenswrapper[4790]: I0406 13:38:04.838597 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sw7s\" (UniqueName: \"kubernetes.io/projected/99a82819-1618-4cd0-940a-39666ce35350-kube-api-access-5sw7s\") pod \"99a82819-1618-4cd0-940a-39666ce35350\" (UID: \"99a82819-1618-4cd0-940a-39666ce35350\") " Apr 06 13:38:04 crc kubenswrapper[4790]: I0406 13:38:04.848069 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a82819-1618-4cd0-940a-39666ce35350-kube-api-access-5sw7s" (OuterVolumeSpecName: "kube-api-access-5sw7s") pod "99a82819-1618-4cd0-940a-39666ce35350" (UID: "99a82819-1618-4cd0-940a-39666ce35350"). InnerVolumeSpecName "kube-api-access-5sw7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:38:04 crc kubenswrapper[4790]: I0406 13:38:04.941450 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sw7s\" (UniqueName: \"kubernetes.io/projected/99a82819-1618-4cd0-940a-39666ce35350-kube-api-access-5sw7s\") on node \"crc\" DevicePath \"\"" Apr 06 13:38:05 crc kubenswrapper[4790]: I0406 13:38:05.350236 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591378-qdwd6" event={"ID":"99a82819-1618-4cd0-940a-39666ce35350","Type":"ContainerDied","Data":"c6512dea86e2ee2bbfe5f20c4b5d9ce3b6b3cbaf54720152214dd2d19468abe1"} Apr 06 13:38:05 crc kubenswrapper[4790]: I0406 13:38:05.350285 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6512dea86e2ee2bbfe5f20c4b5d9ce3b6b3cbaf54720152214dd2d19468abe1" Apr 06 13:38:05 crc kubenswrapper[4790]: I0406 13:38:05.350286 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591378-qdwd6" Apr 06 13:38:05 crc kubenswrapper[4790]: I0406 13:38:05.678762 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149" Apr 06 13:38:05 crc kubenswrapper[4790]: E0406 13:38:05.681961 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:38:05 crc kubenswrapper[4790]: I0406 13:38:05.806532 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591372-xgz22"] Apr 06 13:38:05 crc kubenswrapper[4790]: I0406 13:38:05.816715 4790 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591372-xgz22"] Apr 06 13:38:07 crc kubenswrapper[4790]: I0406 13:38:07.690514 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72" path="/var/lib/kubelet/pods/1f7c1f6f-7dc5-4ad9-9392-a3a10d257e72/volumes" Apr 06 13:38:19 crc kubenswrapper[4790]: I0406 13:38:19.963155 4790 scope.go:117] "RemoveContainer" containerID="f398cc8a4a728a6a75abc400cf9865ee65b7670542e2f84a48e63582723f6ad4" Apr 06 13:38:20 crc kubenswrapper[4790]: I0406 13:38:20.676423 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149" Apr 06 13:38:20 crc kubenswrapper[4790]: E0406 13:38:20.676966 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:38:33 crc kubenswrapper[4790]: I0406 13:38:33.675619 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149" Apr 06 13:38:33 crc kubenswrapper[4790]: E0406 13:38:33.676762 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:38:44 crc kubenswrapper[4790]: I0406 13:38:44.676293 4790 scope.go:117] "RemoveContainer" 
containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149" Apr 06 13:38:44 crc kubenswrapper[4790]: E0406 13:38:44.677074 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:38:56 crc kubenswrapper[4790]: I0406 13:38:56.675167 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149" Apr 06 13:38:56 crc kubenswrapper[4790]: E0406 13:38:56.675821 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:39:09 crc kubenswrapper[4790]: I0406 13:39:09.675789 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149" Apr 06 13:39:09 crc kubenswrapper[4790]: E0406 13:39:09.676522 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:39:21 crc kubenswrapper[4790]: I0406 13:39:21.688787 4790 scope.go:117] 
"RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149" Apr 06 13:39:21 crc kubenswrapper[4790]: E0406 13:39:21.689862 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:39:35 crc kubenswrapper[4790]: I0406 13:39:35.676061 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149" Apr 06 13:39:35 crc kubenswrapper[4790]: E0406 13:39:35.676888 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:39:42 crc kubenswrapper[4790]: I0406 13:39:42.403880 4790 generic.go:334] "Generic (PLEG): container finished" podID="3a74e336-4367-4e9a-ac78-f69716c33ae0" containerID="0746c60c7eff69d7541a55c09ab3960b393d8fa7a9b1dcc2e06f992c712fb7dd" exitCode=0 Apr 06 13:39:42 crc kubenswrapper[4790]: I0406 13:39:42.404002 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zks8r/must-gather-78bwh" event={"ID":"3a74e336-4367-4e9a-ac78-f69716c33ae0","Type":"ContainerDied","Data":"0746c60c7eff69d7541a55c09ab3960b393d8fa7a9b1dcc2e06f992c712fb7dd"} Apr 06 13:39:42 crc kubenswrapper[4790]: I0406 13:39:42.405154 4790 scope.go:117] "RemoveContainer" 
containerID="0746c60c7eff69d7541a55c09ab3960b393d8fa7a9b1dcc2e06f992c712fb7dd" Apr 06 13:39:42 crc kubenswrapper[4790]: I0406 13:39:42.948767 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zks8r_must-gather-78bwh_3a74e336-4367-4e9a-ac78-f69716c33ae0/gather/0.log" Apr 06 13:39:50 crc kubenswrapper[4790]: I0406 13:39:50.676359 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149" Apr 06 13:39:50 crc kubenswrapper[4790]: E0406 13:39:50.677977 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:39:51 crc kubenswrapper[4790]: I0406 13:39:51.898002 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zks8r/must-gather-78bwh"] Apr 06 13:39:51 crc kubenswrapper[4790]: I0406 13:39:51.898242 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-zks8r/must-gather-78bwh" podUID="3a74e336-4367-4e9a-ac78-f69716c33ae0" containerName="copy" containerID="cri-o://9420b74553a8fb668a561594fba02e0384d054b198843336c5adcf838ff617f8" gracePeriod=2 Apr 06 13:39:51 crc kubenswrapper[4790]: I0406 13:39:51.913030 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zks8r/must-gather-78bwh"] Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.353409 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zks8r_must-gather-78bwh_3a74e336-4367-4e9a-ac78-f69716c33ae0/copy/0.log" Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.354459 4790 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-must-gather-zks8r/must-gather-78bwh" Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.434631 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a74e336-4367-4e9a-ac78-f69716c33ae0-must-gather-output\") pod \"3a74e336-4367-4e9a-ac78-f69716c33ae0\" (UID: \"3a74e336-4367-4e9a-ac78-f69716c33ae0\") " Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.434865 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkzjf\" (UniqueName: \"kubernetes.io/projected/3a74e336-4367-4e9a-ac78-f69716c33ae0-kube-api-access-qkzjf\") pod \"3a74e336-4367-4e9a-ac78-f69716c33ae0\" (UID: \"3a74e336-4367-4e9a-ac78-f69716c33ae0\") " Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.453019 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a74e336-4367-4e9a-ac78-f69716c33ae0-kube-api-access-qkzjf" (OuterVolumeSpecName: "kube-api-access-qkzjf") pod "3a74e336-4367-4e9a-ac78-f69716c33ae0" (UID: "3a74e336-4367-4e9a-ac78-f69716c33ae0"). InnerVolumeSpecName "kube-api-access-qkzjf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.535397 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zks8r_must-gather-78bwh_3a74e336-4367-4e9a-ac78-f69716c33ae0/copy/0.log" Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.535806 4790 generic.go:334] "Generic (PLEG): container finished" podID="3a74e336-4367-4e9a-ac78-f69716c33ae0" containerID="9420b74553a8fb668a561594fba02e0384d054b198843336c5adcf838ff617f8" exitCode=143 Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.535876 4790 scope.go:117] "RemoveContainer" containerID="9420b74553a8fb668a561594fba02e0384d054b198843336c5adcf838ff617f8" Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.536024 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zks8r/must-gather-78bwh" Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.544316 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkzjf\" (UniqueName: \"kubernetes.io/projected/3a74e336-4367-4e9a-ac78-f69716c33ae0-kube-api-access-qkzjf\") on node \"crc\" DevicePath \"\"" Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.576173 4790 scope.go:117] "RemoveContainer" containerID="0746c60c7eff69d7541a55c09ab3960b393d8fa7a9b1dcc2e06f992c712fb7dd" Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.647978 4790 scope.go:117] "RemoveContainer" containerID="9420b74553a8fb668a561594fba02e0384d054b198843336c5adcf838ff617f8" Apr 06 13:39:52 crc kubenswrapper[4790]: E0406 13:39:52.648448 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9420b74553a8fb668a561594fba02e0384d054b198843336c5adcf838ff617f8\": container with ID starting with 9420b74553a8fb668a561594fba02e0384d054b198843336c5adcf838ff617f8 not found: ID does not exist" 
containerID="9420b74553a8fb668a561594fba02e0384d054b198843336c5adcf838ff617f8" Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.648496 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9420b74553a8fb668a561594fba02e0384d054b198843336c5adcf838ff617f8"} err="failed to get container status \"9420b74553a8fb668a561594fba02e0384d054b198843336c5adcf838ff617f8\": rpc error: code = NotFound desc = could not find container \"9420b74553a8fb668a561594fba02e0384d054b198843336c5adcf838ff617f8\": container with ID starting with 9420b74553a8fb668a561594fba02e0384d054b198843336c5adcf838ff617f8 not found: ID does not exist" Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.648524 4790 scope.go:117] "RemoveContainer" containerID="0746c60c7eff69d7541a55c09ab3960b393d8fa7a9b1dcc2e06f992c712fb7dd" Apr 06 13:39:52 crc kubenswrapper[4790]: E0406 13:39:52.648906 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0746c60c7eff69d7541a55c09ab3960b393d8fa7a9b1dcc2e06f992c712fb7dd\": container with ID starting with 0746c60c7eff69d7541a55c09ab3960b393d8fa7a9b1dcc2e06f992c712fb7dd not found: ID does not exist" containerID="0746c60c7eff69d7541a55c09ab3960b393d8fa7a9b1dcc2e06f992c712fb7dd" Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.648983 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0746c60c7eff69d7541a55c09ab3960b393d8fa7a9b1dcc2e06f992c712fb7dd"} err="failed to get container status \"0746c60c7eff69d7541a55c09ab3960b393d8fa7a9b1dcc2e06f992c712fb7dd\": rpc error: code = NotFound desc = could not find container \"0746c60c7eff69d7541a55c09ab3960b393d8fa7a9b1dcc2e06f992c712fb7dd\": container with ID starting with 0746c60c7eff69d7541a55c09ab3960b393d8fa7a9b1dcc2e06f992c712fb7dd not found: ID does not exist" Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.674428 4790 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a74e336-4367-4e9a-ac78-f69716c33ae0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3a74e336-4367-4e9a-ac78-f69716c33ae0" (UID: "3a74e336-4367-4e9a-ac78-f69716c33ae0"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:39:52 crc kubenswrapper[4790]: I0406 13:39:52.748327 4790 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a74e336-4367-4e9a-ac78-f69716c33ae0-must-gather-output\") on node \"crc\" DevicePath \"\"" Apr 06 13:39:53 crc kubenswrapper[4790]: I0406 13:39:53.687411 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a74e336-4367-4e9a-ac78-f69716c33ae0" path="/var/lib/kubelet/pods/3a74e336-4367-4e9a-ac78-f69716c33ae0/volumes" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.143727 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591380-wpppm"] Apr 06 13:40:00 crc kubenswrapper[4790]: E0406 13:40:00.144913 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a74e336-4367-4e9a-ac78-f69716c33ae0" containerName="gather" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.144933 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a74e336-4367-4e9a-ac78-f69716c33ae0" containerName="gather" Apr 06 13:40:00 crc kubenswrapper[4790]: E0406 13:40:00.144952 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a82819-1618-4cd0-940a-39666ce35350" containerName="oc" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.144962 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a82819-1618-4cd0-940a-39666ce35350" containerName="oc" Apr 06 13:40:00 crc kubenswrapper[4790]: E0406 13:40:00.145032 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3a74e336-4367-4e9a-ac78-f69716c33ae0" containerName="copy" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.145044 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a74e336-4367-4e9a-ac78-f69716c33ae0" containerName="copy" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.145370 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a82819-1618-4cd0-940a-39666ce35350" containerName="oc" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.145404 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a74e336-4367-4e9a-ac78-f69716c33ae0" containerName="gather" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.145426 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a74e336-4367-4e9a-ac78-f69716c33ae0" containerName="copy" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.146493 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591380-wpppm" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.149402 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.149624 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.150095 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.155929 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591380-wpppm"] Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.205060 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcclc\" (UniqueName: 
\"kubernetes.io/projected/1571e9f1-49ec-4438-94a5-ffb24d70f4c1-kube-api-access-zcclc\") pod \"auto-csr-approver-29591380-wpppm\" (UID: \"1571e9f1-49ec-4438-94a5-ffb24d70f4c1\") " pod="openshift-infra/auto-csr-approver-29591380-wpppm" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.307142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcclc\" (UniqueName: \"kubernetes.io/projected/1571e9f1-49ec-4438-94a5-ffb24d70f4c1-kube-api-access-zcclc\") pod \"auto-csr-approver-29591380-wpppm\" (UID: \"1571e9f1-49ec-4438-94a5-ffb24d70f4c1\") " pod="openshift-infra/auto-csr-approver-29591380-wpppm" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.324261 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcclc\" (UniqueName: \"kubernetes.io/projected/1571e9f1-49ec-4438-94a5-ffb24d70f4c1-kube-api-access-zcclc\") pod \"auto-csr-approver-29591380-wpppm\" (UID: \"1571e9f1-49ec-4438-94a5-ffb24d70f4c1\") " pod="openshift-infra/auto-csr-approver-29591380-wpppm" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.471860 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591380-wpppm" Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.970798 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 13:40:00 crc kubenswrapper[4790]: I0406 13:40:00.976064 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591380-wpppm"] Apr 06 13:40:01 crc kubenswrapper[4790]: I0406 13:40:01.641520 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591380-wpppm" event={"ID":"1571e9f1-49ec-4438-94a5-ffb24d70f4c1","Type":"ContainerStarted","Data":"810e469a705e984f099420d02cec59bb7b2bcc2f1e1b53458f0d44a36ba13107"} Apr 06 13:40:02 crc kubenswrapper[4790]: I0406 13:40:02.651028 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591380-wpppm" event={"ID":"1571e9f1-49ec-4438-94a5-ffb24d70f4c1","Type":"ContainerStarted","Data":"ef194237e8b2aab1be9c94e4137c40610eef2501666a5ded2f9a8217cae99076"} Apr 06 13:40:02 crc kubenswrapper[4790]: I0406 13:40:02.668800 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591380-wpppm" podStartSLOduration=1.442086018 podStartE2EDuration="2.668781802s" podCreationTimestamp="2026-04-06 13:40:00 +0000 UTC" firstStartedPulling="2026-04-06 13:40:00.970380612 +0000 UTC m=+6179.958123488" lastFinishedPulling="2026-04-06 13:40:02.197076406 +0000 UTC m=+6181.184819272" observedRunningTime="2026-04-06 13:40:02.661665332 +0000 UTC m=+6181.649408198" watchObservedRunningTime="2026-04-06 13:40:02.668781802 +0000 UTC m=+6181.656524668" Apr 06 13:40:03 crc kubenswrapper[4790]: I0406 13:40:03.660630 4790 generic.go:334] "Generic (PLEG): container finished" podID="1571e9f1-49ec-4438-94a5-ffb24d70f4c1" containerID="ef194237e8b2aab1be9c94e4137c40610eef2501666a5ded2f9a8217cae99076" exitCode=0 Apr 06 13:40:03 crc 
kubenswrapper[4790]: I0406 13:40:03.660715 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591380-wpppm" event={"ID":"1571e9f1-49ec-4438-94a5-ffb24d70f4c1","Type":"ContainerDied","Data":"ef194237e8b2aab1be9c94e4137c40610eef2501666a5ded2f9a8217cae99076"}
Apr 06 13:40:03 crc kubenswrapper[4790]: I0406 13:40:03.676162 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"
Apr 06 13:40:03 crc kubenswrapper[4790]: E0406 13:40:03.676654 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:40:05 crc kubenswrapper[4790]: I0406 13:40:05.022551 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591380-wpppm"
Apr 06 13:40:05 crc kubenswrapper[4790]: I0406 13:40:05.108620 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcclc\" (UniqueName: \"kubernetes.io/projected/1571e9f1-49ec-4438-94a5-ffb24d70f4c1-kube-api-access-zcclc\") pod \"1571e9f1-49ec-4438-94a5-ffb24d70f4c1\" (UID: \"1571e9f1-49ec-4438-94a5-ffb24d70f4c1\") "
Apr 06 13:40:05 crc kubenswrapper[4790]: I0406 13:40:05.116041 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1571e9f1-49ec-4438-94a5-ffb24d70f4c1-kube-api-access-zcclc" (OuterVolumeSpecName: "kube-api-access-zcclc") pod "1571e9f1-49ec-4438-94a5-ffb24d70f4c1" (UID: "1571e9f1-49ec-4438-94a5-ffb24d70f4c1"). InnerVolumeSpecName "kube-api-access-zcclc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:40:05 crc kubenswrapper[4790]: I0406 13:40:05.211108 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcclc\" (UniqueName: \"kubernetes.io/projected/1571e9f1-49ec-4438-94a5-ffb24d70f4c1-kube-api-access-zcclc\") on node \"crc\" DevicePath \"\""
Apr 06 13:40:05 crc kubenswrapper[4790]: I0406 13:40:05.681531 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591380-wpppm"
Apr 06 13:40:05 crc kubenswrapper[4790]: I0406 13:40:05.700978 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591380-wpppm" event={"ID":"1571e9f1-49ec-4438-94a5-ffb24d70f4c1","Type":"ContainerDied","Data":"810e469a705e984f099420d02cec59bb7b2bcc2f1e1b53458f0d44a36ba13107"}
Apr 06 13:40:05 crc kubenswrapper[4790]: I0406 13:40:05.701122 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="810e469a705e984f099420d02cec59bb7b2bcc2f1e1b53458f0d44a36ba13107"
Apr 06 13:40:06 crc kubenswrapper[4790]: I0406 13:40:06.092551 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591374-25knm"]
Apr 06 13:40:06 crc kubenswrapper[4790]: I0406 13:40:06.106734 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591374-25knm"]
Apr 06 13:40:07 crc kubenswrapper[4790]: I0406 13:40:07.686048 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8916c697-9830-4c0d-9faf-5d5f9195d5b1" path="/var/lib/kubelet/pods/8916c697-9830-4c0d-9faf-5d5f9195d5b1/volumes"
Apr 06 13:40:17 crc kubenswrapper[4790]: I0406 13:40:17.676784 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"
Apr 06 13:40:17 crc kubenswrapper[4790]: E0406 13:40:17.677806 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:40:20 crc kubenswrapper[4790]: I0406 13:40:20.073867 4790 scope.go:117] "RemoveContainer" containerID="c2ebaf01f6c99e530c4383ca9297587ce3794532cc9cfac0346bb0276e1a0700"
Apr 06 13:40:20 crc kubenswrapper[4790]: I0406 13:40:20.098950 4790 scope.go:117] "RemoveContainer" containerID="95e9b1685219b458c1bb9d787a1aa88cda6a3816fa63a3fe26cdb21750f367d8"
Apr 06 13:40:20 crc kubenswrapper[4790]: I0406 13:40:20.181817 4790 scope.go:117] "RemoveContainer" containerID="07cddb31488b04e7875619ad5b2ebf5f67fd1da3b2750a89b7e8ed50a94444f5"
Apr 06 13:40:31 crc kubenswrapper[4790]: I0406 13:40:31.682498 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"
Apr 06 13:40:31 crc kubenswrapper[4790]: E0406 13:40:31.683451 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:40:46 crc kubenswrapper[4790]: I0406 13:40:46.675634 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"
Apr 06 13:40:46 crc kubenswrapper[4790]: E0406 13:40:46.676312 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:40:57 crc kubenswrapper[4790]: I0406 13:40:57.675687 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"
Apr 06 13:40:57 crc kubenswrapper[4790]: E0406 13:40:57.676399 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:41:11 crc kubenswrapper[4790]: I0406 13:41:11.691632 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"
Apr 06 13:41:11 crc kubenswrapper[4790]: E0406 13:41:11.693084 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:41:12 crc kubenswrapper[4790]: I0406 13:41:12.837732 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n9blz"]
Apr 06 13:41:12 crc kubenswrapper[4790]: E0406 13:41:12.838520 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1571e9f1-49ec-4438-94a5-ffb24d70f4c1" containerName="oc"
Apr 06 13:41:12 crc kubenswrapper[4790]: I0406 13:41:12.838532 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1571e9f1-49ec-4438-94a5-ffb24d70f4c1" containerName="oc"
Apr 06 13:41:12 crc kubenswrapper[4790]: I0406 13:41:12.838751 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1571e9f1-49ec-4438-94a5-ffb24d70f4c1" containerName="oc"
Apr 06 13:41:12 crc kubenswrapper[4790]: I0406 13:41:12.840233 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:12 crc kubenswrapper[4790]: I0406 13:41:12.864289 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9blz"]
Apr 06 13:41:12 crc kubenswrapper[4790]: I0406 13:41:12.871476 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gznzg\" (UniqueName: \"kubernetes.io/projected/b617d6ed-9b41-4c44-8866-258246af1211-kube-api-access-gznzg\") pod \"redhat-operators-n9blz\" (UID: \"b617d6ed-9b41-4c44-8866-258246af1211\") " pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:12 crc kubenswrapper[4790]: I0406 13:41:12.871577 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b617d6ed-9b41-4c44-8866-258246af1211-catalog-content\") pod \"redhat-operators-n9blz\" (UID: \"b617d6ed-9b41-4c44-8866-258246af1211\") " pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:12 crc kubenswrapper[4790]: I0406 13:41:12.871700 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b617d6ed-9b41-4c44-8866-258246af1211-utilities\") pod \"redhat-operators-n9blz\" (UID: \"b617d6ed-9b41-4c44-8866-258246af1211\") " pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:12 crc kubenswrapper[4790]: I0406 13:41:12.973540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b617d6ed-9b41-4c44-8866-258246af1211-utilities\") pod \"redhat-operators-n9blz\" (UID: \"b617d6ed-9b41-4c44-8866-258246af1211\") " pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:12 crc kubenswrapper[4790]: I0406 13:41:12.973662 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gznzg\" (UniqueName: \"kubernetes.io/projected/b617d6ed-9b41-4c44-8866-258246af1211-kube-api-access-gznzg\") pod \"redhat-operators-n9blz\" (UID: \"b617d6ed-9b41-4c44-8866-258246af1211\") " pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:12 crc kubenswrapper[4790]: I0406 13:41:12.973712 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b617d6ed-9b41-4c44-8866-258246af1211-catalog-content\") pod \"redhat-operators-n9blz\" (UID: \"b617d6ed-9b41-4c44-8866-258246af1211\") " pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:12 crc kubenswrapper[4790]: I0406 13:41:12.974143 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b617d6ed-9b41-4c44-8866-258246af1211-catalog-content\") pod \"redhat-operators-n9blz\" (UID: \"b617d6ed-9b41-4c44-8866-258246af1211\") " pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:12 crc kubenswrapper[4790]: I0406 13:41:12.974142 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b617d6ed-9b41-4c44-8866-258246af1211-utilities\") pod \"redhat-operators-n9blz\" (UID: \"b617d6ed-9b41-4c44-8866-258246af1211\") " pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:12 crc kubenswrapper[4790]: I0406 13:41:12.998415 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gznzg\" (UniqueName: \"kubernetes.io/projected/b617d6ed-9b41-4c44-8866-258246af1211-kube-api-access-gznzg\") pod \"redhat-operators-n9blz\" (UID: \"b617d6ed-9b41-4c44-8866-258246af1211\") " pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:13 crc kubenswrapper[4790]: I0406 13:41:13.165311 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:13 crc kubenswrapper[4790]: I0406 13:41:13.644065 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9blz"]
Apr 06 13:41:14 crc kubenswrapper[4790]: I0406 13:41:14.359132 4790 generic.go:334] "Generic (PLEG): container finished" podID="b617d6ed-9b41-4c44-8866-258246af1211" containerID="97bb740de4b8777e044991d719d2a45371b011ba7f6e39d5a3ddc9306e5fbb14" exitCode=0
Apr 06 13:41:14 crc kubenswrapper[4790]: I0406 13:41:14.359179 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9blz" event={"ID":"b617d6ed-9b41-4c44-8866-258246af1211","Type":"ContainerDied","Data":"97bb740de4b8777e044991d719d2a45371b011ba7f6e39d5a3ddc9306e5fbb14"}
Apr 06 13:41:14 crc kubenswrapper[4790]: I0406 13:41:14.359226 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9blz" event={"ID":"b617d6ed-9b41-4c44-8866-258246af1211","Type":"ContainerStarted","Data":"8dcdfe6c596cce0a3ec6d479dee4151714809c1e03f611b3c0f873d2ad461b3e"}
Apr 06 13:41:16 crc kubenswrapper[4790]: I0406 13:41:16.381472 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9blz" event={"ID":"b617d6ed-9b41-4c44-8866-258246af1211","Type":"ContainerStarted","Data":"9b4badef590fa5bc91c29e68cfa88e8e9eeb01310d400531b283bba90d1e50aa"}
Apr 06 13:41:20 crc kubenswrapper[4790]: I0406 13:41:20.420631 4790 generic.go:334] "Generic (PLEG): container finished" podID="b617d6ed-9b41-4c44-8866-258246af1211" containerID="9b4badef590fa5bc91c29e68cfa88e8e9eeb01310d400531b283bba90d1e50aa" exitCode=0
Apr 06 13:41:20 crc kubenswrapper[4790]: I0406 13:41:20.421167 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9blz" event={"ID":"b617d6ed-9b41-4c44-8866-258246af1211","Type":"ContainerDied","Data":"9b4badef590fa5bc91c29e68cfa88e8e9eeb01310d400531b283bba90d1e50aa"}
Apr 06 13:41:22 crc kubenswrapper[4790]: I0406 13:41:22.451557 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9blz" event={"ID":"b617d6ed-9b41-4c44-8866-258246af1211","Type":"ContainerStarted","Data":"13a2513a598e1d28a129ed9575f21ad749d11357833b2685b60d92a6669c68d0"}
Apr 06 13:41:22 crc kubenswrapper[4790]: I0406 13:41:22.475282 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n9blz" podStartSLOduration=3.583909875 podStartE2EDuration="10.475263181s" podCreationTimestamp="2026-04-06 13:41:12 +0000 UTC" firstStartedPulling="2026-04-06 13:41:14.367062953 +0000 UTC m=+6253.354805819" lastFinishedPulling="2026-04-06 13:41:21.258416239 +0000 UTC m=+6260.246159125" observedRunningTime="2026-04-06 13:41:22.474228973 +0000 UTC m=+6261.461971859" watchObservedRunningTime="2026-04-06 13:41:22.475263181 +0000 UTC m=+6261.463006047"
Apr 06 13:41:23 crc kubenswrapper[4790]: I0406 13:41:23.165584 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:23 crc kubenswrapper[4790]: I0406 13:41:23.165942 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:24 crc kubenswrapper[4790]: I0406 13:41:24.222484 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n9blz" podUID="b617d6ed-9b41-4c44-8866-258246af1211" containerName="registry-server" probeResult="failure" output=<
Apr 06 13:41:24 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s
Apr 06 13:41:24 crc kubenswrapper[4790]: >
Apr 06 13:41:26 crc kubenswrapper[4790]: I0406 13:41:26.675670 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"
Apr 06 13:41:26 crc kubenswrapper[4790]: E0406 13:41:26.676341 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:41:33 crc kubenswrapper[4790]: I0406 13:41:33.237841 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:33 crc kubenswrapper[4790]: I0406 13:41:33.295454 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:33 crc kubenswrapper[4790]: I0406 13:41:33.484430 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n9blz"]
Apr 06 13:41:34 crc kubenswrapper[4790]: I0406 13:41:34.559557 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n9blz" podUID="b617d6ed-9b41-4c44-8866-258246af1211" containerName="registry-server" containerID="cri-o://13a2513a598e1d28a129ed9575f21ad749d11357833b2685b60d92a6669c68d0" gracePeriod=2
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.044287 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.148554 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b617d6ed-9b41-4c44-8866-258246af1211-utilities\") pod \"b617d6ed-9b41-4c44-8866-258246af1211\" (UID: \"b617d6ed-9b41-4c44-8866-258246af1211\") "
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.148775 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gznzg\" (UniqueName: \"kubernetes.io/projected/b617d6ed-9b41-4c44-8866-258246af1211-kube-api-access-gznzg\") pod \"b617d6ed-9b41-4c44-8866-258246af1211\" (UID: \"b617d6ed-9b41-4c44-8866-258246af1211\") "
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.148803 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b617d6ed-9b41-4c44-8866-258246af1211-catalog-content\") pod \"b617d6ed-9b41-4c44-8866-258246af1211\" (UID: \"b617d6ed-9b41-4c44-8866-258246af1211\") "
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.149748 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b617d6ed-9b41-4c44-8866-258246af1211-utilities" (OuterVolumeSpecName: "utilities") pod "b617d6ed-9b41-4c44-8866-258246af1211" (UID: "b617d6ed-9b41-4c44-8866-258246af1211"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.159126 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b617d6ed-9b41-4c44-8866-258246af1211-kube-api-access-gznzg" (OuterVolumeSpecName: "kube-api-access-gznzg") pod "b617d6ed-9b41-4c44-8866-258246af1211" (UID: "b617d6ed-9b41-4c44-8866-258246af1211"). InnerVolumeSpecName "kube-api-access-gznzg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.251157 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gznzg\" (UniqueName: \"kubernetes.io/projected/b617d6ed-9b41-4c44-8866-258246af1211-kube-api-access-gznzg\") on node \"crc\" DevicePath \"\""
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.251197 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b617d6ed-9b41-4c44-8866-258246af1211-utilities\") on node \"crc\" DevicePath \"\""
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.312166 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b617d6ed-9b41-4c44-8866-258246af1211-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b617d6ed-9b41-4c44-8866-258246af1211" (UID: "b617d6ed-9b41-4c44-8866-258246af1211"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.352801 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b617d6ed-9b41-4c44-8866-258246af1211-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.573590 4790 generic.go:334] "Generic (PLEG): container finished" podID="b617d6ed-9b41-4c44-8866-258246af1211" containerID="13a2513a598e1d28a129ed9575f21ad749d11357833b2685b60d92a6669c68d0" exitCode=0
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.573673 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9blz"
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.573671 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9blz" event={"ID":"b617d6ed-9b41-4c44-8866-258246af1211","Type":"ContainerDied","Data":"13a2513a598e1d28a129ed9575f21ad749d11357833b2685b60d92a6669c68d0"}
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.573846 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9blz" event={"ID":"b617d6ed-9b41-4c44-8866-258246af1211","Type":"ContainerDied","Data":"8dcdfe6c596cce0a3ec6d479dee4151714809c1e03f611b3c0f873d2ad461b3e"}
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.573861 4790 scope.go:117] "RemoveContainer" containerID="13a2513a598e1d28a129ed9575f21ad749d11357833b2685b60d92a6669c68d0"
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.601847 4790 scope.go:117] "RemoveContainer" containerID="9b4badef590fa5bc91c29e68cfa88e8e9eeb01310d400531b283bba90d1e50aa"
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.637690 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n9blz"]
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.648318 4790 scope.go:117] "RemoveContainer" containerID="97bb740de4b8777e044991d719d2a45371b011ba7f6e39d5a3ddc9306e5fbb14"
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.651927 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n9blz"]
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.694271 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b617d6ed-9b41-4c44-8866-258246af1211" path="/var/lib/kubelet/pods/b617d6ed-9b41-4c44-8866-258246af1211/volumes"
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.698006 4790 scope.go:117] "RemoveContainer" containerID="13a2513a598e1d28a129ed9575f21ad749d11357833b2685b60d92a6669c68d0"
Apr 06 13:41:35 crc kubenswrapper[4790]: E0406 13:41:35.698644 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a2513a598e1d28a129ed9575f21ad749d11357833b2685b60d92a6669c68d0\": container with ID starting with 13a2513a598e1d28a129ed9575f21ad749d11357833b2685b60d92a6669c68d0 not found: ID does not exist" containerID="13a2513a598e1d28a129ed9575f21ad749d11357833b2685b60d92a6669c68d0"
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.698691 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a2513a598e1d28a129ed9575f21ad749d11357833b2685b60d92a6669c68d0"} err="failed to get container status \"13a2513a598e1d28a129ed9575f21ad749d11357833b2685b60d92a6669c68d0\": rpc error: code = NotFound desc = could not find container \"13a2513a598e1d28a129ed9575f21ad749d11357833b2685b60d92a6669c68d0\": container with ID starting with 13a2513a598e1d28a129ed9575f21ad749d11357833b2685b60d92a6669c68d0 not found: ID does not exist"
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.698723 4790 scope.go:117] "RemoveContainer" containerID="9b4badef590fa5bc91c29e68cfa88e8e9eeb01310d400531b283bba90d1e50aa"
Apr 06 13:41:35 crc kubenswrapper[4790]: E0406 13:41:35.699054 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b4badef590fa5bc91c29e68cfa88e8e9eeb01310d400531b283bba90d1e50aa\": container with ID starting with 9b4badef590fa5bc91c29e68cfa88e8e9eeb01310d400531b283bba90d1e50aa not found: ID does not exist" containerID="9b4badef590fa5bc91c29e68cfa88e8e9eeb01310d400531b283bba90d1e50aa"
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.699228 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b4badef590fa5bc91c29e68cfa88e8e9eeb01310d400531b283bba90d1e50aa"} err="failed to get container status \"9b4badef590fa5bc91c29e68cfa88e8e9eeb01310d400531b283bba90d1e50aa\": rpc error: code = NotFound desc = could not find container \"9b4badef590fa5bc91c29e68cfa88e8e9eeb01310d400531b283bba90d1e50aa\": container with ID starting with 9b4badef590fa5bc91c29e68cfa88e8e9eeb01310d400531b283bba90d1e50aa not found: ID does not exist"
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.699258 4790 scope.go:117] "RemoveContainer" containerID="97bb740de4b8777e044991d719d2a45371b011ba7f6e39d5a3ddc9306e5fbb14"
Apr 06 13:41:35 crc kubenswrapper[4790]: E0406 13:41:35.699538 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97bb740de4b8777e044991d719d2a45371b011ba7f6e39d5a3ddc9306e5fbb14\": container with ID starting with 97bb740de4b8777e044991d719d2a45371b011ba7f6e39d5a3ddc9306e5fbb14 not found: ID does not exist" containerID="97bb740de4b8777e044991d719d2a45371b011ba7f6e39d5a3ddc9306e5fbb14"
Apr 06 13:41:35 crc kubenswrapper[4790]: I0406 13:41:35.699569 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97bb740de4b8777e044991d719d2a45371b011ba7f6e39d5a3ddc9306e5fbb14"} err="failed to get container status \"97bb740de4b8777e044991d719d2a45371b011ba7f6e39d5a3ddc9306e5fbb14\": rpc error: code = NotFound desc = could not find container \"97bb740de4b8777e044991d719d2a45371b011ba7f6e39d5a3ddc9306e5fbb14\": container with ID starting with 97bb740de4b8777e044991d719d2a45371b011ba7f6e39d5a3ddc9306e5fbb14 not found: ID does not exist"
Apr 06 13:41:37 crc kubenswrapper[4790]: I0406 13:41:37.676047 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"
Apr 06 13:41:37 crc kubenswrapper[4790]: E0406 13:41:37.676598 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:41:48 crc kubenswrapper[4790]: I0406 13:41:48.675258 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"
Apr 06 13:41:48 crc kubenswrapper[4790]: E0406 13:41:48.676951 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.149718 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591382-kmgwd"]
Apr 06 13:42:00 crc kubenswrapper[4790]: E0406 13:42:00.151068 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b617d6ed-9b41-4c44-8866-258246af1211" containerName="extract-utilities"
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.151092 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b617d6ed-9b41-4c44-8866-258246af1211" containerName="extract-utilities"
Apr 06 13:42:00 crc kubenswrapper[4790]: E0406 13:42:00.151118 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b617d6ed-9b41-4c44-8866-258246af1211" containerName="extract-content"
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.151130 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b617d6ed-9b41-4c44-8866-258246af1211" containerName="extract-content"
Apr 06 13:42:00 crc kubenswrapper[4790]: E0406 13:42:00.151164 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b617d6ed-9b41-4c44-8866-258246af1211" containerName="registry-server"
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.151178 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b617d6ed-9b41-4c44-8866-258246af1211" containerName="registry-server"
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.151544 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b617d6ed-9b41-4c44-8866-258246af1211" containerName="registry-server"
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.152446 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591382-kmgwd"
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.156045 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.156140 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.156775 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6"
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.161177 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591382-kmgwd"]
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.283976 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-927vj\" (UniqueName: \"kubernetes.io/projected/c89e294a-9374-4837-8564-b56032dbc423-kube-api-access-927vj\") pod \"auto-csr-approver-29591382-kmgwd\" (UID: \"c89e294a-9374-4837-8564-b56032dbc423\") " pod="openshift-infra/auto-csr-approver-29591382-kmgwd"
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.385421 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-927vj\" (UniqueName: \"kubernetes.io/projected/c89e294a-9374-4837-8564-b56032dbc423-kube-api-access-927vj\") pod \"auto-csr-approver-29591382-kmgwd\" (UID: \"c89e294a-9374-4837-8564-b56032dbc423\") " pod="openshift-infra/auto-csr-approver-29591382-kmgwd"
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.410444 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-927vj\" (UniqueName: \"kubernetes.io/projected/c89e294a-9374-4837-8564-b56032dbc423-kube-api-access-927vj\") pod \"auto-csr-approver-29591382-kmgwd\" (UID: \"c89e294a-9374-4837-8564-b56032dbc423\") " pod="openshift-infra/auto-csr-approver-29591382-kmgwd"
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.485014 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591382-kmgwd"
Apr 06 13:42:00 crc kubenswrapper[4790]: I0406 13:42:00.954945 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591382-kmgwd"]
Apr 06 13:42:01 crc kubenswrapper[4790]: I0406 13:42:01.693210 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"
Apr 06 13:42:01 crc kubenswrapper[4790]: E0406 13:42:01.694229 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:42:01 crc kubenswrapper[4790]: I0406 13:42:01.861133 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591382-kmgwd" event={"ID":"c89e294a-9374-4837-8564-b56032dbc423","Type":"ContainerStarted","Data":"3de6e1d12bf1786930f9845ff0e65cadb6892dfdebe683b34af9a6cc0da99232"}
Apr 06 13:42:02 crc kubenswrapper[4790]: I0406 13:42:02.877308 4790 generic.go:334] "Generic (PLEG): container finished" podID="c89e294a-9374-4837-8564-b56032dbc423" containerID="2d3d22a27ecbefc7841340e86e2004e5f3e03f681a134bdbb8cd4ffa11448399" exitCode=0
Apr 06 13:42:02 crc kubenswrapper[4790]: I0406 13:42:02.877381 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591382-kmgwd" event={"ID":"c89e294a-9374-4837-8564-b56032dbc423","Type":"ContainerDied","Data":"2d3d22a27ecbefc7841340e86e2004e5f3e03f681a134bdbb8cd4ffa11448399"}
Apr 06 13:42:04 crc kubenswrapper[4790]: I0406 13:42:04.224105 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591382-kmgwd"
Apr 06 13:42:04 crc kubenswrapper[4790]: I0406 13:42:04.398506 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-927vj\" (UniqueName: \"kubernetes.io/projected/c89e294a-9374-4837-8564-b56032dbc423-kube-api-access-927vj\") pod \"c89e294a-9374-4837-8564-b56032dbc423\" (UID: \"c89e294a-9374-4837-8564-b56032dbc423\") "
Apr 06 13:42:04 crc kubenswrapper[4790]: I0406 13:42:04.407712 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c89e294a-9374-4837-8564-b56032dbc423-kube-api-access-927vj" (OuterVolumeSpecName: "kube-api-access-927vj") pod "c89e294a-9374-4837-8564-b56032dbc423" (UID: "c89e294a-9374-4837-8564-b56032dbc423"). InnerVolumeSpecName "kube-api-access-927vj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:42:04 crc kubenswrapper[4790]: I0406 13:42:04.501252 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-927vj\" (UniqueName: \"kubernetes.io/projected/c89e294a-9374-4837-8564-b56032dbc423-kube-api-access-927vj\") on node \"crc\" DevicePath \"\""
Apr 06 13:42:04 crc kubenswrapper[4790]: I0406 13:42:04.898406 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591382-kmgwd" event={"ID":"c89e294a-9374-4837-8564-b56032dbc423","Type":"ContainerDied","Data":"3de6e1d12bf1786930f9845ff0e65cadb6892dfdebe683b34af9a6cc0da99232"}
Apr 06 13:42:04 crc kubenswrapper[4790]: I0406 13:42:04.898449 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3de6e1d12bf1786930f9845ff0e65cadb6892dfdebe683b34af9a6cc0da99232"
Apr 06 13:42:04 crc kubenswrapper[4790]: I0406 13:42:04.898773 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591382-kmgwd"
Apr 06 13:42:05 crc kubenswrapper[4790]: I0406 13:42:05.294911 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591376-q22pj"]
Apr 06 13:42:05 crc kubenswrapper[4790]: I0406 13:42:05.305025 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591376-q22pj"]
Apr 06 13:42:05 crc kubenswrapper[4790]: I0406 13:42:05.687808 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b1aab1-61f0-40d2-bb49-c9bb1eea5e57" path="/var/lib/kubelet/pods/78b1aab1-61f0-40d2-bb49-c9bb1eea5e57/volumes"
Apr 06 13:42:15 crc kubenswrapper[4790]: I0406 13:42:15.676200 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"
Apr 06 13:42:15 crc kubenswrapper[4790]: E0406 13:42:15.677363 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:42:20 crc kubenswrapper[4790]: I0406 13:42:20.330644 4790 scope.go:117] "RemoveContainer" containerID="9a1e93103f5b59a1cd5ecc606e7b5a17aa8449daa462d53bdb2f54ae8e227fdb"
Apr 06 13:42:29 crc kubenswrapper[4790]: I0406 13:42:29.676541 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"
Apr 06 13:42:29 crc kubenswrapper[4790]: E0406 13:42:29.677063 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:42:44 crc kubenswrapper[4790]: I0406 13:42:44.676322 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149"
Apr 06 13:42:45 crc kubenswrapper[4790]: I0406 13:42:45.319277 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"44b6331698ad33af7a840e787e9e1d956dcb0f0acb6b4f4ab0ec2af758b872a1"}
Apr 06 13:43:19 crc kubenswrapper[4790]: I0406 13:43:19.999160 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m5z6z/must-gather-q8hkq"]
Apr 06 13:43:20 crc kubenswrapper[4790]: E0406 13:43:20.000006 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c89e294a-9374-4837-8564-b56032dbc423" containerName="oc"
Apr 06 13:43:20 crc kubenswrapper[4790]: I0406 13:43:20.000017 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c89e294a-9374-4837-8564-b56032dbc423" containerName="oc"
Apr 06 13:43:20 crc kubenswrapper[4790]: I0406 13:43:20.000198 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c89e294a-9374-4837-8564-b56032dbc423" containerName="oc"
Apr 06 13:43:20 crc kubenswrapper[4790]: I0406 13:43:20.001192 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m5z6z/must-gather-q8hkq"
Apr 06 13:43:20 crc kubenswrapper[4790]: I0406 13:43:20.005031 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-m5z6z"/"default-dockercfg-ldk4d"
Apr 06 13:43:20 crc kubenswrapper[4790]: I0406 13:43:20.005555 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-m5z6z"/"openshift-service-ca.crt"
Apr 06 13:43:20 crc kubenswrapper[4790]: I0406 13:43:20.005741 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-m5z6z"/"kube-root-ca.crt"
Apr 06 13:43:20 crc kubenswrapper[4790]: I0406 13:43:20.031854 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m5z6z/must-gather-q8hkq"]
Apr 06 13:43:20 crc kubenswrapper[4790]: I0406 13:43:20.160284 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dcd99e98-f90d-4c58-bb22-35f75eaaf6bc-must-gather-output\") pod \"must-gather-q8hkq\" (UID: \"dcd99e98-f90d-4c58-bb22-35f75eaaf6bc\") " pod="openshift-must-gather-m5z6z/must-gather-q8hkq"
Apr 06 13:43:20 crc kubenswrapper[4790]: I0406 13:43:20.160403 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4grv\" 
(UniqueName: \"kubernetes.io/projected/dcd99e98-f90d-4c58-bb22-35f75eaaf6bc-kube-api-access-c4grv\") pod \"must-gather-q8hkq\" (UID: \"dcd99e98-f90d-4c58-bb22-35f75eaaf6bc\") " pod="openshift-must-gather-m5z6z/must-gather-q8hkq" Apr 06 13:43:20 crc kubenswrapper[4790]: I0406 13:43:20.265463 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4grv\" (UniqueName: \"kubernetes.io/projected/dcd99e98-f90d-4c58-bb22-35f75eaaf6bc-kube-api-access-c4grv\") pod \"must-gather-q8hkq\" (UID: \"dcd99e98-f90d-4c58-bb22-35f75eaaf6bc\") " pod="openshift-must-gather-m5z6z/must-gather-q8hkq" Apr 06 13:43:20 crc kubenswrapper[4790]: I0406 13:43:20.265602 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dcd99e98-f90d-4c58-bb22-35f75eaaf6bc-must-gather-output\") pod \"must-gather-q8hkq\" (UID: \"dcd99e98-f90d-4c58-bb22-35f75eaaf6bc\") " pod="openshift-must-gather-m5z6z/must-gather-q8hkq" Apr 06 13:43:20 crc kubenswrapper[4790]: I0406 13:43:20.266070 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dcd99e98-f90d-4c58-bb22-35f75eaaf6bc-must-gather-output\") pod \"must-gather-q8hkq\" (UID: \"dcd99e98-f90d-4c58-bb22-35f75eaaf6bc\") " pod="openshift-must-gather-m5z6z/must-gather-q8hkq" Apr 06 13:43:20 crc kubenswrapper[4790]: I0406 13:43:20.343573 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4grv\" (UniqueName: \"kubernetes.io/projected/dcd99e98-f90d-4c58-bb22-35f75eaaf6bc-kube-api-access-c4grv\") pod \"must-gather-q8hkq\" (UID: \"dcd99e98-f90d-4c58-bb22-35f75eaaf6bc\") " pod="openshift-must-gather-m5z6z/must-gather-q8hkq" Apr 06 13:43:20 crc kubenswrapper[4790]: I0406 13:43:20.619518 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m5z6z/must-gather-q8hkq" Apr 06 13:43:21 crc kubenswrapper[4790]: I0406 13:43:21.102325 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m5z6z/must-gather-q8hkq"] Apr 06 13:43:21 crc kubenswrapper[4790]: I0406 13:43:21.673143 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m5z6z/must-gather-q8hkq" event={"ID":"dcd99e98-f90d-4c58-bb22-35f75eaaf6bc","Type":"ContainerStarted","Data":"e4cf7b0ecbd86cb39f6dbb05ced9f4bb12c78ee498ada4414e3ba7052bea2c3b"} Apr 06 13:43:21 crc kubenswrapper[4790]: I0406 13:43:21.673548 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m5z6z/must-gather-q8hkq" event={"ID":"dcd99e98-f90d-4c58-bb22-35f75eaaf6bc","Type":"ContainerStarted","Data":"881cc72d4f67b6f9be6fa27b6838367082f99d3d80111ae0bdcacd1ce31314d0"} Apr 06 13:43:21 crc kubenswrapper[4790]: I0406 13:43:21.673567 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m5z6z/must-gather-q8hkq" event={"ID":"dcd99e98-f90d-4c58-bb22-35f75eaaf6bc","Type":"ContainerStarted","Data":"0cdc6c321587e29b9890e5cfc680b3a95a0b38c33ac84f7922c6ef2f6a7c4c4f"} Apr 06 13:43:21 crc kubenswrapper[4790]: I0406 13:43:21.693749 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-m5z6z/must-gather-q8hkq" podStartSLOduration=2.693733623 podStartE2EDuration="2.693733623s" podCreationTimestamp="2026-04-06 13:43:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 13:43:21.688182555 +0000 UTC m=+6380.675925451" watchObservedRunningTime="2026-04-06 13:43:21.693733623 +0000 UTC m=+6380.681476489" Apr 06 13:43:25 crc kubenswrapper[4790]: I0406 13:43:25.496441 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m5z6z/crc-debug-dp94d"] Apr 06 13:43:25 crc kubenswrapper[4790]: 
I0406 13:43:25.498715 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m5z6z/crc-debug-dp94d" Apr 06 13:43:25 crc kubenswrapper[4790]: I0406 13:43:25.579648 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c2cd420-059d-4dd6-864d-ed13e6715244-host\") pod \"crc-debug-dp94d\" (UID: \"9c2cd420-059d-4dd6-864d-ed13e6715244\") " pod="openshift-must-gather-m5z6z/crc-debug-dp94d" Apr 06 13:43:25 crc kubenswrapper[4790]: I0406 13:43:25.579801 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kwhp\" (UniqueName: \"kubernetes.io/projected/9c2cd420-059d-4dd6-864d-ed13e6715244-kube-api-access-5kwhp\") pod \"crc-debug-dp94d\" (UID: \"9c2cd420-059d-4dd6-864d-ed13e6715244\") " pod="openshift-must-gather-m5z6z/crc-debug-dp94d" Apr 06 13:43:25 crc kubenswrapper[4790]: I0406 13:43:25.681297 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kwhp\" (UniqueName: \"kubernetes.io/projected/9c2cd420-059d-4dd6-864d-ed13e6715244-kube-api-access-5kwhp\") pod \"crc-debug-dp94d\" (UID: \"9c2cd420-059d-4dd6-864d-ed13e6715244\") " pod="openshift-must-gather-m5z6z/crc-debug-dp94d" Apr 06 13:43:25 crc kubenswrapper[4790]: I0406 13:43:25.681460 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c2cd420-059d-4dd6-864d-ed13e6715244-host\") pod \"crc-debug-dp94d\" (UID: \"9c2cd420-059d-4dd6-864d-ed13e6715244\") " pod="openshift-must-gather-m5z6z/crc-debug-dp94d" Apr 06 13:43:25 crc kubenswrapper[4790]: I0406 13:43:25.681594 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c2cd420-059d-4dd6-864d-ed13e6715244-host\") pod \"crc-debug-dp94d\" (UID: \"9c2cd420-059d-4dd6-864d-ed13e6715244\") 
" pod="openshift-must-gather-m5z6z/crc-debug-dp94d" Apr 06 13:43:25 crc kubenswrapper[4790]: I0406 13:43:25.706698 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kwhp\" (UniqueName: \"kubernetes.io/projected/9c2cd420-059d-4dd6-864d-ed13e6715244-kube-api-access-5kwhp\") pod \"crc-debug-dp94d\" (UID: \"9c2cd420-059d-4dd6-864d-ed13e6715244\") " pod="openshift-must-gather-m5z6z/crc-debug-dp94d" Apr 06 13:43:25 crc kubenswrapper[4790]: I0406 13:43:25.815222 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m5z6z/crc-debug-dp94d" Apr 06 13:43:25 crc kubenswrapper[4790]: W0406 13:43:25.855447 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c2cd420_059d_4dd6_864d_ed13e6715244.slice/crio-8bc122b1b9d2bc9b0c2c0f5b7d4cc0c6731d0c531be784598ca3b44cc4995609 WatchSource:0}: Error finding container 8bc122b1b9d2bc9b0c2c0f5b7d4cc0c6731d0c531be784598ca3b44cc4995609: Status 404 returned error can't find the container with id 8bc122b1b9d2bc9b0c2c0f5b7d4cc0c6731d0c531be784598ca3b44cc4995609 Apr 06 13:43:26 crc kubenswrapper[4790]: I0406 13:43:26.720923 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m5z6z/crc-debug-dp94d" event={"ID":"9c2cd420-059d-4dd6-864d-ed13e6715244","Type":"ContainerStarted","Data":"0a6215a2722cec7fae0bd2cc36646cdca8d6147bc7027d5e6f9a92338dfafbb8"} Apr 06 13:43:26 crc kubenswrapper[4790]: I0406 13:43:26.721466 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m5z6z/crc-debug-dp94d" event={"ID":"9c2cd420-059d-4dd6-864d-ed13e6715244","Type":"ContainerStarted","Data":"8bc122b1b9d2bc9b0c2c0f5b7d4cc0c6731d0c531be784598ca3b44cc4995609"} Apr 06 13:43:26 crc kubenswrapper[4790]: I0406 13:43:26.740760 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-m5z6z/crc-debug-dp94d" 
podStartSLOduration=1.7407424379999998 podStartE2EDuration="1.740742438s" podCreationTimestamp="2026-04-06 13:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-06 13:43:26.735043945 +0000 UTC m=+6385.722786811" watchObservedRunningTime="2026-04-06 13:43:26.740742438 +0000 UTC m=+6385.728485294" Apr 06 13:44:00 crc kubenswrapper[4790]: I0406 13:44:00.141153 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591384-x8m2t"] Apr 06 13:44:00 crc kubenswrapper[4790]: I0406 13:44:00.142810 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591384-x8m2t" Apr 06 13:44:00 crc kubenswrapper[4790]: I0406 13:44:00.150200 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:44:00 crc kubenswrapper[4790]: I0406 13:44:00.150218 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:44:00 crc kubenswrapper[4790]: I0406 13:44:00.150320 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:44:00 crc kubenswrapper[4790]: I0406 13:44:00.156676 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591384-x8m2t"] Apr 06 13:44:00 crc kubenswrapper[4790]: I0406 13:44:00.227640 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrwsr\" (UniqueName: \"kubernetes.io/projected/25452c01-bc42-4b7a-82c4-ea776612c95d-kube-api-access-hrwsr\") pod \"auto-csr-approver-29591384-x8m2t\" (UID: \"25452c01-bc42-4b7a-82c4-ea776612c95d\") " pod="openshift-infra/auto-csr-approver-29591384-x8m2t" Apr 06 13:44:00 crc kubenswrapper[4790]: I0406 13:44:00.329674 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hrwsr\" (UniqueName: \"kubernetes.io/projected/25452c01-bc42-4b7a-82c4-ea776612c95d-kube-api-access-hrwsr\") pod \"auto-csr-approver-29591384-x8m2t\" (UID: \"25452c01-bc42-4b7a-82c4-ea776612c95d\") " pod="openshift-infra/auto-csr-approver-29591384-x8m2t" Apr 06 13:44:00 crc kubenswrapper[4790]: I0406 13:44:00.348286 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrwsr\" (UniqueName: \"kubernetes.io/projected/25452c01-bc42-4b7a-82c4-ea776612c95d-kube-api-access-hrwsr\") pod \"auto-csr-approver-29591384-x8m2t\" (UID: \"25452c01-bc42-4b7a-82c4-ea776612c95d\") " pod="openshift-infra/auto-csr-approver-29591384-x8m2t" Apr 06 13:44:00 crc kubenswrapper[4790]: I0406 13:44:00.474318 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591384-x8m2t" Apr 06 13:44:01 crc kubenswrapper[4790]: I0406 13:44:01.113536 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591384-x8m2t"] Apr 06 13:44:02 crc kubenswrapper[4790]: I0406 13:44:02.046928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591384-x8m2t" event={"ID":"25452c01-bc42-4b7a-82c4-ea776612c95d","Type":"ContainerStarted","Data":"b850bbf54ac609fa2812ed6ce645e69bf89b86d29464ab87215dfc0596ff172e"} Apr 06 13:44:03 crc kubenswrapper[4790]: I0406 13:44:03.056969 4790 generic.go:334] "Generic (PLEG): container finished" podID="25452c01-bc42-4b7a-82c4-ea776612c95d" containerID="3289cc6977c7de004358d579c1512438c72597bdc9452be6849f9c6c0de5daeb" exitCode=0 Apr 06 13:44:03 crc kubenswrapper[4790]: I0406 13:44:03.057058 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591384-x8m2t" 
event={"ID":"25452c01-bc42-4b7a-82c4-ea776612c95d","Type":"ContainerDied","Data":"3289cc6977c7de004358d579c1512438c72597bdc9452be6849f9c6c0de5daeb"} Apr 06 13:44:04 crc kubenswrapper[4790]: I0406 13:44:04.501142 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591384-x8m2t" Apr 06 13:44:04 crc kubenswrapper[4790]: I0406 13:44:04.584112 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrwsr\" (UniqueName: \"kubernetes.io/projected/25452c01-bc42-4b7a-82c4-ea776612c95d-kube-api-access-hrwsr\") pod \"25452c01-bc42-4b7a-82c4-ea776612c95d\" (UID: \"25452c01-bc42-4b7a-82c4-ea776612c95d\") " Apr 06 13:44:04 crc kubenswrapper[4790]: I0406 13:44:04.590616 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25452c01-bc42-4b7a-82c4-ea776612c95d-kube-api-access-hrwsr" (OuterVolumeSpecName: "kube-api-access-hrwsr") pod "25452c01-bc42-4b7a-82c4-ea776612c95d" (UID: "25452c01-bc42-4b7a-82c4-ea776612c95d"). InnerVolumeSpecName "kube-api-access-hrwsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:44:04 crc kubenswrapper[4790]: I0406 13:44:04.686891 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrwsr\" (UniqueName: \"kubernetes.io/projected/25452c01-bc42-4b7a-82c4-ea776612c95d-kube-api-access-hrwsr\") on node \"crc\" DevicePath \"\"" Apr 06 13:44:05 crc kubenswrapper[4790]: I0406 13:44:05.076770 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591384-x8m2t" event={"ID":"25452c01-bc42-4b7a-82c4-ea776612c95d","Type":"ContainerDied","Data":"b850bbf54ac609fa2812ed6ce645e69bf89b86d29464ab87215dfc0596ff172e"} Apr 06 13:44:05 crc kubenswrapper[4790]: I0406 13:44:05.077102 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b850bbf54ac609fa2812ed6ce645e69bf89b86d29464ab87215dfc0596ff172e" Apr 06 13:44:05 crc kubenswrapper[4790]: I0406 13:44:05.076811 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591384-x8m2t" Apr 06 13:44:05 crc kubenswrapper[4790]: I0406 13:44:05.569713 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591378-qdwd6"] Apr 06 13:44:05 crc kubenswrapper[4790]: I0406 13:44:05.589103 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591378-qdwd6"] Apr 06 13:44:05 crc kubenswrapper[4790]: I0406 13:44:05.687268 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a82819-1618-4cd0-940a-39666ce35350" path="/var/lib/kubelet/pods/99a82819-1618-4cd0-940a-39666ce35350/volumes" Apr 06 13:44:07 crc kubenswrapper[4790]: I0406 13:44:07.098409 4790 generic.go:334] "Generic (PLEG): container finished" podID="9c2cd420-059d-4dd6-864d-ed13e6715244" containerID="0a6215a2722cec7fae0bd2cc36646cdca8d6147bc7027d5e6f9a92338dfafbb8" exitCode=0 Apr 06 13:44:07 crc kubenswrapper[4790]: I0406 13:44:07.098493 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m5z6z/crc-debug-dp94d" event={"ID":"9c2cd420-059d-4dd6-864d-ed13e6715244","Type":"ContainerDied","Data":"0a6215a2722cec7fae0bd2cc36646cdca8d6147bc7027d5e6f9a92338dfafbb8"} Apr 06 13:44:08 crc kubenswrapper[4790]: I0406 13:44:08.224809 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m5z6z/crc-debug-dp94d" Apr 06 13:44:08 crc kubenswrapper[4790]: I0406 13:44:08.275986 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m5z6z/crc-debug-dp94d"] Apr 06 13:44:08 crc kubenswrapper[4790]: I0406 13:44:08.291682 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m5z6z/crc-debug-dp94d"] Apr 06 13:44:08 crc kubenswrapper[4790]: I0406 13:44:08.360028 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kwhp\" (UniqueName: \"kubernetes.io/projected/9c2cd420-059d-4dd6-864d-ed13e6715244-kube-api-access-5kwhp\") pod \"9c2cd420-059d-4dd6-864d-ed13e6715244\" (UID: \"9c2cd420-059d-4dd6-864d-ed13e6715244\") " Apr 06 13:44:08 crc kubenswrapper[4790]: I0406 13:44:08.360299 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c2cd420-059d-4dd6-864d-ed13e6715244-host\") pod \"9c2cd420-059d-4dd6-864d-ed13e6715244\" (UID: \"9c2cd420-059d-4dd6-864d-ed13e6715244\") " Apr 06 13:44:08 crc kubenswrapper[4790]: I0406 13:44:08.360419 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c2cd420-059d-4dd6-864d-ed13e6715244-host" (OuterVolumeSpecName: "host") pod "9c2cd420-059d-4dd6-864d-ed13e6715244" (UID: "9c2cd420-059d-4dd6-864d-ed13e6715244"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 13:44:08 crc kubenswrapper[4790]: I0406 13:44:08.360864 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c2cd420-059d-4dd6-864d-ed13e6715244-host\") on node \"crc\" DevicePath \"\"" Apr 06 13:44:08 crc kubenswrapper[4790]: I0406 13:44:08.376113 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c2cd420-059d-4dd6-864d-ed13e6715244-kube-api-access-5kwhp" (OuterVolumeSpecName: "kube-api-access-5kwhp") pod "9c2cd420-059d-4dd6-864d-ed13e6715244" (UID: "9c2cd420-059d-4dd6-864d-ed13e6715244"). InnerVolumeSpecName "kube-api-access-5kwhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:44:08 crc kubenswrapper[4790]: I0406 13:44:08.463315 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kwhp\" (UniqueName: \"kubernetes.io/projected/9c2cd420-059d-4dd6-864d-ed13e6715244-kube-api-access-5kwhp\") on node \"crc\" DevicePath \"\"" Apr 06 13:44:09 crc kubenswrapper[4790]: I0406 13:44:09.116371 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bc122b1b9d2bc9b0c2c0f5b7d4cc0c6731d0c531be784598ca3b44cc4995609" Apr 06 13:44:09 crc kubenswrapper[4790]: I0406 13:44:09.116475 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m5z6z/crc-debug-dp94d" Apr 06 13:44:09 crc kubenswrapper[4790]: I0406 13:44:09.687473 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c2cd420-059d-4dd6-864d-ed13e6715244" path="/var/lib/kubelet/pods/9c2cd420-059d-4dd6-864d-ed13e6715244/volumes" Apr 06 13:44:09 crc kubenswrapper[4790]: I0406 13:44:09.779058 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m5z6z/crc-debug-hl4cj"] Apr 06 13:44:09 crc kubenswrapper[4790]: E0406 13:44:09.779501 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25452c01-bc42-4b7a-82c4-ea776612c95d" containerName="oc" Apr 06 13:44:09 crc kubenswrapper[4790]: I0406 13:44:09.779521 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="25452c01-bc42-4b7a-82c4-ea776612c95d" containerName="oc" Apr 06 13:44:09 crc kubenswrapper[4790]: E0406 13:44:09.779541 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c2cd420-059d-4dd6-864d-ed13e6715244" containerName="container-00" Apr 06 13:44:09 crc kubenswrapper[4790]: I0406 13:44:09.779547 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2cd420-059d-4dd6-864d-ed13e6715244" containerName="container-00" Apr 06 13:44:09 crc kubenswrapper[4790]: I0406 13:44:09.779728 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c2cd420-059d-4dd6-864d-ed13e6715244" containerName="container-00" Apr 06 13:44:09 crc kubenswrapper[4790]: I0406 13:44:09.779745 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="25452c01-bc42-4b7a-82c4-ea776612c95d" containerName="oc" Apr 06 13:44:09 crc kubenswrapper[4790]: I0406 13:44:09.780425 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m5z6z/crc-debug-hl4cj" Apr 06 13:44:09 crc kubenswrapper[4790]: I0406 13:44:09.890671 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjzvt\" (UniqueName: \"kubernetes.io/projected/0ff03cec-1642-4e6b-a2f6-a19c7aa24b29-kube-api-access-pjzvt\") pod \"crc-debug-hl4cj\" (UID: \"0ff03cec-1642-4e6b-a2f6-a19c7aa24b29\") " pod="openshift-must-gather-m5z6z/crc-debug-hl4cj" Apr 06 13:44:09 crc kubenswrapper[4790]: I0406 13:44:09.890789 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ff03cec-1642-4e6b-a2f6-a19c7aa24b29-host\") pod \"crc-debug-hl4cj\" (UID: \"0ff03cec-1642-4e6b-a2f6-a19c7aa24b29\") " pod="openshift-must-gather-m5z6z/crc-debug-hl4cj" Apr 06 13:44:09 crc kubenswrapper[4790]: I0406 13:44:09.993196 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjzvt\" (UniqueName: \"kubernetes.io/projected/0ff03cec-1642-4e6b-a2f6-a19c7aa24b29-kube-api-access-pjzvt\") pod \"crc-debug-hl4cj\" (UID: \"0ff03cec-1642-4e6b-a2f6-a19c7aa24b29\") " pod="openshift-must-gather-m5z6z/crc-debug-hl4cj" Apr 06 13:44:09 crc kubenswrapper[4790]: I0406 13:44:09.993277 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ff03cec-1642-4e6b-a2f6-a19c7aa24b29-host\") pod \"crc-debug-hl4cj\" (UID: \"0ff03cec-1642-4e6b-a2f6-a19c7aa24b29\") " pod="openshift-must-gather-m5z6z/crc-debug-hl4cj" Apr 06 13:44:09 crc kubenswrapper[4790]: I0406 13:44:09.993412 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ff03cec-1642-4e6b-a2f6-a19c7aa24b29-host\") pod \"crc-debug-hl4cj\" (UID: \"0ff03cec-1642-4e6b-a2f6-a19c7aa24b29\") " pod="openshift-must-gather-m5z6z/crc-debug-hl4cj" Apr 06 13:44:10 crc 
kubenswrapper[4790]: I0406 13:44:10.016390 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjzvt\" (UniqueName: \"kubernetes.io/projected/0ff03cec-1642-4e6b-a2f6-a19c7aa24b29-kube-api-access-pjzvt\") pod \"crc-debug-hl4cj\" (UID: \"0ff03cec-1642-4e6b-a2f6-a19c7aa24b29\") " pod="openshift-must-gather-m5z6z/crc-debug-hl4cj" Apr 06 13:44:10 crc kubenswrapper[4790]: I0406 13:44:10.103087 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m5z6z/crc-debug-hl4cj" Apr 06 13:44:11 crc kubenswrapper[4790]: I0406 13:44:11.143686 4790 generic.go:334] "Generic (PLEG): container finished" podID="0ff03cec-1642-4e6b-a2f6-a19c7aa24b29" containerID="3f29bf59f2b6ef3d384ca6d05e8fc05152adad7be6d1447221db42c3d4b8ff94" exitCode=0 Apr 06 13:44:11 crc kubenswrapper[4790]: I0406 13:44:11.143750 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m5z6z/crc-debug-hl4cj" event={"ID":"0ff03cec-1642-4e6b-a2f6-a19c7aa24b29","Type":"ContainerDied","Data":"3f29bf59f2b6ef3d384ca6d05e8fc05152adad7be6d1447221db42c3d4b8ff94"} Apr 06 13:44:11 crc kubenswrapper[4790]: I0406 13:44:11.144070 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m5z6z/crc-debug-hl4cj" event={"ID":"0ff03cec-1642-4e6b-a2f6-a19c7aa24b29","Type":"ContainerStarted","Data":"4a8efa900cb7d3597a8729aa32d6d7c0ad0f4894063b8fe122832f59702ea9c0"} Apr 06 13:44:12 crc kubenswrapper[4790]: I0406 13:44:12.257058 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m5z6z/crc-debug-hl4cj" Apr 06 13:44:12 crc kubenswrapper[4790]: I0406 13:44:12.348350 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ff03cec-1642-4e6b-a2f6-a19c7aa24b29-host\") pod \"0ff03cec-1642-4e6b-a2f6-a19c7aa24b29\" (UID: \"0ff03cec-1642-4e6b-a2f6-a19c7aa24b29\") " Apr 06 13:44:12 crc kubenswrapper[4790]: I0406 13:44:12.348452 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjzvt\" (UniqueName: \"kubernetes.io/projected/0ff03cec-1642-4e6b-a2f6-a19c7aa24b29-kube-api-access-pjzvt\") pod \"0ff03cec-1642-4e6b-a2f6-a19c7aa24b29\" (UID: \"0ff03cec-1642-4e6b-a2f6-a19c7aa24b29\") " Apr 06 13:44:12 crc kubenswrapper[4790]: I0406 13:44:12.348926 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ff03cec-1642-4e6b-a2f6-a19c7aa24b29-host" (OuterVolumeSpecName: "host") pod "0ff03cec-1642-4e6b-a2f6-a19c7aa24b29" (UID: "0ff03cec-1642-4e6b-a2f6-a19c7aa24b29"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 13:44:12 crc kubenswrapper[4790]: I0406 13:44:12.350470 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0ff03cec-1642-4e6b-a2f6-a19c7aa24b29-host\") on node \"crc\" DevicePath \"\"" Apr 06 13:44:12 crc kubenswrapper[4790]: I0406 13:44:12.370563 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ff03cec-1642-4e6b-a2f6-a19c7aa24b29-kube-api-access-pjzvt" (OuterVolumeSpecName: "kube-api-access-pjzvt") pod "0ff03cec-1642-4e6b-a2f6-a19c7aa24b29" (UID: "0ff03cec-1642-4e6b-a2f6-a19c7aa24b29"). InnerVolumeSpecName "kube-api-access-pjzvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:44:12 crc kubenswrapper[4790]: I0406 13:44:12.453993 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjzvt\" (UniqueName: \"kubernetes.io/projected/0ff03cec-1642-4e6b-a2f6-a19c7aa24b29-kube-api-access-pjzvt\") on node \"crc\" DevicePath \"\"" Apr 06 13:44:13 crc kubenswrapper[4790]: I0406 13:44:13.164626 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m5z6z/crc-debug-hl4cj" event={"ID":"0ff03cec-1642-4e6b-a2f6-a19c7aa24b29","Type":"ContainerDied","Data":"4a8efa900cb7d3597a8729aa32d6d7c0ad0f4894063b8fe122832f59702ea9c0"} Apr 06 13:44:13 crc kubenswrapper[4790]: I0406 13:44:13.164676 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a8efa900cb7d3597a8729aa32d6d7c0ad0f4894063b8fe122832f59702ea9c0" Apr 06 13:44:13 crc kubenswrapper[4790]: I0406 13:44:13.164745 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m5z6z/crc-debug-hl4cj" Apr 06 13:44:13 crc kubenswrapper[4790]: I0406 13:44:13.277110 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m5z6z/crc-debug-hl4cj"] Apr 06 13:44:13 crc kubenswrapper[4790]: I0406 13:44:13.288003 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m5z6z/crc-debug-hl4cj"] Apr 06 13:44:13 crc kubenswrapper[4790]: I0406 13:44:13.688705 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ff03cec-1642-4e6b-a2f6-a19c7aa24b29" path="/var/lib/kubelet/pods/0ff03cec-1642-4e6b-a2f6-a19c7aa24b29/volumes" Apr 06 13:44:14 crc kubenswrapper[4790]: I0406 13:44:14.497422 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m5z6z/crc-debug-v6s4j"] Apr 06 13:44:14 crc kubenswrapper[4790]: E0406 13:44:14.498138 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff03cec-1642-4e6b-a2f6-a19c7aa24b29" 
containerName="container-00" Apr 06 13:44:14 crc kubenswrapper[4790]: I0406 13:44:14.498151 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff03cec-1642-4e6b-a2f6-a19c7aa24b29" containerName="container-00" Apr 06 13:44:14 crc kubenswrapper[4790]: I0406 13:44:14.498360 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff03cec-1642-4e6b-a2f6-a19c7aa24b29" containerName="container-00" Apr 06 13:44:14 crc kubenswrapper[4790]: I0406 13:44:14.499738 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m5z6z/crc-debug-v6s4j" Apr 06 13:44:14 crc kubenswrapper[4790]: I0406 13:44:14.593954 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84vw2\" (UniqueName: \"kubernetes.io/projected/38ae3019-0b91-4832-b65b-5967cecde29b-kube-api-access-84vw2\") pod \"crc-debug-v6s4j\" (UID: \"38ae3019-0b91-4832-b65b-5967cecde29b\") " pod="openshift-must-gather-m5z6z/crc-debug-v6s4j" Apr 06 13:44:14 crc kubenswrapper[4790]: I0406 13:44:14.594198 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38ae3019-0b91-4832-b65b-5967cecde29b-host\") pod \"crc-debug-v6s4j\" (UID: \"38ae3019-0b91-4832-b65b-5967cecde29b\") " pod="openshift-must-gather-m5z6z/crc-debug-v6s4j" Apr 06 13:44:14 crc kubenswrapper[4790]: I0406 13:44:14.695738 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84vw2\" (UniqueName: \"kubernetes.io/projected/38ae3019-0b91-4832-b65b-5967cecde29b-kube-api-access-84vw2\") pod \"crc-debug-v6s4j\" (UID: \"38ae3019-0b91-4832-b65b-5967cecde29b\") " pod="openshift-must-gather-m5z6z/crc-debug-v6s4j" Apr 06 13:44:14 crc kubenswrapper[4790]: I0406 13:44:14.695884 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/38ae3019-0b91-4832-b65b-5967cecde29b-host\") pod \"crc-debug-v6s4j\" (UID: \"38ae3019-0b91-4832-b65b-5967cecde29b\") " pod="openshift-must-gather-m5z6z/crc-debug-v6s4j" Apr 06 13:44:14 crc kubenswrapper[4790]: I0406 13:44:14.696068 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38ae3019-0b91-4832-b65b-5967cecde29b-host\") pod \"crc-debug-v6s4j\" (UID: \"38ae3019-0b91-4832-b65b-5967cecde29b\") " pod="openshift-must-gather-m5z6z/crc-debug-v6s4j" Apr 06 13:44:14 crc kubenswrapper[4790]: I0406 13:44:14.718485 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84vw2\" (UniqueName: \"kubernetes.io/projected/38ae3019-0b91-4832-b65b-5967cecde29b-kube-api-access-84vw2\") pod \"crc-debug-v6s4j\" (UID: \"38ae3019-0b91-4832-b65b-5967cecde29b\") " pod="openshift-must-gather-m5z6z/crc-debug-v6s4j" Apr 06 13:44:14 crc kubenswrapper[4790]: I0406 13:44:14.840705 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m5z6z/crc-debug-v6s4j" Apr 06 13:44:15 crc kubenswrapper[4790]: I0406 13:44:15.182080 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m5z6z/crc-debug-v6s4j" event={"ID":"38ae3019-0b91-4832-b65b-5967cecde29b","Type":"ContainerStarted","Data":"9aa25aa4d87dc19036bf0f3d45732fdee8376d0d0e53b7a67cdba143d7febd5f"} Apr 06 13:44:16 crc kubenswrapper[4790]: I0406 13:44:16.192984 4790 generic.go:334] "Generic (PLEG): container finished" podID="38ae3019-0b91-4832-b65b-5967cecde29b" containerID="134522d2bc9cc419eab04113d3b3e7411747a3c5e74b78f70ee68be82d921c84" exitCode=0 Apr 06 13:44:16 crc kubenswrapper[4790]: I0406 13:44:16.193703 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m5z6z/crc-debug-v6s4j" event={"ID":"38ae3019-0b91-4832-b65b-5967cecde29b","Type":"ContainerDied","Data":"134522d2bc9cc419eab04113d3b3e7411747a3c5e74b78f70ee68be82d921c84"} Apr 06 13:44:16 crc kubenswrapper[4790]: I0406 13:44:16.234320 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m5z6z/crc-debug-v6s4j"] Apr 06 13:44:16 crc kubenswrapper[4790]: I0406 13:44:16.245934 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m5z6z/crc-debug-v6s4j"] Apr 06 13:44:17 crc kubenswrapper[4790]: I0406 13:44:17.319382 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m5z6z/crc-debug-v6s4j" Apr 06 13:44:17 crc kubenswrapper[4790]: I0406 13:44:17.401765 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84vw2\" (UniqueName: \"kubernetes.io/projected/38ae3019-0b91-4832-b65b-5967cecde29b-kube-api-access-84vw2\") pod \"38ae3019-0b91-4832-b65b-5967cecde29b\" (UID: \"38ae3019-0b91-4832-b65b-5967cecde29b\") " Apr 06 13:44:17 crc kubenswrapper[4790]: I0406 13:44:17.401862 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38ae3019-0b91-4832-b65b-5967cecde29b-host\") pod \"38ae3019-0b91-4832-b65b-5967cecde29b\" (UID: \"38ae3019-0b91-4832-b65b-5967cecde29b\") " Apr 06 13:44:17 crc kubenswrapper[4790]: I0406 13:44:17.402025 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38ae3019-0b91-4832-b65b-5967cecde29b-host" (OuterVolumeSpecName: "host") pod "38ae3019-0b91-4832-b65b-5967cecde29b" (UID: "38ae3019-0b91-4832-b65b-5967cecde29b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Apr 06 13:44:17 crc kubenswrapper[4790]: I0406 13:44:17.402295 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38ae3019-0b91-4832-b65b-5967cecde29b-host\") on node \"crc\" DevicePath \"\"" Apr 06 13:44:17 crc kubenswrapper[4790]: I0406 13:44:17.412653 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ae3019-0b91-4832-b65b-5967cecde29b-kube-api-access-84vw2" (OuterVolumeSpecName: "kube-api-access-84vw2") pod "38ae3019-0b91-4832-b65b-5967cecde29b" (UID: "38ae3019-0b91-4832-b65b-5967cecde29b"). InnerVolumeSpecName "kube-api-access-84vw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:44:17 crc kubenswrapper[4790]: I0406 13:44:17.504200 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84vw2\" (UniqueName: \"kubernetes.io/projected/38ae3019-0b91-4832-b65b-5967cecde29b-kube-api-access-84vw2\") on node \"crc\" DevicePath \"\"" Apr 06 13:44:17 crc kubenswrapper[4790]: I0406 13:44:17.686076 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38ae3019-0b91-4832-b65b-5967cecde29b" path="/var/lib/kubelet/pods/38ae3019-0b91-4832-b65b-5967cecde29b/volumes" Apr 06 13:44:18 crc kubenswrapper[4790]: I0406 13:44:18.212326 4790 scope.go:117] "RemoveContainer" containerID="134522d2bc9cc419eab04113d3b3e7411747a3c5e74b78f70ee68be82d921c84" Apr 06 13:44:18 crc kubenswrapper[4790]: I0406 13:44:18.212350 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m5z6z/crc-debug-v6s4j" Apr 06 13:44:20 crc kubenswrapper[4790]: I0406 13:44:20.521580 4790 scope.go:117] "RemoveContainer" containerID="90e2de3c744c434551fe298b6c4eda6c10785983079ec1a8d6d81f21b7a12a8c" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.152499 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b"] Apr 06 13:45:00 crc kubenswrapper[4790]: E0406 13:45:00.153544 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ae3019-0b91-4832-b65b-5967cecde29b" containerName="container-00" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.153562 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ae3019-0b91-4832-b65b-5967cecde29b" containerName="container-00" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.153918 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="38ae3019-0b91-4832-b65b-5967cecde29b" containerName="container-00" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.154849 4790 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.156706 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.162090 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.165815 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b"] Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.252520 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sfrn\" (UniqueName: \"kubernetes.io/projected/5fdc6849-6d79-4327-a2db-3a49079d39ca-kube-api-access-7sfrn\") pod \"collect-profiles-29591385-mz22b\" (UID: \"5fdc6849-6d79-4327-a2db-3a49079d39ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.252633 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fdc6849-6d79-4327-a2db-3a49079d39ca-config-volume\") pod \"collect-profiles-29591385-mz22b\" (UID: \"5fdc6849-6d79-4327-a2db-3a49079d39ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.252659 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fdc6849-6d79-4327-a2db-3a49079d39ca-secret-volume\") pod \"collect-profiles-29591385-mz22b\" (UID: \"5fdc6849-6d79-4327-a2db-3a49079d39ca\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.354297 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sfrn\" (UniqueName: \"kubernetes.io/projected/5fdc6849-6d79-4327-a2db-3a49079d39ca-kube-api-access-7sfrn\") pod \"collect-profiles-29591385-mz22b\" (UID: \"5fdc6849-6d79-4327-a2db-3a49079d39ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.354443 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fdc6849-6d79-4327-a2db-3a49079d39ca-config-volume\") pod \"collect-profiles-29591385-mz22b\" (UID: \"5fdc6849-6d79-4327-a2db-3a49079d39ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.354481 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fdc6849-6d79-4327-a2db-3a49079d39ca-secret-volume\") pod \"collect-profiles-29591385-mz22b\" (UID: \"5fdc6849-6d79-4327-a2db-3a49079d39ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.355811 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fdc6849-6d79-4327-a2db-3a49079d39ca-config-volume\") pod \"collect-profiles-29591385-mz22b\" (UID: \"5fdc6849-6d79-4327-a2db-3a49079d39ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.365291 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5fdc6849-6d79-4327-a2db-3a49079d39ca-secret-volume\") pod \"collect-profiles-29591385-mz22b\" (UID: \"5fdc6849-6d79-4327-a2db-3a49079d39ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.382193 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sfrn\" (UniqueName: \"kubernetes.io/projected/5fdc6849-6d79-4327-a2db-3a49079d39ca-kube-api-access-7sfrn\") pod \"collect-profiles-29591385-mz22b\" (UID: \"5fdc6849-6d79-4327-a2db-3a49079d39ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" Apr 06 13:45:00 crc kubenswrapper[4790]: I0406 13:45:00.495140 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" Apr 06 13:45:01 crc kubenswrapper[4790]: I0406 13:45:01.001514 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b"] Apr 06 13:45:01 crc kubenswrapper[4790]: I0406 13:45:01.697575 4790 generic.go:334] "Generic (PLEG): container finished" podID="5fdc6849-6d79-4327-a2db-3a49079d39ca" containerID="4c6538f28247541b41916e3190276772052c5a35d6bd0ac3d68b31aa8ceca128" exitCode=0 Apr 06 13:45:01 crc kubenswrapper[4790]: I0406 13:45:01.844621 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" event={"ID":"5fdc6849-6d79-4327-a2db-3a49079d39ca","Type":"ContainerDied","Data":"4c6538f28247541b41916e3190276772052c5a35d6bd0ac3d68b31aa8ceca128"} Apr 06 13:45:01 crc kubenswrapper[4790]: I0406 13:45:01.844899 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" 
event={"ID":"5fdc6849-6d79-4327-a2db-3a49079d39ca","Type":"ContainerStarted","Data":"33468ab7fc6ca44618362f6f7b31206c1d8d861698b69f1afae45b0306252d3b"} Apr 06 13:45:02 crc kubenswrapper[4790]: I0406 13:45:02.384707 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-57899578c6-rh848_e2ee43de-a608-4710-a58d-60d49845cb7c/barbican-api/0.log" Apr 06 13:45:02 crc kubenswrapper[4790]: I0406 13:45:02.544326 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-57899578c6-rh848_e2ee43de-a608-4710-a58d-60d49845cb7c/barbican-api-log/0.log" Apr 06 13:45:02 crc kubenswrapper[4790]: I0406 13:45:02.559697 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cbdd57f54-rd6dk_d431242a-f2f0-4780-85d0-9f2cfc8573ac/barbican-keystone-listener/0.log" Apr 06 13:45:02 crc kubenswrapper[4790]: I0406 13:45:02.722024 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7cbdd57f54-rd6dk_d431242a-f2f0-4780-85d0-9f2cfc8573ac/barbican-keystone-listener-log/0.log" Apr 06 13:45:02 crc kubenswrapper[4790]: I0406 13:45:02.757135 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d568b5b57-5x8cp_2f7309c8-fdde-4e0e-9efa-ece286501ec5/barbican-worker/0.log" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:02.859936 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d568b5b57-5x8cp_2f7309c8-fdde-4e0e-9efa-ece286501ec5/barbican-worker-log/0.log" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.055008 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.160280 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sfrn\" (UniqueName: \"kubernetes.io/projected/5fdc6849-6d79-4327-a2db-3a49079d39ca-kube-api-access-7sfrn\") pod \"5fdc6849-6d79-4327-a2db-3a49079d39ca\" (UID: \"5fdc6849-6d79-4327-a2db-3a49079d39ca\") " Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.160485 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fdc6849-6d79-4327-a2db-3a49079d39ca-secret-volume\") pod \"5fdc6849-6d79-4327-a2db-3a49079d39ca\" (UID: \"5fdc6849-6d79-4327-a2db-3a49079d39ca\") " Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.160533 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fdc6849-6d79-4327-a2db-3a49079d39ca-config-volume\") pod \"5fdc6849-6d79-4327-a2db-3a49079d39ca\" (UID: \"5fdc6849-6d79-4327-a2db-3a49079d39ca\") " Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.161682 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fdc6849-6d79-4327-a2db-3a49079d39ca-config-volume" (OuterVolumeSpecName: "config-volume") pod "5fdc6849-6d79-4327-a2db-3a49079d39ca" (UID: "5fdc6849-6d79-4327-a2db-3a49079d39ca"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.168011 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fdc6849-6d79-4327-a2db-3a49079d39ca-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5fdc6849-6d79-4327-a2db-3a49079d39ca" (UID: "5fdc6849-6d79-4327-a2db-3a49079d39ca"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.172355 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fdc6849-6d79-4327-a2db-3a49079d39ca-kube-api-access-7sfrn" (OuterVolumeSpecName: "kube-api-access-7sfrn") pod "5fdc6849-6d79-4327-a2db-3a49079d39ca" (UID: "5fdc6849-6d79-4327-a2db-3a49079d39ca"). InnerVolumeSpecName "kube-api-access-7sfrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.228567 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6da6352b-82da-47b7-afe2-44baa8d546d3/ceilometer-central-agent/0.log" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.262802 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fdc6849-6d79-4327-a2db-3a49079d39ca-secret-volume\") on node \"crc\" DevicePath \"\"" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.262847 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fdc6849-6d79-4327-a2db-3a49079d39ca-config-volume\") on node \"crc\" DevicePath \"\"" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.262859 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sfrn\" (UniqueName: \"kubernetes.io/projected/5fdc6849-6d79-4327-a2db-3a49079d39ca-kube-api-access-7sfrn\") on node \"crc\" DevicePath \"\"" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.350587 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6da6352b-82da-47b7-afe2-44baa8d546d3/proxy-httpd/0.log" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.407688 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-dwh7n_34fd07c3-5b1c-440c-a0d0-3d9423f40cc8/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.412689 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6da6352b-82da-47b7-afe2-44baa8d546d3/ceilometer-notification-agent/0.log" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.428811 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6da6352b-82da-47b7-afe2-44baa8d546d3/sg-core/0.log" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.699621 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6acbb602-fcaa-448f-bc7a-49a2ac2bb979/cinder-api-log/0.log" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.718319 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" event={"ID":"5fdc6849-6d79-4327-a2db-3a49079d39ca","Type":"ContainerDied","Data":"33468ab7fc6ca44618362f6f7b31206c1d8d861698b69f1afae45b0306252d3b"} Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.718365 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33468ab7fc6ca44618362f6f7b31206c1d8d861698b69f1afae45b0306252d3b" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.718425 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29591385-mz22b" Apr 06 13:45:03 crc kubenswrapper[4790]: I0406 13:45:03.958309 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_cd529dba-04e1-45bf-9a0a-69fd93502cd9/probe/0.log" Apr 06 13:45:04 crc kubenswrapper[4790]: I0406 13:45:04.136073 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp"] Apr 06 13:45:04 crc kubenswrapper[4790]: I0406 13:45:04.148708 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29591340-grtzp"] Apr 06 13:45:04 crc kubenswrapper[4790]: I0406 13:45:04.284213 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_207e3a4b-b763-47f7-b2f7-b25c8c929af5/cinder-scheduler/0.log" Apr 06 13:45:04 crc kubenswrapper[4790]: I0406 13:45:04.403239 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_207e3a4b-b763-47f7-b2f7-b25c8c929af5/probe/0.log" Apr 06 13:45:04 crc kubenswrapper[4790]: I0406 13:45:04.821915 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_f77070f6-164c-4bec-aafa-6126ca005702/probe/0.log" Apr 06 13:45:05 crc kubenswrapper[4790]: I0406 13:45:05.222676 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_cd529dba-04e1-45bf-9a0a-69fd93502cd9/cinder-backup/0.log" Apr 06 13:45:05 crc kubenswrapper[4790]: I0406 13:45:05.562435 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_da98578a-8aaa-403a-8f8e-4c7115cfa2cb/probe/0.log" Apr 06 13:45:05 crc kubenswrapper[4790]: I0406 13:45:05.687271 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f69b634-b80e-4537-87d4-c5b827de18ac" path="/var/lib/kubelet/pods/2f69b634-b80e-4537-87d4-c5b827de18ac/volumes" Apr 06 
13:45:05 crc kubenswrapper[4790]: I0406 13:45:05.796127 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_6acbb602-fcaa-448f-bc7a-49a2ac2bb979/cinder-api/0.log" Apr 06 13:45:05 crc kubenswrapper[4790]: I0406 13:45:05.961497 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_f77070f6-164c-4bec-aafa-6126ca005702/cinder-volume/0.log" Apr 06 13:45:06 crc kubenswrapper[4790]: I0406 13:45:06.328122 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-pwmmp_4e61f8b4-0263-4224-a5e0-b34740fbca06/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:06 crc kubenswrapper[4790]: I0406 13:45:06.367005 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_da98578a-8aaa-403a-8f8e-4c7115cfa2cb/cinder-volume/0.log" Apr 06 13:45:06 crc kubenswrapper[4790]: I0406 13:45:06.700882 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85ff4b9c47-92mqr_13316504-6091-438b-8386-eb10fd6c7ce4/init/0.log" Apr 06 13:45:06 crc kubenswrapper[4790]: I0406 13:45:06.745470 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-fglxr_f694367f-e4c0-49b1-99f2-f22624011595/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:07 crc kubenswrapper[4790]: I0406 13:45:07.028145 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85ff4b9c47-92mqr_13316504-6091-438b-8386-eb10fd6c7ce4/init/0.log" Apr 06 13:45:07 crc kubenswrapper[4790]: I0406 13:45:07.064327 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-b7ngt_f73a2e40-f5e3-4e0e-9244-c076b36e911e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:07 crc kubenswrapper[4790]: I0406 13:45:07.244962 4790 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85ff4b9c47-92mqr_13316504-6091-438b-8386-eb10fd6c7ce4/dnsmasq-dns/0.log" Apr 06 13:45:07 crc kubenswrapper[4790]: I0406 13:45:07.283784 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e3c46af-c7be-417d-9a92-454f74da7a82/glance-log/0.log" Apr 06 13:45:07 crc kubenswrapper[4790]: I0406 13:45:07.330551 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e3c46af-c7be-417d-9a92-454f74da7a82/glance-httpd/0.log" Apr 06 13:45:07 crc kubenswrapper[4790]: I0406 13:45:07.393389 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7f86cbe3-945a-4c2a-8986-aa0443e28b95/glance-httpd/0.log" Apr 06 13:45:07 crc kubenswrapper[4790]: I0406 13:45:07.462080 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7f86cbe3-945a-4c2a-8986-aa0443e28b95/glance-log/0.log" Apr 06 13:45:07 crc kubenswrapper[4790]: I0406 13:45:07.639355 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-p6pc9_eb888063-8b8f-43f5-ba22-d7fc374a1bbf/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:07 crc kubenswrapper[4790]: I0406 13:45:07.858174 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29591341-jdpjh_7657bf4f-84d6-4cc0-97da-ac70e2aa07de/keystone-cron/0.log" Apr 06 13:45:08 crc kubenswrapper[4790]: I0406 13:45:08.111416 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_d656612b-aaad-4d40-bc05-3aae06b509f3/kube-state-metrics/0.log" Apr 06 13:45:08 crc kubenswrapper[4790]: I0406 13:45:08.523094 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-b9b7c8b58-lzkxg_b215a7cd-f428-4fbd-adbc-307b6c905894/keystone-api/0.log" 
Apr 06 13:45:08 crc kubenswrapper[4790]: I0406 13:45:08.602090 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-fph2d_500a529e-70f2-4749-8364-d6a6230b0030/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.052877 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-bbf455d7c-c2ssf_cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df/neutron-httpd/0.log" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.074159 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dx6b5"] Apr 06 13:45:09 crc kubenswrapper[4790]: E0406 13:45:09.074705 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fdc6849-6d79-4327-a2db-3a49079d39ca" containerName="collect-profiles" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.074728 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fdc6849-6d79-4327-a2db-3a49079d39ca" containerName="collect-profiles" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.074995 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fdc6849-6d79-4327-a2db-3a49079d39ca" containerName="collect-profiles" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.076853 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.090157 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dx6b5"] Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.213561 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-bbf455d7c-c2ssf_cf1c464e-5f26-4b8f-a4a0-8555a4c8c7df/neutron-api/0.log" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.227572 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kxhm\" (UniqueName: \"kubernetes.io/projected/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-kube-api-access-8kxhm\") pod \"certified-operators-dx6b5\" (UID: \"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\") " pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.227626 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-catalog-content\") pod \"certified-operators-dx6b5\" (UID: \"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\") " pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.227701 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-utilities\") pod \"certified-operators-dx6b5\" (UID: \"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\") " pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.332285 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kxhm\" (UniqueName: \"kubernetes.io/projected/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-kube-api-access-8kxhm\") pod 
\"certified-operators-dx6b5\" (UID: \"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\") " pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.332334 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-catalog-content\") pod \"certified-operators-dx6b5\" (UID: \"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\") " pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.332365 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-utilities\") pod \"certified-operators-dx6b5\" (UID: \"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\") " pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.333216 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-utilities\") pod \"certified-operators-dx6b5\" (UID: \"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\") " pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.335019 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-catalog-content\") pod \"certified-operators-dx6b5\" (UID: \"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\") " pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.353193 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kxhm\" (UniqueName: \"kubernetes.io/projected/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-kube-api-access-8kxhm\") pod \"certified-operators-dx6b5\" (UID: 
\"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\") " pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.417883 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.538533 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-wklc4_9a366552-b5b9-4a9c-92a2-8b63981f5520/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.570788 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_586bc227-b2c5-4ead-88f4-fe18c5c28d41/setup-container/0.log" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.753582 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.753629 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.797684 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-48qfz_75d5d9f7-8482-4fb7-a536-55656709bec2/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:09 crc kubenswrapper[4790]: I0406 13:45:09.973624 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-dx6b5"] Apr 06 13:45:10 crc kubenswrapper[4790]: I0406 13:45:10.091179 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_586bc227-b2c5-4ead-88f4-fe18c5c28d41/setup-container/0.log" Apr 06 13:45:10 crc kubenswrapper[4790]: I0406 13:45:10.159077 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_notifications-rabbitmq-server-0_586bc227-b2c5-4ead-88f4-fe18c5c28d41/rabbitmq/0.log" Apr 06 13:45:10 crc kubenswrapper[4790]: I0406 13:45:10.813790 4790 generic.go:334] "Generic (PLEG): container finished" podID="cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" containerID="66c5c806070fd20668e3c43122b5453673df18d06f106cb7f7b8d53e53088938" exitCode=0 Apr 06 13:45:10 crc kubenswrapper[4790]: I0406 13:45:10.814057 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx6b5" event={"ID":"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17","Type":"ContainerDied","Data":"66c5c806070fd20668e3c43122b5453673df18d06f106cb7f7b8d53e53088938"} Apr 06 13:45:10 crc kubenswrapper[4790]: I0406 13:45:10.814098 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx6b5" event={"ID":"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17","Type":"ContainerStarted","Data":"be884d95af75fcb1b53863efa03f8d5317d6d271a8d2bc650b9ebb8dcaac3c7a"} Apr 06 13:45:10 crc kubenswrapper[4790]: I0406 13:45:10.817151 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 13:45:10 crc kubenswrapper[4790]: I0406 13:45:10.888195 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_001ce2db-6829-4dcc-bf3a-b19134cd3484/nova-cell0-conductor-conductor/0.log" Apr 06 13:45:11 crc kubenswrapper[4790]: I0406 13:45:11.396791 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_96ad1416-d1af-42b2-8fae-68574044a5e6/nova-cell1-conductor-conductor/0.log" Apr 06 13:45:11 crc kubenswrapper[4790]: I0406 13:45:11.585508 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8c07d38c-a3ad-48d8-948c-7659351eade5/nova-cell1-novncproxy-novncproxy/0.log" Apr 06 13:45:11 crc kubenswrapper[4790]: I0406 13:45:11.830497 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx6b5" event={"ID":"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17","Type":"ContainerStarted","Data":"ec1dc2ec2f1ef5e73126241fb8400ffec4d3c72c751a7e597868ac67737a7331"} Apr 06 13:45:12 crc kubenswrapper[4790]: I0406 13:45:12.126213 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2cd7a2b3-4c64-4d09-9865-cd55277fd369/nova-api-log/0.log" Apr 06 13:45:12 crc kubenswrapper[4790]: I0406 13:45:12.236752 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8fa33d66-ad99-4650-bc60-a97e16cbd064/nova-metadata-log/0.log" Apr 06 13:45:13 crc kubenswrapper[4790]: I0406 13:45:13.167144 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-w9z4v_9be120eb-568b-4ab4-af61-b92818e7e6ad/nova-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:13 crc kubenswrapper[4790]: I0406 13:45:13.169534 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a73b707d-e57e-4f4c-a253-38e55128a1b2/nova-scheduler-scheduler/0.log" Apr 06 13:45:13 crc kubenswrapper[4790]: I0406 13:45:13.332093 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2cd7a2b3-4c64-4d09-9865-cd55277fd369/nova-api-api/0.log" Apr 06 13:45:13 crc kubenswrapper[4790]: I0406 13:45:13.406558 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_8fa33d66-ad99-4650-bc60-a97e16cbd064/nova-metadata-metadata/0.log" Apr 06 13:45:13 crc kubenswrapper[4790]: I0406 13:45:13.470559 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_69e97903-5aa8-4523-ae3c-3f10b031ad20/mysql-bootstrap/0.log" Apr 06 13:45:13 crc kubenswrapper[4790]: I0406 13:45:13.644376 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_69e97903-5aa8-4523-ae3c-3f10b031ad20/galera/0.log" Apr 06 13:45:13 crc kubenswrapper[4790]: I0406 13:45:13.671455 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_69e97903-5aa8-4523-ae3c-3f10b031ad20/mysql-bootstrap/0.log" Apr 06 13:45:13 crc kubenswrapper[4790]: I0406 13:45:13.731863 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6d6dc6ce-5627-454a-af1c-7a20bed8bfc4/mysql-bootstrap/0.log" Apr 06 13:45:13 crc kubenswrapper[4790]: I0406 13:45:13.848115 4790 generic.go:334] "Generic (PLEG): container finished" podID="cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" containerID="ec1dc2ec2f1ef5e73126241fb8400ffec4d3c72c751a7e597868ac67737a7331" exitCode=0 Apr 06 13:45:13 crc kubenswrapper[4790]: I0406 13:45:13.848164 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx6b5" event={"ID":"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17","Type":"ContainerDied","Data":"ec1dc2ec2f1ef5e73126241fb8400ffec4d3c72c751a7e597868ac67737a7331"} Apr 06 13:45:13 crc kubenswrapper[4790]: I0406 13:45:13.959171 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6d6dc6ce-5627-454a-af1c-7a20bed8bfc4/mysql-bootstrap/0.log" Apr 06 13:45:14 crc kubenswrapper[4790]: I0406 13:45:14.003162 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6d6dc6ce-5627-454a-af1c-7a20bed8bfc4/galera/0.log" Apr 06 
13:45:14 crc kubenswrapper[4790]: I0406 13:45:14.018362 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_1eea2e2a-d9c2-46e2-96a9-827fcf5a075f/openstackclient/0.log" Apr 06 13:45:14 crc kubenswrapper[4790]: I0406 13:45:14.234443 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-lw6ch_77333973-0908-43d8-8105-0c3b3e5cdecb/openstack-network-exporter/0.log" Apr 06 13:45:14 crc kubenswrapper[4790]: I0406 13:45:14.251848 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4xbzr_3c164024-c78c-461b-90fc-afbac0b3a682/ovsdb-server-init/0.log" Apr 06 13:45:14 crc kubenswrapper[4790]: I0406 13:45:14.457065 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4xbzr_3c164024-c78c-461b-90fc-afbac0b3a682/ovsdb-server-init/0.log" Apr 06 13:45:14 crc kubenswrapper[4790]: I0406 13:45:14.604599 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4xbzr_3c164024-c78c-461b-90fc-afbac0b3a682/ovsdb-server/0.log" Apr 06 13:45:14 crc kubenswrapper[4790]: I0406 13:45:14.744595 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-pr4b9_51949d72-301c-4426-8397-273f6b2ecabd/ovn-controller/0.log" Apr 06 13:45:14 crc kubenswrapper[4790]: I0406 13:45:14.792351 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4xbzr_3c164024-c78c-461b-90fc-afbac0b3a682/ovs-vswitchd/0.log" Apr 06 13:45:14 crc kubenswrapper[4790]: I0406 13:45:14.864203 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx6b5" event={"ID":"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17","Type":"ContainerStarted","Data":"141bff306471ccbfb45f5ff753ec8b27cf5d0c6a7034024947d9c3b81d944890"} Apr 06 13:45:14 crc kubenswrapper[4790]: I0406 13:45:14.888571 4790 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/certified-operators-dx6b5" podStartSLOduration=2.442734393 podStartE2EDuration="5.888554985s" podCreationTimestamp="2026-04-06 13:45:09 +0000 UTC" firstStartedPulling="2026-04-06 13:45:10.816951788 +0000 UTC m=+6489.804694654" lastFinishedPulling="2026-04-06 13:45:14.26277238 +0000 UTC m=+6493.250515246" observedRunningTime="2026-04-06 13:45:14.886447768 +0000 UTC m=+6493.874190634" watchObservedRunningTime="2026-04-06 13:45:14.888554985 +0000 UTC m=+6493.876297851" Apr 06 13:45:15 crc kubenswrapper[4790]: I0406 13:45:15.061843 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c66b653e-0e7d-44cb-82e1-1e2ee6a04b15/openstack-network-exporter/0.log" Apr 06 13:45:15 crc kubenswrapper[4790]: I0406 13:45:15.068057 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-nbzm5_7809a0fa-81df-4e08-8a8f-84e070582795/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:15 crc kubenswrapper[4790]: I0406 13:45:15.216080 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c66b653e-0e7d-44cb-82e1-1e2ee6a04b15/ovn-northd/0.log" Apr 06 13:45:15 crc kubenswrapper[4790]: I0406 13:45:15.256879 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6192fb44-8c5c-4bee-a190-cb14bce3fa94/openstack-network-exporter/0.log" Apr 06 13:45:15 crc kubenswrapper[4790]: I0406 13:45:15.323037 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6192fb44-8c5c-4bee-a190-cb14bce3fa94/ovsdbserver-nb/0.log" Apr 06 13:45:15 crc kubenswrapper[4790]: I0406 13:45:15.493139 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b7fb6737-ce1d-42b7-96e4-f1ea27883d05/openstack-network-exporter/0.log" Apr 06 13:45:15 crc kubenswrapper[4790]: I0406 13:45:15.561100 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_b7fb6737-ce1d-42b7-96e4-f1ea27883d05/ovsdbserver-sb/0.log" Apr 06 13:45:15 crc kubenswrapper[4790]: I0406 13:45:15.799078 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fc194d2b-4a4a-4745-8225-7d44efe056ef/init-config-reloader/0.log" Apr 06 13:45:16 crc kubenswrapper[4790]: I0406 13:45:16.011633 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7bffc46f4d-tqbdl_54d90c86-6e3b-49d3-a50f-eefe94ef8d6d/placement-api/0.log" Apr 06 13:45:16 crc kubenswrapper[4790]: I0406 13:45:16.032667 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7bffc46f4d-tqbdl_54d90c86-6e3b-49d3-a50f-eefe94ef8d6d/placement-log/0.log" Apr 06 13:45:16 crc kubenswrapper[4790]: I0406 13:45:16.055462 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fc194d2b-4a4a-4745-8225-7d44efe056ef/init-config-reloader/0.log" Apr 06 13:45:16 crc kubenswrapper[4790]: I0406 13:45:16.120770 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fc194d2b-4a4a-4745-8225-7d44efe056ef/config-reloader/0.log" Apr 06 13:45:16 crc kubenswrapper[4790]: I0406 13:45:16.231366 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fc194d2b-4a4a-4745-8225-7d44efe056ef/prometheus/0.log" Apr 06 13:45:16 crc kubenswrapper[4790]: I0406 13:45:16.341390 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ecd623d8-83f3-46b0-b566-b9801c44dfc8/setup-container/0.log" Apr 06 13:45:16 crc kubenswrapper[4790]: I0406 13:45:16.378356 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fc194d2b-4a4a-4745-8225-7d44efe056ef/thanos-sidecar/0.log" Apr 06 13:45:16 crc kubenswrapper[4790]: I0406 13:45:16.571642 4790 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ecd623d8-83f3-46b0-b566-b9801c44dfc8/setup-container/0.log" Apr 06 13:45:16 crc kubenswrapper[4790]: I0406 13:45:16.639616 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d5c35395-30bf-42d7-89e4-d306b4e4cc37/setup-container/0.log" Apr 06 13:45:16 crc kubenswrapper[4790]: I0406 13:45:16.660365 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ecd623d8-83f3-46b0-b566-b9801c44dfc8/rabbitmq/0.log" Apr 06 13:45:16 crc kubenswrapper[4790]: I0406 13:45:16.810868 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d5c35395-30bf-42d7-89e4-d306b4e4cc37/setup-container/0.log" Apr 06 13:45:16 crc kubenswrapper[4790]: I0406 13:45:16.872435 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d5c35395-30bf-42d7-89e4-d306b4e4cc37/rabbitmq/0.log" Apr 06 13:45:16 crc kubenswrapper[4790]: I0406 13:45:16.902433 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9wv6s_822b5a1d-5bf4-4e66-87fa-20a47f8cd280/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:17 crc kubenswrapper[4790]: I0406 13:45:17.094715 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-kwcth_c0c26165-8e10-4607-9cce-f36ec74bdc85/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:17 crc kubenswrapper[4790]: I0406 13:45:17.242485 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zv8g9_e671a5e8-99cd-4a96-a26f-93ff0eb8980c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:17 crc kubenswrapper[4790]: I0406 13:45:17.478618 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-pmnvp_2cea3a85-493c-4732-9954-6a690708c4d1/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:17 crc kubenswrapper[4790]: I0406 13:45:17.485145 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-2w9cm_28b0126b-8513-425d-8079-b68b9cb73bdc/ssh-known-hosts-edpm-deployment/0.log" Apr 06 13:45:17 crc kubenswrapper[4790]: I0406 13:45:17.728093 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7695db8cdc-vs5bx_7922b939-a1e8-4c85-8eb0-fe3529f6469c/proxy-server/0.log" Apr 06 13:45:17 crc kubenswrapper[4790]: I0406 13:45:17.898520 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7695db8cdc-vs5bx_7922b939-a1e8-4c85-8eb0-fe3529f6469c/proxy-httpd/0.log" Apr 06 13:45:17 crc kubenswrapper[4790]: I0406 13:45:17.914061 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-x6pwr_8288b902-f791-4dce-b1c0-2afa8796712b/swift-ring-rebalance/0.log" Apr 06 13:45:17 crc kubenswrapper[4790]: I0406 13:45:17.988573 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/account-auditor/0.log" Apr 06 13:45:18 crc kubenswrapper[4790]: I0406 13:45:18.115504 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/account-reaper/0.log" Apr 06 13:45:18 crc kubenswrapper[4790]: I0406 13:45:18.190053 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/account-replicator/0.log" Apr 06 13:45:18 crc kubenswrapper[4790]: I0406 13:45:18.231719 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/account-server/0.log" Apr 06 13:45:18 crc kubenswrapper[4790]: I0406 
13:45:18.244429 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/container-auditor/0.log" Apr 06 13:45:18 crc kubenswrapper[4790]: I0406 13:45:18.429636 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/container-replicator/0.log" Apr 06 13:45:18 crc kubenswrapper[4790]: I0406 13:45:18.441509 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/container-server/0.log" Apr 06 13:45:18 crc kubenswrapper[4790]: I0406 13:45:18.501013 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/object-auditor/0.log" Apr 06 13:45:18 crc kubenswrapper[4790]: I0406 13:45:18.526053 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/container-updater/0.log" Apr 06 13:45:18 crc kubenswrapper[4790]: I0406 13:45:18.673299 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/object-expirer/0.log" Apr 06 13:45:18 crc kubenswrapper[4790]: I0406 13:45:18.726376 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/object-replicator/0.log" Apr 06 13:45:18 crc kubenswrapper[4790]: I0406 13:45:18.759784 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/object-updater/0.log" Apr 06 13:45:18 crc kubenswrapper[4790]: I0406 13:45:18.855607 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/object-server/0.log" Apr 06 13:45:18 crc kubenswrapper[4790]: I0406 13:45:18.927381 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/rsync/0.log" Apr 06 13:45:18 crc kubenswrapper[4790]: I0406 13:45:18.953928 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3a39c210-842a-4286-8770-a84bbfec54a0/swift-recon-cron/0.log" Apr 06 13:45:19 crc kubenswrapper[4790]: I0406 13:45:19.239725 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_c58fe7b4-f5be-433f-8390-67dd8a62e81b/tempest-tests-tempest-tests-runner/0.log" Apr 06 13:45:19 crc kubenswrapper[4790]: I0406 13:45:19.418363 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:19 crc kubenswrapper[4790]: I0406 13:45:19.418412 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:19 crc kubenswrapper[4790]: I0406 13:45:19.443555 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_fe61003b-b427-4b3b-8af3-a4f9e0cf8605/test-operator-logs-container/0.log" Apr 06 13:45:19 crc kubenswrapper[4790]: I0406 13:45:19.479644 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:19 crc kubenswrapper[4790]: I0406 13:45:19.545614 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-pxptm_8e75f387-926a-41f4-8367-8c68d2637c04/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:19 crc kubenswrapper[4790]: I0406 13:45:19.774697 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-clrlg_b1da0dd0-f14d-4b72-8308-a256f237732f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Apr 06 13:45:19 crc kubenswrapper[4790]: I0406 
13:45:19.983975 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:20 crc kubenswrapper[4790]: I0406 13:45:20.032792 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dx6b5"] Apr 06 13:45:20 crc kubenswrapper[4790]: I0406 13:45:20.543619 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_bed4cbed-be09-43cd-938a-e4a1fe5fe399/watcher-applier/0.log" Apr 06 13:45:20 crc kubenswrapper[4790]: I0406 13:45:20.632628 4790 scope.go:117] "RemoveContainer" containerID="2bac2d1c345e91a26f90680ea25b8265f8c958cad66911048b3ff183ecf648fe" Apr 06 13:45:21 crc kubenswrapper[4790]: I0406 13:45:21.348743 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_b614b0dd-0285-4907-8e74-051e3ef0b3a1/watcher-api-log/0.log" Apr 06 13:45:21 crc kubenswrapper[4790]: I0406 13:45:21.930368 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dx6b5" podUID="cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" containerName="registry-server" containerID="cri-o://141bff306471ccbfb45f5ff753ec8b27cf5d0c6a7034024947d9c3b81d944890" gracePeriod=2 Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.404963 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.590672 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kxhm\" (UniqueName: \"kubernetes.io/projected/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-kube-api-access-8kxhm\") pod \"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\" (UID: \"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\") " Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.590719 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-catalog-content\") pod \"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\" (UID: \"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\") " Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.590977 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-utilities\") pod \"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\" (UID: \"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17\") " Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.591970 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-utilities" (OuterVolumeSpecName: "utilities") pod "cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" (UID: "cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.610018 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-kube-api-access-8kxhm" (OuterVolumeSpecName: "kube-api-access-8kxhm") pod "cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" (UID: "cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17"). InnerVolumeSpecName "kube-api-access-8kxhm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.656607 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" (UID: "cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.692974 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.693008 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kxhm\" (UniqueName: \"kubernetes.io/projected/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-kube-api-access-8kxhm\") on node \"crc\" DevicePath \"\"" Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.693018 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.940294 4790 generic.go:334] "Generic (PLEG): container finished" podID="cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" containerID="141bff306471ccbfb45f5ff753ec8b27cf5d0c6a7034024947d9c3b81d944890" exitCode=0 Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.940373 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx6b5" event={"ID":"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17","Type":"ContainerDied","Data":"141bff306471ccbfb45f5ff753ec8b27cf5d0c6a7034024947d9c3b81d944890"} Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.940447 4790 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-dx6b5" event={"ID":"cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17","Type":"ContainerDied","Data":"be884d95af75fcb1b53863efa03f8d5317d6d271a8d2bc650b9ebb8dcaac3c7a"} Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.940470 4790 scope.go:117] "RemoveContainer" containerID="141bff306471ccbfb45f5ff753ec8b27cf5d0c6a7034024947d9c3b81d944890" Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.940513 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dx6b5" Apr 06 13:45:22 crc kubenswrapper[4790]: I0406 13:45:22.971884 4790 scope.go:117] "RemoveContainer" containerID="ec1dc2ec2f1ef5e73126241fb8400ffec4d3c72c751a7e597868ac67737a7331" Apr 06 13:45:23 crc kubenswrapper[4790]: I0406 13:45:23.033997 4790 scope.go:117] "RemoveContainer" containerID="66c5c806070fd20668e3c43122b5453673df18d06f106cb7f7b8d53e53088938" Apr 06 13:45:23 crc kubenswrapper[4790]: I0406 13:45:23.043431 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dx6b5"] Apr 06 13:45:23 crc kubenswrapper[4790]: I0406 13:45:23.056328 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dx6b5"] Apr 06 13:45:23 crc kubenswrapper[4790]: I0406 13:45:23.121165 4790 scope.go:117] "RemoveContainer" containerID="141bff306471ccbfb45f5ff753ec8b27cf5d0c6a7034024947d9c3b81d944890" Apr 06 13:45:23 crc kubenswrapper[4790]: E0406 13:45:23.128007 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"141bff306471ccbfb45f5ff753ec8b27cf5d0c6a7034024947d9c3b81d944890\": container with ID starting with 141bff306471ccbfb45f5ff753ec8b27cf5d0c6a7034024947d9c3b81d944890 not found: ID does not exist" containerID="141bff306471ccbfb45f5ff753ec8b27cf5d0c6a7034024947d9c3b81d944890" Apr 06 13:45:23 crc kubenswrapper[4790]: I0406 
13:45:23.128068 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"141bff306471ccbfb45f5ff753ec8b27cf5d0c6a7034024947d9c3b81d944890"} err="failed to get container status \"141bff306471ccbfb45f5ff753ec8b27cf5d0c6a7034024947d9c3b81d944890\": rpc error: code = NotFound desc = could not find container \"141bff306471ccbfb45f5ff753ec8b27cf5d0c6a7034024947d9c3b81d944890\": container with ID starting with 141bff306471ccbfb45f5ff753ec8b27cf5d0c6a7034024947d9c3b81d944890 not found: ID does not exist" Apr 06 13:45:23 crc kubenswrapper[4790]: I0406 13:45:23.128120 4790 scope.go:117] "RemoveContainer" containerID="ec1dc2ec2f1ef5e73126241fb8400ffec4d3c72c751a7e597868ac67737a7331" Apr 06 13:45:23 crc kubenswrapper[4790]: E0406 13:45:23.135043 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec1dc2ec2f1ef5e73126241fb8400ffec4d3c72c751a7e597868ac67737a7331\": container with ID starting with ec1dc2ec2f1ef5e73126241fb8400ffec4d3c72c751a7e597868ac67737a7331 not found: ID does not exist" containerID="ec1dc2ec2f1ef5e73126241fb8400ffec4d3c72c751a7e597868ac67737a7331" Apr 06 13:45:23 crc kubenswrapper[4790]: I0406 13:45:23.135090 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec1dc2ec2f1ef5e73126241fb8400ffec4d3c72c751a7e597868ac67737a7331"} err="failed to get container status \"ec1dc2ec2f1ef5e73126241fb8400ffec4d3c72c751a7e597868ac67737a7331\": rpc error: code = NotFound desc = could not find container \"ec1dc2ec2f1ef5e73126241fb8400ffec4d3c72c751a7e597868ac67737a7331\": container with ID starting with ec1dc2ec2f1ef5e73126241fb8400ffec4d3c72c751a7e597868ac67737a7331 not found: ID does not exist" Apr 06 13:45:23 crc kubenswrapper[4790]: I0406 13:45:23.135121 4790 scope.go:117] "RemoveContainer" containerID="66c5c806070fd20668e3c43122b5453673df18d06f106cb7f7b8d53e53088938" Apr 06 13:45:23 crc 
kubenswrapper[4790]: E0406 13:45:23.135484 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c5c806070fd20668e3c43122b5453673df18d06f106cb7f7b8d53e53088938\": container with ID starting with 66c5c806070fd20668e3c43122b5453673df18d06f106cb7f7b8d53e53088938 not found: ID does not exist" containerID="66c5c806070fd20668e3c43122b5453673df18d06f106cb7f7b8d53e53088938" Apr 06 13:45:23 crc kubenswrapper[4790]: I0406 13:45:23.135504 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c5c806070fd20668e3c43122b5453673df18d06f106cb7f7b8d53e53088938"} err="failed to get container status \"66c5c806070fd20668e3c43122b5453673df18d06f106cb7f7b8d53e53088938\": rpc error: code = NotFound desc = could not find container \"66c5c806070fd20668e3c43122b5453673df18d06f106cb7f7b8d53e53088938\": container with ID starting with 66c5c806070fd20668e3c43122b5453673df18d06f106cb7f7b8d53e53088938 not found: ID does not exist" Apr 06 13:45:23 crc kubenswrapper[4790]: I0406 13:45:23.688818 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" path="/var/lib/kubelet/pods/cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17/volumes" Apr 06 13:45:24 crc kubenswrapper[4790]: I0406 13:45:24.822857 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_6f7f1a8b-0f45-447c-8cef-e0701c1ce1e4/watcher-decision-engine/0.log" Apr 06 13:45:26 crc kubenswrapper[4790]: I0406 13:45:26.154855 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_b614b0dd-0285-4907-8e74-051e3ef0b3a1/watcher-api/0.log" Apr 06 13:45:37 crc kubenswrapper[4790]: I0406 13:45:37.012260 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ace98862-e7bc-4eb8-93ae-b38dcbd98a55/memcached/0.log" Apr 06 13:45:39 crc kubenswrapper[4790]: I0406 13:45:39.753297 4790 
patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:45:39 crc kubenswrapper[4790]: I0406 13:45:39.753582 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:45:51 crc kubenswrapper[4790]: I0406 13:45:51.451197 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5bcc684c66-wv5tw_95b18d6e-ec5a-45e7-89c0-0f4618e4eb97/manager/0.log" Apr 06 13:45:51 crc kubenswrapper[4790]: I0406 13:45:51.524149 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm_7a436f45-6a0e-4102-8377-fa7299c0b3e8/util/0.log" Apr 06 13:45:51 crc kubenswrapper[4790]: I0406 13:45:51.701128 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm_7a436f45-6a0e-4102-8377-fa7299c0b3e8/pull/0.log" Apr 06 13:45:51 crc kubenswrapper[4790]: I0406 13:45:51.702856 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm_7a436f45-6a0e-4102-8377-fa7299c0b3e8/util/0.log" Apr 06 13:45:51 crc kubenswrapper[4790]: I0406 13:45:51.744174 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm_7a436f45-6a0e-4102-8377-fa7299c0b3e8/pull/0.log" Apr 06 13:45:51 crc 
kubenswrapper[4790]: I0406 13:45:51.879254 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm_7a436f45-6a0e-4102-8377-fa7299c0b3e8/pull/0.log" Apr 06 13:45:51 crc kubenswrapper[4790]: I0406 13:45:51.887375 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm_7a436f45-6a0e-4102-8377-fa7299c0b3e8/extract/0.log" Apr 06 13:45:51 crc kubenswrapper[4790]: I0406 13:45:51.895438 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ce6f811e2377ca8c98122c8d71d53f0f786cdc1e7ce1a323c442975008pxdtm_7a436f45-6a0e-4102-8377-fa7299c0b3e8/util/0.log" Apr 06 13:45:52 crc kubenswrapper[4790]: I0406 13:45:52.063919 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-78674bbc6b-48jqq_5878c1d4-78cf-447f-b442-f7a9aa1aee99/manager/0.log" Apr 06 13:45:52 crc kubenswrapper[4790]: I0406 13:45:52.091907 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58689c6fff-tm8b2_fb176575-b24c-4da4-a0f7-c5117d2c2ed7/manager/0.log" Apr 06 13:45:52 crc kubenswrapper[4790]: I0406 13:45:52.278511 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8566787df9-l8dhs_16027ea9-802c-43ef-80ac-e2f66a2cc36b/manager/0.log" Apr 06 13:45:52 crc kubenswrapper[4790]: I0406 13:45:52.306508 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b5d8f8697-hwvv8_a8b6d51d-4671-471f-94c6-b0a4b2c4a27d/manager/0.log" Apr 06 13:45:52 crc kubenswrapper[4790]: I0406 13:45:52.458075 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6c5d8948dc-288vm_c2958357-3518-4e74-8326-cfe8cf23334f/manager/0.log" Apr 06 13:45:52 crc kubenswrapper[4790]: I0406 13:45:52.674908 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6b9c989bb6-z8t6s_75ce52d5-3320-40e1-8d64-42d12e2fa4c8/manager/0.log" Apr 06 13:45:52 crc kubenswrapper[4790]: I0406 13:45:52.853215 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-dbf8bb784-6vkf8_30132a58-4c7d-4761-b73b-6d0ee27ea74e/manager/0.log" Apr 06 13:45:52 crc kubenswrapper[4790]: I0406 13:45:52.897581 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-88ccbfc66-9pp57_9430e8f1-17ec-4eff-8d9c-d54553956f8d/manager/0.log" Apr 06 13:45:52 crc kubenswrapper[4790]: I0406 13:45:52.953734 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5fddf8d98f-qrxjw_ef128b3d-ea70-46cc-8928-0a557b6fbf5d/manager/0.log" Apr 06 13:45:53 crc kubenswrapper[4790]: I0406 13:45:53.093095 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-765cb856bd-7vfjz_b3d53b02-ceb5-46c3-820d-8a3b5c65f9e8/manager/0.log" Apr 06 13:45:53 crc kubenswrapper[4790]: I0406 13:45:53.198056 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-9bdbb8fd8-r64xk_82a102f6-1fb6-4f30-8f0e-d3c4352b187e/manager/0.log" Apr 06 13:45:53 crc kubenswrapper[4790]: I0406 13:45:53.343491 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64744474b-lxkq7_2f59b367-b3b9-467b-b190-5492ec84d98c/manager/0.log" Apr 06 13:45:53 crc kubenswrapper[4790]: I0406 13:45:53.403653 4790 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7594f57946-dlxrd_6d9a1f3c-f00e-498c-ae3f-3af6c407d051/manager/0.log" Apr 06 13:45:53 crc kubenswrapper[4790]: I0406 13:45:53.523966 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6bb9db6d9-xbwmw_f9c22535-24c4-416f-98ef-fcd0299921c4/manager/0.log" Apr 06 13:45:53 crc kubenswrapper[4790]: I0406 13:45:53.712963 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-95748b946-k6fbn_a7adf6fa-28d9-4655-b720-606ab4b91117/operator/0.log" Apr 06 13:45:53 crc kubenswrapper[4790]: I0406 13:45:53.933148 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-s94ds_cd1445b3-61f1-4be0-aaf8-bee9a755cb7e/registry-server/0.log" Apr 06 13:45:54 crc kubenswrapper[4790]: I0406 13:45:54.077225 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-565fbbfdc9-msh7n_30dc85a8-d293-4324-b4af-f3b7731a5060/manager/0.log" Apr 06 13:45:54 crc kubenswrapper[4790]: I0406 13:45:54.261994 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-559d8fdb6b-9t6v2_96d39030-fa50-4568-a068-af079a592dc0/manager/0.log" Apr 06 13:45:54 crc kubenswrapper[4790]: I0406 13:45:54.469420 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-d46gz_bb09d720-75af-43a7-90dd-e497d2933183/operator/0.log" Apr 06 13:45:54 crc kubenswrapper[4790]: I0406 13:45:54.719448 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5c4dd9cdf6-kjg4j_c9e8a19f-ad0a-45ed-a45f-240d2e5d187b/manager/0.log" Apr 06 13:45:54 crc kubenswrapper[4790]: I0406 13:45:54.935161 4790 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56bf57d759-dq7bm_d4596119-7d30-4307-8815-4355fc5ee6eb/manager/0.log" Apr 06 13:45:55 crc kubenswrapper[4790]: I0406 13:45:55.085074 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-569745d4d8-ddglf_7a8fad23-b18e-4933-af57-3e06aee00225/manager/0.log" Apr 06 13:45:55 crc kubenswrapper[4790]: I0406 13:45:55.156922 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5d8c8cd5bb-5w2d2_61665396-3382-4fa4-8d6a-706f47b2c5b0/manager/0.log" Apr 06 13:45:55 crc kubenswrapper[4790]: I0406 13:45:55.314775 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75df5978c-vvf85_56eefd7b-c275-40a3-8772-03ffc350736e/manager/0.log" Apr 06 13:46:00 crc kubenswrapper[4790]: I0406 13:46:00.180264 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591386-hnf6h"] Apr 06 13:46:00 crc kubenswrapper[4790]: E0406 13:46:00.184680 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" containerName="extract-utilities" Apr 06 13:46:00 crc kubenswrapper[4790]: I0406 13:46:00.184715 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" containerName="extract-utilities" Apr 06 13:46:00 crc kubenswrapper[4790]: E0406 13:46:00.184724 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" containerName="registry-server" Apr 06 13:46:00 crc kubenswrapper[4790]: I0406 13:46:00.184733 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" containerName="registry-server" Apr 06 13:46:00 crc kubenswrapper[4790]: E0406 13:46:00.184769 4790 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" containerName="extract-content" Apr 06 13:46:00 crc kubenswrapper[4790]: I0406 13:46:00.184775 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" containerName="extract-content" Apr 06 13:46:00 crc kubenswrapper[4790]: I0406 13:46:00.185087 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd40022d-6edc-4f2a-a9f7-44cf5a9d2d17" containerName="registry-server" Apr 06 13:46:00 crc kubenswrapper[4790]: I0406 13:46:00.186057 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591386-hnf6h" Apr 06 13:46:00 crc kubenswrapper[4790]: I0406 13:46:00.188113 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:46:00 crc kubenswrapper[4790]: I0406 13:46:00.189231 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:46:00 crc kubenswrapper[4790]: I0406 13:46:00.190107 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:46:00 crc kubenswrapper[4790]: I0406 13:46:00.196405 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591386-hnf6h"] Apr 06 13:46:00 crc kubenswrapper[4790]: I0406 13:46:00.365046 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckd2p\" (UniqueName: \"kubernetes.io/projected/f52809e0-c4d5-4f5a-969c-57f8ca1e5d98-kube-api-access-ckd2p\") pod \"auto-csr-approver-29591386-hnf6h\" (UID: \"f52809e0-c4d5-4f5a-969c-57f8ca1e5d98\") " pod="openshift-infra/auto-csr-approver-29591386-hnf6h" Apr 06 13:46:00 crc kubenswrapper[4790]: I0406 13:46:00.467126 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckd2p\" 
(UniqueName: \"kubernetes.io/projected/f52809e0-c4d5-4f5a-969c-57f8ca1e5d98-kube-api-access-ckd2p\") pod \"auto-csr-approver-29591386-hnf6h\" (UID: \"f52809e0-c4d5-4f5a-969c-57f8ca1e5d98\") " pod="openshift-infra/auto-csr-approver-29591386-hnf6h" Apr 06 13:46:00 crc kubenswrapper[4790]: I0406 13:46:00.487465 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckd2p\" (UniqueName: \"kubernetes.io/projected/f52809e0-c4d5-4f5a-969c-57f8ca1e5d98-kube-api-access-ckd2p\") pod \"auto-csr-approver-29591386-hnf6h\" (UID: \"f52809e0-c4d5-4f5a-969c-57f8ca1e5d98\") " pod="openshift-infra/auto-csr-approver-29591386-hnf6h" Apr 06 13:46:00 crc kubenswrapper[4790]: I0406 13:46:00.506118 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591386-hnf6h" Apr 06 13:46:01 crc kubenswrapper[4790]: I0406 13:46:01.011143 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591386-hnf6h"] Apr 06 13:46:01 crc kubenswrapper[4790]: I0406 13:46:01.298067 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591386-hnf6h" event={"ID":"f52809e0-c4d5-4f5a-969c-57f8ca1e5d98","Type":"ContainerStarted","Data":"07692cc8e4263277c435eb993f50ede1cfc1d5b9719aaa132d92542837bc952a"} Apr 06 13:46:02 crc kubenswrapper[4790]: I0406 13:46:02.310260 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591386-hnf6h" event={"ID":"f52809e0-c4d5-4f5a-969c-57f8ca1e5d98","Type":"ContainerStarted","Data":"7aa14fed21e429a86acd22cfecb5bd1146eb1b8035329790e0befa9bc5cca839"} Apr 06 13:46:02 crc kubenswrapper[4790]: I0406 13:46:02.326591 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591386-hnf6h" podStartSLOduration=1.5025649589999999 podStartE2EDuration="2.326568941s" podCreationTimestamp="2026-04-06 13:46:00 +0000 UTC" 
firstStartedPulling="2026-04-06 13:46:01.000224403 +0000 UTC m=+6539.987967269" lastFinishedPulling="2026-04-06 13:46:01.824228385 +0000 UTC m=+6540.811971251" observedRunningTime="2026-04-06 13:46:02.322596325 +0000 UTC m=+6541.310339191" watchObservedRunningTime="2026-04-06 13:46:02.326568941 +0000 UTC m=+6541.314311807" Apr 06 13:46:03 crc kubenswrapper[4790]: I0406 13:46:03.320230 4790 generic.go:334] "Generic (PLEG): container finished" podID="f52809e0-c4d5-4f5a-969c-57f8ca1e5d98" containerID="7aa14fed21e429a86acd22cfecb5bd1146eb1b8035329790e0befa9bc5cca839" exitCode=0 Apr 06 13:46:03 crc kubenswrapper[4790]: I0406 13:46:03.320285 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591386-hnf6h" event={"ID":"f52809e0-c4d5-4f5a-969c-57f8ca1e5d98","Type":"ContainerDied","Data":"7aa14fed21e429a86acd22cfecb5bd1146eb1b8035329790e0befa9bc5cca839"} Apr 06 13:46:04 crc kubenswrapper[4790]: I0406 13:46:04.710295 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591386-hnf6h" Apr 06 13:46:04 crc kubenswrapper[4790]: I0406 13:46:04.819675 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591380-wpppm"] Apr 06 13:46:04 crc kubenswrapper[4790]: I0406 13:46:04.833198 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591380-wpppm"] Apr 06 13:46:04 crc kubenswrapper[4790]: I0406 13:46:04.871709 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckd2p\" (UniqueName: \"kubernetes.io/projected/f52809e0-c4d5-4f5a-969c-57f8ca1e5d98-kube-api-access-ckd2p\") pod \"f52809e0-c4d5-4f5a-969c-57f8ca1e5d98\" (UID: \"f52809e0-c4d5-4f5a-969c-57f8ca1e5d98\") " Apr 06 13:46:04 crc kubenswrapper[4790]: I0406 13:46:04.887627 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52809e0-c4d5-4f5a-969c-57f8ca1e5d98-kube-api-access-ckd2p" (OuterVolumeSpecName: "kube-api-access-ckd2p") pod "f52809e0-c4d5-4f5a-969c-57f8ca1e5d98" (UID: "f52809e0-c4d5-4f5a-969c-57f8ca1e5d98"). InnerVolumeSpecName "kube-api-access-ckd2p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:46:04 crc kubenswrapper[4790]: I0406 13:46:04.974325 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckd2p\" (UniqueName: \"kubernetes.io/projected/f52809e0-c4d5-4f5a-969c-57f8ca1e5d98-kube-api-access-ckd2p\") on node \"crc\" DevicePath \"\"" Apr 06 13:46:05 crc kubenswrapper[4790]: I0406 13:46:05.347805 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591386-hnf6h" event={"ID":"f52809e0-c4d5-4f5a-969c-57f8ca1e5d98","Type":"ContainerDied","Data":"07692cc8e4263277c435eb993f50ede1cfc1d5b9719aaa132d92542837bc952a"} Apr 06 13:46:05 crc kubenswrapper[4790]: I0406 13:46:05.348144 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07692cc8e4263277c435eb993f50ede1cfc1d5b9719aaa132d92542837bc952a" Apr 06 13:46:05 crc kubenswrapper[4790]: I0406 13:46:05.348218 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591386-hnf6h" Apr 06 13:46:05 crc kubenswrapper[4790]: I0406 13:46:05.687381 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1571e9f1-49ec-4438-94a5-ffb24d70f4c1" path="/var/lib/kubelet/pods/1571e9f1-49ec-4438-94a5-ffb24d70f4c1/volumes" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.043780 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8gf9r"] Apr 06 13:46:07 crc kubenswrapper[4790]: E0406 13:46:07.044564 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52809e0-c4d5-4f5a-969c-57f8ca1e5d98" containerName="oc" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.044580 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52809e0-c4d5-4f5a-969c-57f8ca1e5d98" containerName="oc" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.044866 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f52809e0-c4d5-4f5a-969c-57f8ca1e5d98" containerName="oc" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.046687 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8gf9r" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.056735 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8gf9r"] Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.116248 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8l4k\" (UniqueName: \"kubernetes.io/projected/b5057f4f-f92d-412a-9889-4389b5be4224-kube-api-access-b8l4k\") pod \"community-operators-8gf9r\" (UID: \"b5057f4f-f92d-412a-9889-4389b5be4224\") " pod="openshift-marketplace/community-operators-8gf9r" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.116676 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5057f4f-f92d-412a-9889-4389b5be4224-utilities\") pod \"community-operators-8gf9r\" (UID: \"b5057f4f-f92d-412a-9889-4389b5be4224\") " pod="openshift-marketplace/community-operators-8gf9r" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.116794 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5057f4f-f92d-412a-9889-4389b5be4224-catalog-content\") pod \"community-operators-8gf9r\" (UID: \"b5057f4f-f92d-412a-9889-4389b5be4224\") " pod="openshift-marketplace/community-operators-8gf9r" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.218902 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5057f4f-f92d-412a-9889-4389b5be4224-utilities\") pod \"community-operators-8gf9r\" (UID: 
\"b5057f4f-f92d-412a-9889-4389b5be4224\") " pod="openshift-marketplace/community-operators-8gf9r" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.219221 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5057f4f-f92d-412a-9889-4389b5be4224-catalog-content\") pod \"community-operators-8gf9r\" (UID: \"b5057f4f-f92d-412a-9889-4389b5be4224\") " pod="openshift-marketplace/community-operators-8gf9r" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.219272 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8l4k\" (UniqueName: \"kubernetes.io/projected/b5057f4f-f92d-412a-9889-4389b5be4224-kube-api-access-b8l4k\") pod \"community-operators-8gf9r\" (UID: \"b5057f4f-f92d-412a-9889-4389b5be4224\") " pod="openshift-marketplace/community-operators-8gf9r" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.219611 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5057f4f-f92d-412a-9889-4389b5be4224-utilities\") pod \"community-operators-8gf9r\" (UID: \"b5057f4f-f92d-412a-9889-4389b5be4224\") " pod="openshift-marketplace/community-operators-8gf9r" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.219635 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5057f4f-f92d-412a-9889-4389b5be4224-catalog-content\") pod \"community-operators-8gf9r\" (UID: \"b5057f4f-f92d-412a-9889-4389b5be4224\") " pod="openshift-marketplace/community-operators-8gf9r" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.248317 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8l4k\" (UniqueName: \"kubernetes.io/projected/b5057f4f-f92d-412a-9889-4389b5be4224-kube-api-access-b8l4k\") pod \"community-operators-8gf9r\" (UID: 
\"b5057f4f-f92d-412a-9889-4389b5be4224\") " pod="openshift-marketplace/community-operators-8gf9r" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.370536 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8gf9r" Apr 06 13:46:07 crc kubenswrapper[4790]: I0406 13:46:07.904470 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8gf9r"] Apr 06 13:46:08 crc kubenswrapper[4790]: I0406 13:46:08.377583 4790 generic.go:334] "Generic (PLEG): container finished" podID="b5057f4f-f92d-412a-9889-4389b5be4224" containerID="a933b4f9ac8336d3066731b293b03e44244e62a47a697b8f5c95b27bd457674f" exitCode=0 Apr 06 13:46:08 crc kubenswrapper[4790]: I0406 13:46:08.377852 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gf9r" event={"ID":"b5057f4f-f92d-412a-9889-4389b5be4224","Type":"ContainerDied","Data":"a933b4f9ac8336d3066731b293b03e44244e62a47a697b8f5c95b27bd457674f"} Apr 06 13:46:08 crc kubenswrapper[4790]: I0406 13:46:08.377878 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gf9r" event={"ID":"b5057f4f-f92d-412a-9889-4389b5be4224","Type":"ContainerStarted","Data":"80e94371bd9750d546c656cc8600df97c2d8ac0698cbda359b0bf4c83a0edd61"} Apr 06 13:46:09 crc kubenswrapper[4790]: I0406 13:46:09.641500 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l29bv"] Apr 06 13:46:09 crc kubenswrapper[4790]: I0406 13:46:09.643942 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l29bv" Apr 06 13:46:09 crc kubenswrapper[4790]: I0406 13:46:09.665991 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l29bv"] Apr 06 13:46:09 crc kubenswrapper[4790]: I0406 13:46:09.675249 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qkml\" (UniqueName: \"kubernetes.io/projected/e97760ac-4a0d-4b69-aa01-ae13edbd3480-kube-api-access-9qkml\") pod \"redhat-marketplace-l29bv\" (UID: \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\") " pod="openshift-marketplace/redhat-marketplace-l29bv" Apr 06 13:46:09 crc kubenswrapper[4790]: I0406 13:46:09.675342 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e97760ac-4a0d-4b69-aa01-ae13edbd3480-utilities\") pod \"redhat-marketplace-l29bv\" (UID: \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\") " pod="openshift-marketplace/redhat-marketplace-l29bv" Apr 06 13:46:09 crc kubenswrapper[4790]: I0406 13:46:09.675389 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e97760ac-4a0d-4b69-aa01-ae13edbd3480-catalog-content\") pod \"redhat-marketplace-l29bv\" (UID: \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\") " pod="openshift-marketplace/redhat-marketplace-l29bv" Apr 06 13:46:09 crc kubenswrapper[4790]: I0406 13:46:09.753318 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:46:09 crc kubenswrapper[4790]: I0406 13:46:09.753387 4790 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:46:09 crc kubenswrapper[4790]: I0406 13:46:09.776617 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qkml\" (UniqueName: \"kubernetes.io/projected/e97760ac-4a0d-4b69-aa01-ae13edbd3480-kube-api-access-9qkml\") pod \"redhat-marketplace-l29bv\" (UID: \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\") " pod="openshift-marketplace/redhat-marketplace-l29bv" Apr 06 13:46:09 crc kubenswrapper[4790]: I0406 13:46:09.776707 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e97760ac-4a0d-4b69-aa01-ae13edbd3480-utilities\") pod \"redhat-marketplace-l29bv\" (UID: \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\") " pod="openshift-marketplace/redhat-marketplace-l29bv" Apr 06 13:46:09 crc kubenswrapper[4790]: I0406 13:46:09.776748 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e97760ac-4a0d-4b69-aa01-ae13edbd3480-catalog-content\") pod \"redhat-marketplace-l29bv\" (UID: \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\") " pod="openshift-marketplace/redhat-marketplace-l29bv" Apr 06 13:46:09 crc kubenswrapper[4790]: I0406 13:46:09.777631 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e97760ac-4a0d-4b69-aa01-ae13edbd3480-catalog-content\") pod \"redhat-marketplace-l29bv\" (UID: \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\") " pod="openshift-marketplace/redhat-marketplace-l29bv" Apr 06 13:46:09 crc kubenswrapper[4790]: I0406 13:46:09.779097 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/e97760ac-4a0d-4b69-aa01-ae13edbd3480-utilities\") pod \"redhat-marketplace-l29bv\" (UID: \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\") " pod="openshift-marketplace/redhat-marketplace-l29bv" Apr 06 13:46:09 crc kubenswrapper[4790]: I0406 13:46:09.796246 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qkml\" (UniqueName: \"kubernetes.io/projected/e97760ac-4a0d-4b69-aa01-ae13edbd3480-kube-api-access-9qkml\") pod \"redhat-marketplace-l29bv\" (UID: \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\") " pod="openshift-marketplace/redhat-marketplace-l29bv" Apr 06 13:46:10 crc kubenswrapper[4790]: I0406 13:46:10.008687 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l29bv" Apr 06 13:46:10 crc kubenswrapper[4790]: I0406 13:46:10.317079 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 13:46:10 crc kubenswrapper[4790]: I0406 13:46:10.317781 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"44b6331698ad33af7a840e787e9e1d956dcb0f0acb6b4f4ab0ec2af758b872a1"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 13:46:10 crc kubenswrapper[4790]: I0406 13:46:10.317850 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" containerID="cri-o://44b6331698ad33af7a840e787e9e1d956dcb0f0acb6b4f4ab0ec2af758b872a1" gracePeriod=600 Apr 06 13:46:10 crc kubenswrapper[4790]: I0406 13:46:10.326178 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-8gf9r" event={"ID":"b5057f4f-f92d-412a-9889-4389b5be4224","Type":"ContainerStarted","Data":"883f7f9aa702375d7fafa24a238ce374d6c95b67404d27b0be0c576c39eb2451"} Apr 06 13:46:10 crc kubenswrapper[4790]: I0406 13:46:10.859453 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l29bv"] Apr 06 13:46:11 crc kubenswrapper[4790]: I0406 13:46:11.353820 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="44b6331698ad33af7a840e787e9e1d956dcb0f0acb6b4f4ab0ec2af758b872a1" exitCode=0 Apr 06 13:46:11 crc kubenswrapper[4790]: I0406 13:46:11.354427 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"44b6331698ad33af7a840e787e9e1d956dcb0f0acb6b4f4ab0ec2af758b872a1"} Apr 06 13:46:11 crc kubenswrapper[4790]: I0406 13:46:11.354459 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe"} Apr 06 13:46:11 crc kubenswrapper[4790]: I0406 13:46:11.354480 4790 scope.go:117] "RemoveContainer" containerID="692153cd87f55ae50bf86a4b572c770ef5fd45cbd7014fcd9c92ff45e88aa149" Apr 06 13:46:11 crc kubenswrapper[4790]: I0406 13:46:11.366574 4790 generic.go:334] "Generic (PLEG): container finished" podID="e97760ac-4a0d-4b69-aa01-ae13edbd3480" containerID="a0c761b802ee06c45570ef5aca91d1ff082b230c86b9c9be9d950213b2639267" exitCode=0 Apr 06 13:46:11 crc kubenswrapper[4790]: I0406 13:46:11.368082 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l29bv" 
event={"ID":"e97760ac-4a0d-4b69-aa01-ae13edbd3480","Type":"ContainerDied","Data":"a0c761b802ee06c45570ef5aca91d1ff082b230c86b9c9be9d950213b2639267"}
Apr 06 13:46:11 crc kubenswrapper[4790]: I0406 13:46:11.368109 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l29bv" event={"ID":"e97760ac-4a0d-4b69-aa01-ae13edbd3480","Type":"ContainerStarted","Data":"d334da6b425ac0d72a894bcdf7cd7e3e2a7931c1a59355d8babb31522be2a4c5"}
Apr 06 13:46:12 crc kubenswrapper[4790]: I0406 13:46:12.378440 4790 generic.go:334] "Generic (PLEG): container finished" podID="b5057f4f-f92d-412a-9889-4389b5be4224" containerID="883f7f9aa702375d7fafa24a238ce374d6c95b67404d27b0be0c576c39eb2451" exitCode=0
Apr 06 13:46:12 crc kubenswrapper[4790]: I0406 13:46:12.378482 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gf9r" event={"ID":"b5057f4f-f92d-412a-9889-4389b5be4224","Type":"ContainerDied","Data":"883f7f9aa702375d7fafa24a238ce374d6c95b67404d27b0be0c576c39eb2451"}
Apr 06 13:46:12 crc kubenswrapper[4790]: I0406 13:46:12.383204 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l29bv" event={"ID":"e97760ac-4a0d-4b69-aa01-ae13edbd3480","Type":"ContainerStarted","Data":"3afc503f90ebd3a62f5cdff8d4e9d3755267be1b2440146d3b4cb6670b3d40ef"}
Apr 06 13:46:13 crc kubenswrapper[4790]: I0406 13:46:13.407647 4790 generic.go:334] "Generic (PLEG): container finished" podID="e97760ac-4a0d-4b69-aa01-ae13edbd3480" containerID="3afc503f90ebd3a62f5cdff8d4e9d3755267be1b2440146d3b4cb6670b3d40ef" exitCode=0
Apr 06 13:46:13 crc kubenswrapper[4790]: I0406 13:46:13.407708 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l29bv" event={"ID":"e97760ac-4a0d-4b69-aa01-ae13edbd3480","Type":"ContainerDied","Data":"3afc503f90ebd3a62f5cdff8d4e9d3755267be1b2440146d3b4cb6670b3d40ef"}
Apr 06 13:46:13 crc kubenswrapper[4790]: I0406 13:46:13.411011 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gf9r" event={"ID":"b5057f4f-f92d-412a-9889-4389b5be4224","Type":"ContainerStarted","Data":"d53d17882344fc60b3ff890a20b701c411149873e74ebd4ca2286882950da79d"}
Apr 06 13:46:13 crc kubenswrapper[4790]: I0406 13:46:13.451626 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8gf9r" podStartSLOduration=2.064364074 podStartE2EDuration="6.451604077s" podCreationTimestamp="2026-04-06 13:46:07 +0000 UTC" firstStartedPulling="2026-04-06 13:46:08.379722576 +0000 UTC m=+6547.367465442" lastFinishedPulling="2026-04-06 13:46:12.766962579 +0000 UTC m=+6551.754705445" observedRunningTime="2026-04-06 13:46:13.442869523 +0000 UTC m=+6552.430612399" watchObservedRunningTime="2026-04-06 13:46:13.451604077 +0000 UTC m=+6552.439346943"
Apr 06 13:46:14 crc kubenswrapper[4790]: I0406 13:46:14.430908 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l29bv" event={"ID":"e97760ac-4a0d-4b69-aa01-ae13edbd3480","Type":"ContainerStarted","Data":"26da6788afa1a3b755db0d4c381236bdac12ac005862375044858d46003eab0d"}
Apr 06 13:46:14 crc kubenswrapper[4790]: I0406 13:46:14.457066 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l29bv" podStartSLOduration=2.9819727 podStartE2EDuration="5.457048568s" podCreationTimestamp="2026-04-06 13:46:09 +0000 UTC" firstStartedPulling="2026-04-06 13:46:11.368683109 +0000 UTC m=+6550.356425985" lastFinishedPulling="2026-04-06 13:46:13.843758987 +0000 UTC m=+6552.831501853" observedRunningTime="2026-04-06 13:46:14.452063615 +0000 UTC m=+6553.439806471" watchObservedRunningTime="2026-04-06 13:46:14.457048568 +0000 UTC m=+6553.444791434"
Apr 06 13:46:16 crc kubenswrapper[4790]: I0406 13:46:16.457874 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-x74dz_35f420a5-ec2f-4d37-94ea-af000df33824/control-plane-machine-set-operator/0.log"
Apr 06 13:46:16 crc kubenswrapper[4790]: I0406 13:46:16.649236 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fr5xt_5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e/machine-api-operator/0.log"
Apr 06 13:46:16 crc kubenswrapper[4790]: I0406 13:46:16.675972 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fr5xt_5ba93a5d-6465-4bfc-8b7e-688c4d1f5c4e/kube-rbac-proxy/0.log"
Apr 06 13:46:17 crc kubenswrapper[4790]: I0406 13:46:17.371343 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8gf9r"
Apr 06 13:46:17 crc kubenswrapper[4790]: I0406 13:46:17.371644 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8gf9r"
Apr 06 13:46:17 crc kubenswrapper[4790]: I0406 13:46:17.432609 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8gf9r"
Apr 06 13:46:17 crc kubenswrapper[4790]: I0406 13:46:17.514461 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8gf9r"
Apr 06 13:46:18 crc kubenswrapper[4790]: I0406 13:46:18.438487 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8gf9r"]
Apr 06 13:46:19 crc kubenswrapper[4790]: I0406 13:46:19.479074 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8gf9r" podUID="b5057f4f-f92d-412a-9889-4389b5be4224" containerName="registry-server" containerID="cri-o://d53d17882344fc60b3ff890a20b701c411149873e74ebd4ca2286882950da79d" gracePeriod=2
Apr 06 13:46:19 crc kubenswrapper[4790]: I0406 13:46:19.982620 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8gf9r"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.010182 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l29bv"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.011173 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l29bv"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.091271 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5057f4f-f92d-412a-9889-4389b5be4224-catalog-content\") pod \"b5057f4f-f92d-412a-9889-4389b5be4224\" (UID: \"b5057f4f-f92d-412a-9889-4389b5be4224\") "
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.091386 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5057f4f-f92d-412a-9889-4389b5be4224-utilities\") pod \"b5057f4f-f92d-412a-9889-4389b5be4224\" (UID: \"b5057f4f-f92d-412a-9889-4389b5be4224\") "
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.091501 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8l4k\" (UniqueName: \"kubernetes.io/projected/b5057f4f-f92d-412a-9889-4389b5be4224-kube-api-access-b8l4k\") pod \"b5057f4f-f92d-412a-9889-4389b5be4224\" (UID: \"b5057f4f-f92d-412a-9889-4389b5be4224\") "
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.094880 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5057f4f-f92d-412a-9889-4389b5be4224-utilities" (OuterVolumeSpecName: "utilities") pod "b5057f4f-f92d-412a-9889-4389b5be4224" (UID: "b5057f4f-f92d-412a-9889-4389b5be4224"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.101581 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5057f4f-f92d-412a-9889-4389b5be4224-kube-api-access-b8l4k" (OuterVolumeSpecName: "kube-api-access-b8l4k") pod "b5057f4f-f92d-412a-9889-4389b5be4224" (UID: "b5057f4f-f92d-412a-9889-4389b5be4224"). InnerVolumeSpecName "kube-api-access-b8l4k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.122542 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l29bv"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.146080 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5057f4f-f92d-412a-9889-4389b5be4224-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5057f4f-f92d-412a-9889-4389b5be4224" (UID: "b5057f4f-f92d-412a-9889-4389b5be4224"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.193479 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8l4k\" (UniqueName: \"kubernetes.io/projected/b5057f4f-f92d-412a-9889-4389b5be4224-kube-api-access-b8l4k\") on node \"crc\" DevicePath \"\""
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.193522 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5057f4f-f92d-412a-9889-4389b5be4224-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.193532 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5057f4f-f92d-412a-9889-4389b5be4224-utilities\") on node \"crc\" DevicePath \"\""
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.489144 4790 generic.go:334] "Generic (PLEG): container finished" podID="b5057f4f-f92d-412a-9889-4389b5be4224" containerID="d53d17882344fc60b3ff890a20b701c411149873e74ebd4ca2286882950da79d" exitCode=0
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.490346 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8gf9r"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.492363 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gf9r" event={"ID":"b5057f4f-f92d-412a-9889-4389b5be4224","Type":"ContainerDied","Data":"d53d17882344fc60b3ff890a20b701c411149873e74ebd4ca2286882950da79d"}
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.492431 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8gf9r" event={"ID":"b5057f4f-f92d-412a-9889-4389b5be4224","Type":"ContainerDied","Data":"80e94371bd9750d546c656cc8600df97c2d8ac0698cbda359b0bf4c83a0edd61"}
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.492455 4790 scope.go:117] "RemoveContainer" containerID="d53d17882344fc60b3ff890a20b701c411149873e74ebd4ca2286882950da79d"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.524114 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8gf9r"]
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.526139 4790 scope.go:117] "RemoveContainer" containerID="883f7f9aa702375d7fafa24a238ce374d6c95b67404d27b0be0c576c39eb2451"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.536088 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8gf9r"]
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.557864 4790 scope.go:117] "RemoveContainer" containerID="a933b4f9ac8336d3066731b293b03e44244e62a47a697b8f5c95b27bd457674f"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.582170 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l29bv"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.626637 4790 scope.go:117] "RemoveContainer" containerID="d53d17882344fc60b3ff890a20b701c411149873e74ebd4ca2286882950da79d"
Apr 06 13:46:20 crc kubenswrapper[4790]: E0406 13:46:20.627135 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d53d17882344fc60b3ff890a20b701c411149873e74ebd4ca2286882950da79d\": container with ID starting with d53d17882344fc60b3ff890a20b701c411149873e74ebd4ca2286882950da79d not found: ID does not exist" containerID="d53d17882344fc60b3ff890a20b701c411149873e74ebd4ca2286882950da79d"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.627162 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d53d17882344fc60b3ff890a20b701c411149873e74ebd4ca2286882950da79d"} err="failed to get container status \"d53d17882344fc60b3ff890a20b701c411149873e74ebd4ca2286882950da79d\": rpc error: code = NotFound desc = could not find container \"d53d17882344fc60b3ff890a20b701c411149873e74ebd4ca2286882950da79d\": container with ID starting with d53d17882344fc60b3ff890a20b701c411149873e74ebd4ca2286882950da79d not found: ID does not exist"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.627229 4790 scope.go:117] "RemoveContainer" containerID="883f7f9aa702375d7fafa24a238ce374d6c95b67404d27b0be0c576c39eb2451"
Apr 06 13:46:20 crc kubenswrapper[4790]: E0406 13:46:20.627541 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"883f7f9aa702375d7fafa24a238ce374d6c95b67404d27b0be0c576c39eb2451\": container with ID starting with 883f7f9aa702375d7fafa24a238ce374d6c95b67404d27b0be0c576c39eb2451 not found: ID does not exist" containerID="883f7f9aa702375d7fafa24a238ce374d6c95b67404d27b0be0c576c39eb2451"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.627563 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883f7f9aa702375d7fafa24a238ce374d6c95b67404d27b0be0c576c39eb2451"} err="failed to get container status \"883f7f9aa702375d7fafa24a238ce374d6c95b67404d27b0be0c576c39eb2451\": rpc error: code = NotFound desc = could not find container \"883f7f9aa702375d7fafa24a238ce374d6c95b67404d27b0be0c576c39eb2451\": container with ID starting with 883f7f9aa702375d7fafa24a238ce374d6c95b67404d27b0be0c576c39eb2451 not found: ID does not exist"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.627577 4790 scope.go:117] "RemoveContainer" containerID="a933b4f9ac8336d3066731b293b03e44244e62a47a697b8f5c95b27bd457674f"
Apr 06 13:46:20 crc kubenswrapper[4790]: E0406 13:46:20.627776 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a933b4f9ac8336d3066731b293b03e44244e62a47a697b8f5c95b27bd457674f\": container with ID starting with a933b4f9ac8336d3066731b293b03e44244e62a47a697b8f5c95b27bd457674f not found: ID does not exist" containerID="a933b4f9ac8336d3066731b293b03e44244e62a47a697b8f5c95b27bd457674f"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.627824 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a933b4f9ac8336d3066731b293b03e44244e62a47a697b8f5c95b27bd457674f"} err="failed to get container status \"a933b4f9ac8336d3066731b293b03e44244e62a47a697b8f5c95b27bd457674f\": rpc error: code = NotFound desc = could not find container \"a933b4f9ac8336d3066731b293b03e44244e62a47a697b8f5c95b27bd457674f\": container with ID starting with a933b4f9ac8336d3066731b293b03e44244e62a47a697b8f5c95b27bd457674f not found: ID does not exist"
Apr 06 13:46:20 crc kubenswrapper[4790]: I0406 13:46:20.707143 4790 scope.go:117] "RemoveContainer" containerID="ef194237e8b2aab1be9c94e4137c40610eef2501666a5ded2f9a8217cae99076"
Apr 06 13:46:21 crc kubenswrapper[4790]: I0406 13:46:21.691635 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5057f4f-f92d-412a-9889-4389b5be4224" path="/var/lib/kubelet/pods/b5057f4f-f92d-412a-9889-4389b5be4224/volumes"
Apr 06 13:46:22 crc kubenswrapper[4790]: I0406 13:46:22.433225 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l29bv"]
Apr 06 13:46:23 crc kubenswrapper[4790]: I0406 13:46:23.514822 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l29bv" podUID="e97760ac-4a0d-4b69-aa01-ae13edbd3480" containerName="registry-server" containerID="cri-o://26da6788afa1a3b755db0d4c381236bdac12ac005862375044858d46003eab0d" gracePeriod=2
Apr 06 13:46:23 crc kubenswrapper[4790]: I0406 13:46:23.918937 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l29bv"
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.105951 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qkml\" (UniqueName: \"kubernetes.io/projected/e97760ac-4a0d-4b69-aa01-ae13edbd3480-kube-api-access-9qkml\") pod \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\" (UID: \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\") "
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.106120 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e97760ac-4a0d-4b69-aa01-ae13edbd3480-catalog-content\") pod \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\" (UID: \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\") "
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.106210 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e97760ac-4a0d-4b69-aa01-ae13edbd3480-utilities\") pod \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\" (UID: \"e97760ac-4a0d-4b69-aa01-ae13edbd3480\") "
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.107142 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e97760ac-4a0d-4b69-aa01-ae13edbd3480-utilities" (OuterVolumeSpecName: "utilities") pod "e97760ac-4a0d-4b69-aa01-ae13edbd3480" (UID: "e97760ac-4a0d-4b69-aa01-ae13edbd3480"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.117670 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e97760ac-4a0d-4b69-aa01-ae13edbd3480-kube-api-access-9qkml" (OuterVolumeSpecName: "kube-api-access-9qkml") pod "e97760ac-4a0d-4b69-aa01-ae13edbd3480" (UID: "e97760ac-4a0d-4b69-aa01-ae13edbd3480"). InnerVolumeSpecName "kube-api-access-9qkml". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.133158 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e97760ac-4a0d-4b69-aa01-ae13edbd3480-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e97760ac-4a0d-4b69-aa01-ae13edbd3480" (UID: "e97760ac-4a0d-4b69-aa01-ae13edbd3480"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.208604 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e97760ac-4a0d-4b69-aa01-ae13edbd3480-utilities\") on node \"crc\" DevicePath \"\""
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.208806 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qkml\" (UniqueName: \"kubernetes.io/projected/e97760ac-4a0d-4b69-aa01-ae13edbd3480-kube-api-access-9qkml\") on node \"crc\" DevicePath \"\""
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.208917 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e97760ac-4a0d-4b69-aa01-ae13edbd3480-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.526016 4790 generic.go:334] "Generic (PLEG): container finished" podID="e97760ac-4a0d-4b69-aa01-ae13edbd3480" containerID="26da6788afa1a3b755db0d4c381236bdac12ac005862375044858d46003eab0d" exitCode=0
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.526056 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l29bv" event={"ID":"e97760ac-4a0d-4b69-aa01-ae13edbd3480","Type":"ContainerDied","Data":"26da6788afa1a3b755db0d4c381236bdac12ac005862375044858d46003eab0d"}
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.526313 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l29bv" event={"ID":"e97760ac-4a0d-4b69-aa01-ae13edbd3480","Type":"ContainerDied","Data":"d334da6b425ac0d72a894bcdf7cd7e3e2a7931c1a59355d8babb31522be2a4c5"}
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.526335 4790 scope.go:117] "RemoveContainer" containerID="26da6788afa1a3b755db0d4c381236bdac12ac005862375044858d46003eab0d"
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.526114 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l29bv"
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.566146 4790 scope.go:117] "RemoveContainer" containerID="3afc503f90ebd3a62f5cdff8d4e9d3755267be1b2440146d3b4cb6670b3d40ef"
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.574714 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l29bv"]
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.583521 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l29bv"]
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.592041 4790 scope.go:117] "RemoveContainer" containerID="a0c761b802ee06c45570ef5aca91d1ff082b230c86b9c9be9d950213b2639267"
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.640358 4790 scope.go:117] "RemoveContainer" containerID="26da6788afa1a3b755db0d4c381236bdac12ac005862375044858d46003eab0d"
Apr 06 13:46:24 crc kubenswrapper[4790]: E0406 13:46:24.640788 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26da6788afa1a3b755db0d4c381236bdac12ac005862375044858d46003eab0d\": container with ID starting with 26da6788afa1a3b755db0d4c381236bdac12ac005862375044858d46003eab0d not found: ID does not exist" containerID="26da6788afa1a3b755db0d4c381236bdac12ac005862375044858d46003eab0d"
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.640817 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26da6788afa1a3b755db0d4c381236bdac12ac005862375044858d46003eab0d"} err="failed to get container status \"26da6788afa1a3b755db0d4c381236bdac12ac005862375044858d46003eab0d\": rpc error: code = NotFound desc = could not find container \"26da6788afa1a3b755db0d4c381236bdac12ac005862375044858d46003eab0d\": container with ID starting with 26da6788afa1a3b755db0d4c381236bdac12ac005862375044858d46003eab0d not found: ID does not exist"
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.640867 4790 scope.go:117] "RemoveContainer" containerID="3afc503f90ebd3a62f5cdff8d4e9d3755267be1b2440146d3b4cb6670b3d40ef"
Apr 06 13:46:24 crc kubenswrapper[4790]: E0406 13:46:24.641132 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3afc503f90ebd3a62f5cdff8d4e9d3755267be1b2440146d3b4cb6670b3d40ef\": container with ID starting with 3afc503f90ebd3a62f5cdff8d4e9d3755267be1b2440146d3b4cb6670b3d40ef not found: ID does not exist" containerID="3afc503f90ebd3a62f5cdff8d4e9d3755267be1b2440146d3b4cb6670b3d40ef"
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.641177 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3afc503f90ebd3a62f5cdff8d4e9d3755267be1b2440146d3b4cb6670b3d40ef"} err="failed to get container status \"3afc503f90ebd3a62f5cdff8d4e9d3755267be1b2440146d3b4cb6670b3d40ef\": rpc error: code = NotFound desc = could not find container \"3afc503f90ebd3a62f5cdff8d4e9d3755267be1b2440146d3b4cb6670b3d40ef\": container with ID starting with 3afc503f90ebd3a62f5cdff8d4e9d3755267be1b2440146d3b4cb6670b3d40ef not found: ID does not exist"
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.641207 4790 scope.go:117] "RemoveContainer" containerID="a0c761b802ee06c45570ef5aca91d1ff082b230c86b9c9be9d950213b2639267"
Apr 06 13:46:24 crc kubenswrapper[4790]: E0406 13:46:24.641492 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0c761b802ee06c45570ef5aca91d1ff082b230c86b9c9be9d950213b2639267\": container with ID starting with a0c761b802ee06c45570ef5aca91d1ff082b230c86b9c9be9d950213b2639267 not found: ID does not exist" containerID="a0c761b802ee06c45570ef5aca91d1ff082b230c86b9c9be9d950213b2639267"
Apr 06 13:46:24 crc kubenswrapper[4790]: I0406 13:46:24.641514 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c761b802ee06c45570ef5aca91d1ff082b230c86b9c9be9d950213b2639267"} err="failed to get container status \"a0c761b802ee06c45570ef5aca91d1ff082b230c86b9c9be9d950213b2639267\": rpc error: code = NotFound desc = could not find container \"a0c761b802ee06c45570ef5aca91d1ff082b230c86b9c9be9d950213b2639267\": container with ID starting with a0c761b802ee06c45570ef5aca91d1ff082b230c86b9c9be9d950213b2639267 not found: ID does not exist"
Apr 06 13:46:25 crc kubenswrapper[4790]: I0406 13:46:25.689913 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e97760ac-4a0d-4b69-aa01-ae13edbd3480" path="/var/lib/kubelet/pods/e97760ac-4a0d-4b69-aa01-ae13edbd3480/volumes"
Apr 06 13:46:29 crc kubenswrapper[4790]: I0406 13:46:29.330012 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-jmqjq_7b2dea07-951f-4a31-ae96-5465449fbae8/cert-manager-controller/1.log"
Apr 06 13:46:29 crc kubenswrapper[4790]: I0406 13:46:29.347197 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-jmqjq_7b2dea07-951f-4a31-ae96-5465449fbae8/cert-manager-controller/0.log"
Apr 06 13:46:29 crc kubenswrapper[4790]: I0406 13:46:29.516771 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-n5t6m_ec02624d-3d5a-423d-818a-1422646a42a9/cert-manager-cainjector/0.log"
Apr 06 13:46:29 crc kubenswrapper[4790]: I0406 13:46:29.548021 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-n5t6m_ec02624d-3d5a-423d-818a-1422646a42a9/cert-manager-cainjector/1.log"
Apr 06 13:46:29 crc kubenswrapper[4790]: I0406 13:46:29.679939 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-tm4jb_0aeb284c-cf28-4dc2-aa1c-d43f80e4fba7/cert-manager-webhook/0.log"
Apr 06 13:46:41 crc kubenswrapper[4790]: I0406 13:46:41.458696 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7b5ddc4dc7-hx2t5_34686f03-565c-4d7b-a0d5-f3b2d93e77dd/nmstate-console-plugin/0.log"
Apr 06 13:46:41 crc kubenswrapper[4790]: I0406 13:46:41.590214 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-rdg86_fb4a8135-7355-406b-a851-dce0109face5/nmstate-handler/0.log"
Apr 06 13:46:41 crc kubenswrapper[4790]: I0406 13:46:41.625989 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-khm89_baf52e27-af4b-4863-9e09-a3f11f497db9/kube-rbac-proxy/0.log"
Apr 06 13:46:41 crc kubenswrapper[4790]: I0406 13:46:41.714767 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-khm89_baf52e27-af4b-4863-9e09-a3f11f497db9/nmstate-metrics/0.log"
Apr 06 13:46:41 crc kubenswrapper[4790]: I0406 13:46:41.824791 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6b8c6447b-srf9k_8852468a-2987-493e-bdd3-1a0e0b3b0721/nmstate-operator/0.log"
Apr 06 13:46:41 crc kubenswrapper[4790]: I0406 13:46:41.889126 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-5vn4v_8691bf62-154a-4c5b-8d00-066f07c030fa/nmstate-webhook/0.log"
Apr 06 13:46:54 crc kubenswrapper[4790]: I0406 13:46:54.996606 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-86dff4bf76-bm9zd_ae41d528-6f4b-45e5-84f2-5d9eae998759/prometheus-operator/0.log"
Apr 06 13:46:55 crc kubenswrapper[4790]: I0406 13:46:55.180629 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw_ea4918c9-2a05-4c75-9d68-662e0a0fc175/prometheus-operator-admission-webhook/0.log"
Apr 06 13:46:55 crc kubenswrapper[4790]: I0406 13:46:55.241966 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-795cd6b797-pw54j_06f8ee69-3814-40ed-8ed2-5913509658de/prometheus-operator-admission-webhook/0.log"
Apr 06 13:46:55 crc kubenswrapper[4790]: I0406 13:46:55.375303 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-dd944d769-7fscl_d990eb66-396e-4b05-acab-eaa30a6fbd34/operator/0.log"
Apr 06 13:46:55 crc kubenswrapper[4790]: I0406 13:46:55.417600 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-74445bf4b8-spbpr_0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686/perses-operator/0.log"
Apr 06 13:47:08 crc kubenswrapper[4790]: I0406 13:47:08.751252 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bb64cd5d7-lfmpj_fb497468-6169-47da-879b-96e49435e345/kube-rbac-proxy/0.log"
Apr 06 13:47:08 crc kubenswrapper[4790]: I0406 13:47:08.881958 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bb64cd5d7-lfmpj_fb497468-6169-47da-879b-96e49435e345/controller/0.log"
Apr 06 13:47:08 crc kubenswrapper[4790]: I0406 13:47:08.947646 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-frr-files/0.log"
Apr 06 13:47:09 crc kubenswrapper[4790]: I0406 13:47:09.117368 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-reloader/0.log"
Apr 06 13:47:09 crc kubenswrapper[4790]: I0406 13:47:09.143258 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-reloader/0.log"
Apr 06 13:47:09 crc kubenswrapper[4790]: I0406 13:47:09.152431 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-metrics/0.log"
Apr 06 13:47:09 crc kubenswrapper[4790]: I0406 13:47:09.159972 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-frr-files/0.log"
Apr 06 13:47:09 crc kubenswrapper[4790]: I0406 13:47:09.344534 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-metrics/0.log"
Apr 06 13:47:09 crc kubenswrapper[4790]: I0406 13:47:09.367164 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-reloader/0.log"
Apr 06 13:47:09 crc kubenswrapper[4790]: I0406 13:47:09.697580 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-metrics/0.log"
Apr 06 13:47:09 crc kubenswrapper[4790]: I0406 13:47:09.699713 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-frr-files/0.log"
Apr 06 13:47:09 crc kubenswrapper[4790]: I0406 13:47:09.901595 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/controller/0.log"
Apr 06 13:47:09 crc kubenswrapper[4790]: I0406 13:47:09.910142 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-frr-files/0.log"
Apr 06 13:47:09 crc kubenswrapper[4790]: I0406 13:47:09.911479 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-reloader/0.log"
Apr 06 13:47:09 crc kubenswrapper[4790]: I0406 13:47:09.933649 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/cp-metrics/0.log"
Apr 06 13:47:10 crc kubenswrapper[4790]: I0406 13:47:10.074993 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/frr-metrics/0.log"
Apr 06 13:47:10 crc kubenswrapper[4790]: I0406 13:47:10.080405 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/kube-rbac-proxy/0.log"
Apr 06 13:47:10 crc kubenswrapper[4790]: I0406 13:47:10.153016 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/kube-rbac-proxy-frr/0.log"
Apr 06 13:47:10 crc kubenswrapper[4790]: I0406 13:47:10.329516 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/reloader/0.log"
Apr 06 13:47:10 crc kubenswrapper[4790]: I0406 13:47:10.376729 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-j6z5c_7dd0b643-0c2a-467b-9267-caaba887289b/frr-k8s-webhook-server/0.log"
Apr 06 13:47:10 crc kubenswrapper[4790]: I0406 13:47:10.652559 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-cc74d7bc4-98drt_575f4b30-b909-40a3-aeee-d31e6b9238d5/manager/0.log"
Apr 06 13:47:10 crc kubenswrapper[4790]: I0406 13:47:10.828805 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-54f9f8cc7-p9gpk_d56b74e9-8981-420d-81d5-4b1a20286d52/webhook-server/0.log"
Apr 06 13:47:10 crc kubenswrapper[4790]: I0406 13:47:10.911278 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zlz5n_5ca71ea5-d4c2-497d-945e-ac51b1fbf618/kube-rbac-proxy/0.log"
Apr 06 13:47:11 crc kubenswrapper[4790]: I0406 13:47:11.578857 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zlz5n_5ca71ea5-d4c2-497d-945e-ac51b1fbf618/speaker/0.log"
Apr 06 13:47:12 crc kubenswrapper[4790]: I0406 13:47:12.274870 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kv7hm_64824d1f-c308-4727-8209-26463699ba84/frr/0.log"
Apr 06 13:47:24 crc kubenswrapper[4790]: I0406 13:47:24.241483 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj_89f6afcc-5eee-48c2-88fe-2bf924bd9a0c/util/0.log"
Apr 06 13:47:24 crc kubenswrapper[4790]: I0406 13:47:24.429497 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj_89f6afcc-5eee-48c2-88fe-2bf924bd9a0c/util/0.log"
Apr 06 13:47:24 crc kubenswrapper[4790]: I0406 13:47:24.491748 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj_89f6afcc-5eee-48c2-88fe-2bf924bd9a0c/pull/0.log"
Apr 06 13:47:24 crc kubenswrapper[4790]: I0406 13:47:24.553683 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj_89f6afcc-5eee-48c2-88fe-2bf924bd9a0c/pull/0.log"
Apr 06 13:47:24 crc kubenswrapper[4790]: I0406 13:47:24.700980 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj_89f6afcc-5eee-48c2-88fe-2bf924bd9a0c/pull/0.log"
Apr 06 13:47:24 crc kubenswrapper[4790]: I0406 13:47:24.726792 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj_89f6afcc-5eee-48c2-88fe-2bf924bd9a0c/util/0.log"
Apr 06 13:47:24 crc kubenswrapper[4790]: I0406 13:47:24.754555 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4d7975b5a0e69e817bb81974b87da6fc0f34c999595997c20f9b4da3efbgppj_89f6afcc-5eee-48c2-88fe-2bf924bd9a0c/extract/0.log"
Apr 06 13:47:24 crc kubenswrapper[4790]: I0406 13:47:24.865302 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6_8a7a0dc8-d9f5-4844-bd9a-5377990f86c4/util/0.log"
Apr 06 13:47:25 crc kubenswrapper[4790]: I0406 13:47:25.029971 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6_8a7a0dc8-d9f5-4844-bd9a-5377990f86c4/util/0.log"
Apr 06 13:47:25 crc kubenswrapper[4790]: I0406 13:47:25.054029 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6_8a7a0dc8-d9f5-4844-bd9a-5377990f86c4/pull/0.log"
Apr 06 13:47:25 crc kubenswrapper[4790]: I0406 13:47:25.074253 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6_8a7a0dc8-d9f5-4844-bd9a-5377990f86c4/pull/0.log"
Apr 06 13:47:25 crc kubenswrapper[4790]: I0406 13:47:25.227937 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6_8a7a0dc8-d9f5-4844-bd9a-5377990f86c4/util/0.log"
Apr 06 13:47:25 crc kubenswrapper[4790]: I0406 13:47:25.231329 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6_8a7a0dc8-d9f5-4844-bd9a-5377990f86c4/pull/0.log"
Apr 06
13:47:25 crc kubenswrapper[4790]: I0406 13:47:25.248234 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5899fb83590e6455a3f628fcbbcb1091eafa68b265018139ed84146645q9qn6_8a7a0dc8-d9f5-4844-bd9a-5377990f86c4/extract/0.log" Apr 06 13:47:25 crc kubenswrapper[4790]: I0406 13:47:25.420992 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8_4b74456e-c45b-4efb-9a0f-952b5663e994/util/0.log" Apr 06 13:47:25 crc kubenswrapper[4790]: I0406 13:47:25.565640 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8_4b74456e-c45b-4efb-9a0f-952b5663e994/util/0.log" Apr 06 13:47:25 crc kubenswrapper[4790]: I0406 13:47:25.585531 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8_4b74456e-c45b-4efb-9a0f-952b5663e994/pull/0.log" Apr 06 13:47:25 crc kubenswrapper[4790]: I0406 13:47:25.588703 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8_4b74456e-c45b-4efb-9a0f-952b5663e994/pull/0.log" Apr 06 13:47:25 crc kubenswrapper[4790]: I0406 13:47:25.820646 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8_4b74456e-c45b-4efb-9a0f-952b5663e994/util/0.log" Apr 06 13:47:25 crc kubenswrapper[4790]: I0406 13:47:25.833432 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8_4b74456e-c45b-4efb-9a0f-952b5663e994/pull/0.log" Apr 06 13:47:25 crc kubenswrapper[4790]: I0406 13:47:25.867224 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397265hhf8_4b74456e-c45b-4efb-9a0f-952b5663e994/extract/0.log" Apr 06 13:47:25 crc kubenswrapper[4790]: I0406 13:47:25.990359 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kkc7k_31734947-0b62-4b08-a3f7-1547b401f159/extract-utilities/0.log" Apr 06 13:47:26 crc kubenswrapper[4790]: I0406 13:47:26.168270 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kkc7k_31734947-0b62-4b08-a3f7-1547b401f159/extract-utilities/0.log" Apr 06 13:47:26 crc kubenswrapper[4790]: I0406 13:47:26.192126 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kkc7k_31734947-0b62-4b08-a3f7-1547b401f159/extract-content/0.log" Apr 06 13:47:26 crc kubenswrapper[4790]: I0406 13:47:26.219982 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kkc7k_31734947-0b62-4b08-a3f7-1547b401f159/extract-content/0.log" Apr 06 13:47:26 crc kubenswrapper[4790]: I0406 13:47:26.369981 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kkc7k_31734947-0b62-4b08-a3f7-1547b401f159/extract-utilities/0.log" Apr 06 13:47:26 crc kubenswrapper[4790]: I0406 13:47:26.444236 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kkc7k_31734947-0b62-4b08-a3f7-1547b401f159/extract-content/0.log" Apr 06 13:47:26 crc kubenswrapper[4790]: I0406 13:47:26.815328 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxf27_7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8/extract-utilities/0.log" Apr 06 13:47:26 crc kubenswrapper[4790]: I0406 13:47:26.989800 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-hxf27_7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8/extract-utilities/0.log" Apr 06 13:47:27 crc kubenswrapper[4790]: I0406 13:47:27.046781 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxf27_7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8/extract-content/0.log" Apr 06 13:47:27 crc kubenswrapper[4790]: I0406 13:47:27.072991 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxf27_7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8/extract-content/0.log" Apr 06 13:47:27 crc kubenswrapper[4790]: I0406 13:47:27.178685 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kkc7k_31734947-0b62-4b08-a3f7-1547b401f159/registry-server/0.log" Apr 06 13:47:27 crc kubenswrapper[4790]: I0406 13:47:27.237921 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxf27_7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8/extract-utilities/0.log" Apr 06 13:47:27 crc kubenswrapper[4790]: I0406 13:47:27.251735 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxf27_7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8/extract-content/0.log" Apr 06 13:47:27 crc kubenswrapper[4790]: I0406 13:47:27.417952 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/util/0.log" Apr 06 13:47:27 crc kubenswrapper[4790]: I0406 13:47:27.591767 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/util/0.log" Apr 06 13:47:27 crc kubenswrapper[4790]: I0406 13:47:27.667760 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/pull/0.log" Apr 06 13:47:27 crc kubenswrapper[4790]: I0406 13:47:27.672275 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/pull/0.log" Apr 06 13:47:27 crc kubenswrapper[4790]: I0406 13:47:27.974820 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/pull/0.log" Apr 06 13:47:27 crc kubenswrapper[4790]: I0406 13:47:27.975550 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/extract/0.log" Apr 06 13:47:28 crc kubenswrapper[4790]: I0406 13:47:28.015268 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dkfcxw_56ea36bf-7f2d-40b0-9f2d-cd4ef855d97c/util/0.log" Apr 06 13:47:28 crc kubenswrapper[4790]: I0406 13:47:28.174453 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_4855f76c-a247-4c86-846c-ad5ecd18c434/util/0.log" Apr 06 13:47:28 crc kubenswrapper[4790]: I0406 13:47:28.330569 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hxf27_7c08a9bd-28bf-46c6-8e2c-a9e65a1e65b8/registry-server/0.log" Apr 06 13:47:28 crc kubenswrapper[4790]: I0406 13:47:28.368969 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_4855f76c-a247-4c86-846c-ad5ecd18c434/util/0.log" Apr 06 13:47:28 crc 
kubenswrapper[4790]: I0406 13:47:28.390535 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_4855f76c-a247-4c86-846c-ad5ecd18c434/pull/0.log" Apr 06 13:47:28 crc kubenswrapper[4790]: I0406 13:47:28.466757 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_4855f76c-a247-4c86-846c-ad5ecd18c434/pull/0.log" Apr 06 13:47:28 crc kubenswrapper[4790]: I0406 13:47:28.591808 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_4855f76c-a247-4c86-846c-ad5ecd18c434/util/0.log" Apr 06 13:47:28 crc kubenswrapper[4790]: I0406 13:47:28.622054 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_4855f76c-a247-4c86-846c-ad5ecd18c434/pull/0.log" Apr 06 13:47:28 crc kubenswrapper[4790]: I0406 13:47:28.631244 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_ff4f5ee0f799971b32a0c10c1c4dbe619002004ad8a7fdbcfb49ebac0dzh88v_4855f76c-a247-4c86-846c-ad5ecd18c434/extract/0.log" Apr 06 13:47:28 crc kubenswrapper[4790]: I0406 13:47:28.837775 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-vm6jd_bcbcee2e-3daf-4238-ac27-16f663c8b184/marketplace-operator/0.log" Apr 06 13:47:28 crc kubenswrapper[4790]: I0406 13:47:28.862529 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trpr6_1524c101-74ea-4a4a-b54f-c2f9201725e1/extract-utilities/0.log" Apr 06 13:47:29 crc kubenswrapper[4790]: I0406 13:47:29.041347 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-trpr6_1524c101-74ea-4a4a-b54f-c2f9201725e1/extract-utilities/0.log" Apr 06 13:47:29 crc kubenswrapper[4790]: I0406 13:47:29.048918 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trpr6_1524c101-74ea-4a4a-b54f-c2f9201725e1/extract-content/0.log" Apr 06 13:47:29 crc kubenswrapper[4790]: I0406 13:47:29.080900 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trpr6_1524c101-74ea-4a4a-b54f-c2f9201725e1/extract-content/0.log" Apr 06 13:47:29 crc kubenswrapper[4790]: I0406 13:47:29.218238 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trpr6_1524c101-74ea-4a4a-b54f-c2f9201725e1/extract-utilities/0.log" Apr 06 13:47:29 crc kubenswrapper[4790]: I0406 13:47:29.255105 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z2nx9_0024beb8-cee9-427c-8267-657119a613c5/extract-utilities/0.log" Apr 06 13:47:29 crc kubenswrapper[4790]: I0406 13:47:29.276884 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trpr6_1524c101-74ea-4a4a-b54f-c2f9201725e1/extract-content/0.log" Apr 06 13:47:29 crc kubenswrapper[4790]: I0406 13:47:29.497885 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-trpr6_1524c101-74ea-4a4a-b54f-c2f9201725e1/registry-server/0.log" Apr 06 13:47:29 crc kubenswrapper[4790]: I0406 13:47:29.503978 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z2nx9_0024beb8-cee9-427c-8267-657119a613c5/extract-utilities/0.log" Apr 06 13:47:29 crc kubenswrapper[4790]: I0406 13:47:29.519939 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-z2nx9_0024beb8-cee9-427c-8267-657119a613c5/extract-content/0.log" Apr 06 13:47:29 crc kubenswrapper[4790]: I0406 13:47:29.548257 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z2nx9_0024beb8-cee9-427c-8267-657119a613c5/extract-content/0.log" Apr 06 13:47:29 crc kubenswrapper[4790]: I0406 13:47:29.680731 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z2nx9_0024beb8-cee9-427c-8267-657119a613c5/extract-utilities/0.log" Apr 06 13:47:29 crc kubenswrapper[4790]: I0406 13:47:29.730340 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z2nx9_0024beb8-cee9-427c-8267-657119a613c5/extract-content/0.log" Apr 06 13:47:30 crc kubenswrapper[4790]: I0406 13:47:30.389474 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z2nx9_0024beb8-cee9-427c-8267-657119a613c5/registry-server/0.log" Apr 06 13:47:42 crc kubenswrapper[4790]: I0406 13:47:42.928354 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-795cd6b797-nfqzw_ea4918c9-2a05-4c75-9d68-662e0a0fc175/prometheus-operator-admission-webhook/0.log" Apr 06 13:47:42 crc kubenswrapper[4790]: I0406 13:47:42.954755 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-86dff4bf76-bm9zd_ae41d528-6f4b-45e5-84f2-5d9eae998759/prometheus-operator/0.log" Apr 06 13:47:42 crc kubenswrapper[4790]: I0406 13:47:42.959033 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-795cd6b797-pw54j_06f8ee69-3814-40ed-8ed2-5913509658de/prometheus-operator-admission-webhook/0.log" Apr 06 13:47:43 crc kubenswrapper[4790]: I0406 13:47:43.092174 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-dd944d769-7fscl_d990eb66-396e-4b05-acab-eaa30a6fbd34/operator/0.log" Apr 06 13:47:43 crc kubenswrapper[4790]: I0406 13:47:43.119072 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-74445bf4b8-spbpr_0e6e8ad0-0549-4bcd-b2a7-4cbc5fc67686/perses-operator/0.log" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.150528 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591388-jxwld"] Apr 06 13:48:00 crc kubenswrapper[4790]: E0406 13:48:00.153596 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5057f4f-f92d-412a-9889-4389b5be4224" containerName="registry-server" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.153720 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5057f4f-f92d-412a-9889-4389b5be4224" containerName="registry-server" Apr 06 13:48:00 crc kubenswrapper[4790]: E0406 13:48:00.153802 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97760ac-4a0d-4b69-aa01-ae13edbd3480" containerName="extract-utilities" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.153897 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97760ac-4a0d-4b69-aa01-ae13edbd3480" containerName="extract-utilities" Apr 06 13:48:00 crc kubenswrapper[4790]: E0406 13:48:00.153994 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97760ac-4a0d-4b69-aa01-ae13edbd3480" containerName="registry-server" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.154070 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97760ac-4a0d-4b69-aa01-ae13edbd3480" containerName="registry-server" Apr 06 13:48:00 crc kubenswrapper[4790]: E0406 13:48:00.154159 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5057f4f-f92d-412a-9889-4389b5be4224" containerName="extract-utilities" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 
13:48:00.154236 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5057f4f-f92d-412a-9889-4389b5be4224" containerName="extract-utilities" Apr 06 13:48:00 crc kubenswrapper[4790]: E0406 13:48:00.154314 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5057f4f-f92d-412a-9889-4389b5be4224" containerName="extract-content" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.154384 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5057f4f-f92d-412a-9889-4389b5be4224" containerName="extract-content" Apr 06 13:48:00 crc kubenswrapper[4790]: E0406 13:48:00.154464 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97760ac-4a0d-4b69-aa01-ae13edbd3480" containerName="extract-content" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.154533 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97760ac-4a0d-4b69-aa01-ae13edbd3480" containerName="extract-content" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.154896 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5057f4f-f92d-412a-9889-4389b5be4224" containerName="registry-server" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.154991 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e97760ac-4a0d-4b69-aa01-ae13edbd3480" containerName="registry-server" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.155765 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591388-jxwld" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.158115 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.158168 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.158526 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.174622 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591388-jxwld"] Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.287195 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhp2d\" (UniqueName: \"kubernetes.io/projected/126a281e-40ec-4865-b1ad-c16824233159-kube-api-access-dhp2d\") pod \"auto-csr-approver-29591388-jxwld\" (UID: \"126a281e-40ec-4865-b1ad-c16824233159\") " pod="openshift-infra/auto-csr-approver-29591388-jxwld" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.389121 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhp2d\" (UniqueName: \"kubernetes.io/projected/126a281e-40ec-4865-b1ad-c16824233159-kube-api-access-dhp2d\") pod \"auto-csr-approver-29591388-jxwld\" (UID: \"126a281e-40ec-4865-b1ad-c16824233159\") " pod="openshift-infra/auto-csr-approver-29591388-jxwld" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.423803 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhp2d\" (UniqueName: \"kubernetes.io/projected/126a281e-40ec-4865-b1ad-c16824233159-kube-api-access-dhp2d\") pod \"auto-csr-approver-29591388-jxwld\" (UID: \"126a281e-40ec-4865-b1ad-c16824233159\") " 
pod="openshift-infra/auto-csr-approver-29591388-jxwld" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.474451 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591388-jxwld" Apr 06 13:48:00 crc kubenswrapper[4790]: I0406 13:48:00.993752 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591388-jxwld"] Apr 06 13:48:01 crc kubenswrapper[4790]: I0406 13:48:01.452951 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591388-jxwld" event={"ID":"126a281e-40ec-4865-b1ad-c16824233159","Type":"ContainerStarted","Data":"ab1053d0a1d1377d00a3b2dd36a528c967c6e5a6e0b00b3d64724d57135025b9"} Apr 06 13:48:02 crc kubenswrapper[4790]: I0406 13:48:02.466103 4790 generic.go:334] "Generic (PLEG): container finished" podID="126a281e-40ec-4865-b1ad-c16824233159" containerID="0b0059157d34aafc699c266640ef03c679786cf7569cec5994a5782fb31ea501" exitCode=0 Apr 06 13:48:02 crc kubenswrapper[4790]: I0406 13:48:02.466379 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591388-jxwld" event={"ID":"126a281e-40ec-4865-b1ad-c16824233159","Type":"ContainerDied","Data":"0b0059157d34aafc699c266640ef03c679786cf7569cec5994a5782fb31ea501"} Apr 06 13:48:03 crc kubenswrapper[4790]: I0406 13:48:03.859450 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591388-jxwld" Apr 06 13:48:03 crc kubenswrapper[4790]: I0406 13:48:03.975717 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhp2d\" (UniqueName: \"kubernetes.io/projected/126a281e-40ec-4865-b1ad-c16824233159-kube-api-access-dhp2d\") pod \"126a281e-40ec-4865-b1ad-c16824233159\" (UID: \"126a281e-40ec-4865-b1ad-c16824233159\") " Apr 06 13:48:03 crc kubenswrapper[4790]: I0406 13:48:03.982763 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/126a281e-40ec-4865-b1ad-c16824233159-kube-api-access-dhp2d" (OuterVolumeSpecName: "kube-api-access-dhp2d") pod "126a281e-40ec-4865-b1ad-c16824233159" (UID: "126a281e-40ec-4865-b1ad-c16824233159"). InnerVolumeSpecName "kube-api-access-dhp2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:48:04 crc kubenswrapper[4790]: I0406 13:48:04.078363 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhp2d\" (UniqueName: \"kubernetes.io/projected/126a281e-40ec-4865-b1ad-c16824233159-kube-api-access-dhp2d\") on node \"crc\" DevicePath \"\"" Apr 06 13:48:04 crc kubenswrapper[4790]: I0406 13:48:04.485809 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591388-jxwld" event={"ID":"126a281e-40ec-4865-b1ad-c16824233159","Type":"ContainerDied","Data":"ab1053d0a1d1377d00a3b2dd36a528c967c6e5a6e0b00b3d64724d57135025b9"} Apr 06 13:48:04 crc kubenswrapper[4790]: I0406 13:48:04.486041 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab1053d0a1d1377d00a3b2dd36a528c967c6e5a6e0b00b3d64724d57135025b9" Apr 06 13:48:04 crc kubenswrapper[4790]: I0406 13:48:04.485916 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591388-jxwld" Apr 06 13:48:04 crc kubenswrapper[4790]: I0406 13:48:04.933469 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591382-kmgwd"] Apr 06 13:48:04 crc kubenswrapper[4790]: I0406 13:48:04.943313 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591382-kmgwd"] Apr 06 13:48:05 crc kubenswrapper[4790]: I0406 13:48:05.694052 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c89e294a-9374-4837-8564-b56032dbc423" path="/var/lib/kubelet/pods/c89e294a-9374-4837-8564-b56032dbc423/volumes" Apr 06 13:48:20 crc kubenswrapper[4790]: I0406 13:48:20.847872 4790 scope.go:117] "RemoveContainer" containerID="2d3d22a27ecbefc7841340e86e2004e5f3e03f681a134bdbb8cd4ffa11448399" Apr 06 13:48:39 crc kubenswrapper[4790]: I0406 13:48:39.754445 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:48:39 crc kubenswrapper[4790]: I0406 13:48:39.757038 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:49:09 crc kubenswrapper[4790]: I0406 13:49:09.753438 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:49:09 crc kubenswrapper[4790]: 
I0406 13:49:09.754200 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:49:39 crc kubenswrapper[4790]: I0406 13:49:39.753819 4790 patch_prober.go:28] interesting pod/machine-config-daemon-9p96t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Apr 06 13:49:39 crc kubenswrapper[4790]: I0406 13:49:39.754588 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Apr 06 13:49:39 crc kubenswrapper[4790]: I0406 13:49:39.754657 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" Apr 06 13:49:39 crc kubenswrapper[4790]: I0406 13:49:39.755766 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe"} pod="openshift-machine-config-operator/machine-config-daemon-9p96t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Apr 06 13:49:39 crc kubenswrapper[4790]: I0406 13:49:39.755901 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" 
containerName="machine-config-daemon" containerID="cri-o://b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" gracePeriod=600 Apr 06 13:49:39 crc kubenswrapper[4790]: E0406 13:49:39.897278 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:49:40 crc kubenswrapper[4790]: I0406 13:49:40.614574 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f5e33f8-0490-4219-8c40-526903de8e6f" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" exitCode=0 Apr 06 13:49:40 crc kubenswrapper[4790]: I0406 13:49:40.614729 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerDied","Data":"b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe"} Apr 06 13:49:40 crc kubenswrapper[4790]: I0406 13:49:40.614990 4790 scope.go:117] "RemoveContainer" containerID="44b6331698ad33af7a840e787e9e1d956dcb0f0acb6b4f4ab0ec2af758b872a1" Apr 06 13:49:40 crc kubenswrapper[4790]: I0406 13:49:40.615900 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:49:40 crc kubenswrapper[4790]: E0406 13:49:40.616537 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:49:53 crc kubenswrapper[4790]: I0406 13:49:53.675508 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:49:53 crc kubenswrapper[4790]: E0406 13:49:53.676259 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:49:53 crc kubenswrapper[4790]: I0406 13:49:53.745348 4790 generic.go:334] "Generic (PLEG): container finished" podID="dcd99e98-f90d-4c58-bb22-35f75eaaf6bc" containerID="881cc72d4f67b6f9be6fa27b6838367082f99d3d80111ae0bdcacd1ce31314d0" exitCode=0 Apr 06 13:49:53 crc kubenswrapper[4790]: I0406 13:49:53.745391 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m5z6z/must-gather-q8hkq" event={"ID":"dcd99e98-f90d-4c58-bb22-35f75eaaf6bc","Type":"ContainerDied","Data":"881cc72d4f67b6f9be6fa27b6838367082f99d3d80111ae0bdcacd1ce31314d0"} Apr 06 13:49:53 crc kubenswrapper[4790]: I0406 13:49:53.745811 4790 scope.go:117] "RemoveContainer" containerID="881cc72d4f67b6f9be6fa27b6838367082f99d3d80111ae0bdcacd1ce31314d0" Apr 06 13:49:54 crc kubenswrapper[4790]: I0406 13:49:54.717738 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m5z6z_must-gather-q8hkq_dcd99e98-f90d-4c58-bb22-35f75eaaf6bc/gather/0.log" Apr 06 13:50:00 crc kubenswrapper[4790]: I0406 13:50:00.144071 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591390-wlkpl"] Apr 06 13:50:00 crc kubenswrapper[4790]: E0406 13:50:00.145184 
4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="126a281e-40ec-4865-b1ad-c16824233159" containerName="oc" Apr 06 13:50:00 crc kubenswrapper[4790]: I0406 13:50:00.145204 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="126a281e-40ec-4865-b1ad-c16824233159" containerName="oc" Apr 06 13:50:00 crc kubenswrapper[4790]: I0406 13:50:00.145528 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="126a281e-40ec-4865-b1ad-c16824233159" containerName="oc" Apr 06 13:50:00 crc kubenswrapper[4790]: I0406 13:50:00.146534 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591390-wlkpl" Apr 06 13:50:00 crc kubenswrapper[4790]: I0406 13:50:00.150334 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:50:00 crc kubenswrapper[4790]: I0406 13:50:00.151023 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:50:00 crc kubenswrapper[4790]: I0406 13:50:00.151279 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:50:00 crc kubenswrapper[4790]: I0406 13:50:00.153672 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591390-wlkpl"] Apr 06 13:50:00 crc kubenswrapper[4790]: I0406 13:50:00.191063 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfqcc\" (UniqueName: \"kubernetes.io/projected/7e6e884f-fb97-47bf-b420-3d4e7f3882a6-kube-api-access-hfqcc\") pod \"auto-csr-approver-29591390-wlkpl\" (UID: \"7e6e884f-fb97-47bf-b420-3d4e7f3882a6\") " pod="openshift-infra/auto-csr-approver-29591390-wlkpl" Apr 06 13:50:00 crc kubenswrapper[4790]: I0406 13:50:00.293716 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hfqcc\" (UniqueName: \"kubernetes.io/projected/7e6e884f-fb97-47bf-b420-3d4e7f3882a6-kube-api-access-hfqcc\") pod \"auto-csr-approver-29591390-wlkpl\" (UID: \"7e6e884f-fb97-47bf-b420-3d4e7f3882a6\") " pod="openshift-infra/auto-csr-approver-29591390-wlkpl" Apr 06 13:50:00 crc kubenswrapper[4790]: I0406 13:50:00.316540 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfqcc\" (UniqueName: \"kubernetes.io/projected/7e6e884f-fb97-47bf-b420-3d4e7f3882a6-kube-api-access-hfqcc\") pod \"auto-csr-approver-29591390-wlkpl\" (UID: \"7e6e884f-fb97-47bf-b420-3d4e7f3882a6\") " pod="openshift-infra/auto-csr-approver-29591390-wlkpl" Apr 06 13:50:00 crc kubenswrapper[4790]: I0406 13:50:00.468718 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591390-wlkpl" Apr 06 13:50:00 crc kubenswrapper[4790]: I0406 13:50:00.975702 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591390-wlkpl"] Apr 06 13:50:01 crc kubenswrapper[4790]: I0406 13:50:01.878993 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591390-wlkpl" event={"ID":"7e6e884f-fb97-47bf-b420-3d4e7f3882a6","Type":"ContainerStarted","Data":"004911ee0f651f979739d3880764163cb269b261cf8c59f4e5c8ce799dec496a"} Apr 06 13:50:02 crc kubenswrapper[4790]: I0406 13:50:02.898881 4790 generic.go:334] "Generic (PLEG): container finished" podID="7e6e884f-fb97-47bf-b420-3d4e7f3882a6" containerID="c88de67904e6c883403cbbb277711021ffb0cd68344a4e3d1ef622ea24df1e18" exitCode=0 Apr 06 13:50:02 crc kubenswrapper[4790]: I0406 13:50:02.898922 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591390-wlkpl" event={"ID":"7e6e884f-fb97-47bf-b420-3d4e7f3882a6","Type":"ContainerDied","Data":"c88de67904e6c883403cbbb277711021ffb0cd68344a4e3d1ef622ea24df1e18"} Apr 06 13:50:04 crc kubenswrapper[4790]: I0406 
13:50:04.297353 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591390-wlkpl" Apr 06 13:50:04 crc kubenswrapper[4790]: I0406 13:50:04.373757 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfqcc\" (UniqueName: \"kubernetes.io/projected/7e6e884f-fb97-47bf-b420-3d4e7f3882a6-kube-api-access-hfqcc\") pod \"7e6e884f-fb97-47bf-b420-3d4e7f3882a6\" (UID: \"7e6e884f-fb97-47bf-b420-3d4e7f3882a6\") " Apr 06 13:50:04 crc kubenswrapper[4790]: I0406 13:50:04.390395 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e6e884f-fb97-47bf-b420-3d4e7f3882a6-kube-api-access-hfqcc" (OuterVolumeSpecName: "kube-api-access-hfqcc") pod "7e6e884f-fb97-47bf-b420-3d4e7f3882a6" (UID: "7e6e884f-fb97-47bf-b420-3d4e7f3882a6"). InnerVolumeSpecName "kube-api-access-hfqcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:50:04 crc kubenswrapper[4790]: I0406 13:50:04.477470 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfqcc\" (UniqueName: \"kubernetes.io/projected/7e6e884f-fb97-47bf-b420-3d4e7f3882a6-kube-api-access-hfqcc\") on node \"crc\" DevicePath \"\"" Apr 06 13:50:04 crc kubenswrapper[4790]: I0406 13:50:04.919947 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591390-wlkpl" event={"ID":"7e6e884f-fb97-47bf-b420-3d4e7f3882a6","Type":"ContainerDied","Data":"004911ee0f651f979739d3880764163cb269b261cf8c59f4e5c8ce799dec496a"} Apr 06 13:50:04 crc kubenswrapper[4790]: I0406 13:50:04.920201 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="004911ee0f651f979739d3880764163cb269b261cf8c59f4e5c8ce799dec496a" Apr 06 13:50:04 crc kubenswrapper[4790]: I0406 13:50:04.920014 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591390-wlkpl" Apr 06 13:50:05 crc kubenswrapper[4790]: I0406 13:50:05.409284 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591384-x8m2t"] Apr 06 13:50:05 crc kubenswrapper[4790]: I0406 13:50:05.427708 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591384-x8m2t"] Apr 06 13:50:05 crc kubenswrapper[4790]: I0406 13:50:05.675847 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:50:05 crc kubenswrapper[4790]: E0406 13:50:05.676303 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:50:05 crc kubenswrapper[4790]: I0406 13:50:05.687094 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25452c01-bc42-4b7a-82c4-ea776612c95d" path="/var/lib/kubelet/pods/25452c01-bc42-4b7a-82c4-ea776612c95d/volumes" Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.152204 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m5z6z/must-gather-q8hkq"] Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.152787 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-m5z6z/must-gather-q8hkq" podUID="dcd99e98-f90d-4c58-bb22-35f75eaaf6bc" containerName="copy" containerID="cri-o://e4cf7b0ecbd86cb39f6dbb05ced9f4bb12c78ee498ada4414e3ba7052bea2c3b" gracePeriod=2 Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.174280 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-m5z6z/must-gather-q8hkq"] Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.727811 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m5z6z_must-gather-q8hkq_dcd99e98-f90d-4c58-bb22-35f75eaaf6bc/copy/0.log" Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.728502 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m5z6z/must-gather-q8hkq" Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.770967 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4grv\" (UniqueName: \"kubernetes.io/projected/dcd99e98-f90d-4c58-bb22-35f75eaaf6bc-kube-api-access-c4grv\") pod \"dcd99e98-f90d-4c58-bb22-35f75eaaf6bc\" (UID: \"dcd99e98-f90d-4c58-bb22-35f75eaaf6bc\") " Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.771158 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dcd99e98-f90d-4c58-bb22-35f75eaaf6bc-must-gather-output\") pod \"dcd99e98-f90d-4c58-bb22-35f75eaaf6bc\" (UID: \"dcd99e98-f90d-4c58-bb22-35f75eaaf6bc\") " Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.778929 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd99e98-f90d-4c58-bb22-35f75eaaf6bc-kube-api-access-c4grv" (OuterVolumeSpecName: "kube-api-access-c4grv") pod "dcd99e98-f90d-4c58-bb22-35f75eaaf6bc" (UID: "dcd99e98-f90d-4c58-bb22-35f75eaaf6bc"). InnerVolumeSpecName "kube-api-access-c4grv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.873766 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4grv\" (UniqueName: \"kubernetes.io/projected/dcd99e98-f90d-4c58-bb22-35f75eaaf6bc-kube-api-access-c4grv\") on node \"crc\" DevicePath \"\"" Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.974518 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m5z6z_must-gather-q8hkq_dcd99e98-f90d-4c58-bb22-35f75eaaf6bc/copy/0.log" Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.978300 4790 generic.go:334] "Generic (PLEG): container finished" podID="dcd99e98-f90d-4c58-bb22-35f75eaaf6bc" containerID="e4cf7b0ecbd86cb39f6dbb05ced9f4bb12c78ee498ada4414e3ba7052bea2c3b" exitCode=143 Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.978329 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m5z6z/must-gather-q8hkq" Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.978395 4790 scope.go:117] "RemoveContainer" containerID="e4cf7b0ecbd86cb39f6dbb05ced9f4bb12c78ee498ada4414e3ba7052bea2c3b" Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.980933 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcd99e98-f90d-4c58-bb22-35f75eaaf6bc-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dcd99e98-f90d-4c58-bb22-35f75eaaf6bc" (UID: "dcd99e98-f90d-4c58-bb22-35f75eaaf6bc"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:50:08 crc kubenswrapper[4790]: I0406 13:50:08.998636 4790 scope.go:117] "RemoveContainer" containerID="881cc72d4f67b6f9be6fa27b6838367082f99d3d80111ae0bdcacd1ce31314d0" Apr 06 13:50:09 crc kubenswrapper[4790]: I0406 13:50:09.039342 4790 scope.go:117] "RemoveContainer" containerID="e4cf7b0ecbd86cb39f6dbb05ced9f4bb12c78ee498ada4414e3ba7052bea2c3b" Apr 06 13:50:09 crc kubenswrapper[4790]: E0406 13:50:09.039885 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4cf7b0ecbd86cb39f6dbb05ced9f4bb12c78ee498ada4414e3ba7052bea2c3b\": container with ID starting with e4cf7b0ecbd86cb39f6dbb05ced9f4bb12c78ee498ada4414e3ba7052bea2c3b not found: ID does not exist" containerID="e4cf7b0ecbd86cb39f6dbb05ced9f4bb12c78ee498ada4414e3ba7052bea2c3b" Apr 06 13:50:09 crc kubenswrapper[4790]: I0406 13:50:09.039932 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4cf7b0ecbd86cb39f6dbb05ced9f4bb12c78ee498ada4414e3ba7052bea2c3b"} err="failed to get container status \"e4cf7b0ecbd86cb39f6dbb05ced9f4bb12c78ee498ada4414e3ba7052bea2c3b\": rpc error: code = NotFound desc = could not find container \"e4cf7b0ecbd86cb39f6dbb05ced9f4bb12c78ee498ada4414e3ba7052bea2c3b\": container with ID starting with e4cf7b0ecbd86cb39f6dbb05ced9f4bb12c78ee498ada4414e3ba7052bea2c3b not found: ID does not exist" Apr 06 13:50:09 crc kubenswrapper[4790]: I0406 13:50:09.039965 4790 scope.go:117] "RemoveContainer" containerID="881cc72d4f67b6f9be6fa27b6838367082f99d3d80111ae0bdcacd1ce31314d0" Apr 06 13:50:09 crc kubenswrapper[4790]: E0406 13:50:09.040326 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"881cc72d4f67b6f9be6fa27b6838367082f99d3d80111ae0bdcacd1ce31314d0\": container with ID starting with 
881cc72d4f67b6f9be6fa27b6838367082f99d3d80111ae0bdcacd1ce31314d0 not found: ID does not exist" containerID="881cc72d4f67b6f9be6fa27b6838367082f99d3d80111ae0bdcacd1ce31314d0" Apr 06 13:50:09 crc kubenswrapper[4790]: I0406 13:50:09.040354 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"881cc72d4f67b6f9be6fa27b6838367082f99d3d80111ae0bdcacd1ce31314d0"} err="failed to get container status \"881cc72d4f67b6f9be6fa27b6838367082f99d3d80111ae0bdcacd1ce31314d0\": rpc error: code = NotFound desc = could not find container \"881cc72d4f67b6f9be6fa27b6838367082f99d3d80111ae0bdcacd1ce31314d0\": container with ID starting with 881cc72d4f67b6f9be6fa27b6838367082f99d3d80111ae0bdcacd1ce31314d0 not found: ID does not exist" Apr 06 13:50:09 crc kubenswrapper[4790]: I0406 13:50:09.076472 4790 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dcd99e98-f90d-4c58-bb22-35f75eaaf6bc-must-gather-output\") on node \"crc\" DevicePath \"\"" Apr 06 13:50:09 crc kubenswrapper[4790]: I0406 13:50:09.686349 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd99e98-f90d-4c58-bb22-35f75eaaf6bc" path="/var/lib/kubelet/pods/dcd99e98-f90d-4c58-bb22-35f75eaaf6bc/volumes" Apr 06 13:50:20 crc kubenswrapper[4790]: I0406 13:50:20.676297 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:50:20 crc kubenswrapper[4790]: E0406 13:50:20.677144 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:50:20 crc 
kubenswrapper[4790]: I0406 13:50:20.945329 4790 scope.go:117] "RemoveContainer" containerID="0a6215a2722cec7fae0bd2cc36646cdca8d6147bc7027d5e6f9a92338dfafbb8" Apr 06 13:50:20 crc kubenswrapper[4790]: I0406 13:50:20.970931 4790 scope.go:117] "RemoveContainer" containerID="3f29bf59f2b6ef3d384ca6d05e8fc05152adad7be6d1447221db42c3d4b8ff94" Apr 06 13:50:21 crc kubenswrapper[4790]: I0406 13:50:21.029018 4790 scope.go:117] "RemoveContainer" containerID="3289cc6977c7de004358d579c1512438c72597bdc9452be6849f9c6c0de5daeb" Apr 06 13:50:32 crc kubenswrapper[4790]: I0406 13:50:32.676045 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:50:32 crc kubenswrapper[4790]: E0406 13:50:32.676769 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:50:43 crc kubenswrapper[4790]: I0406 13:50:43.675839 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:50:43 crc kubenswrapper[4790]: E0406 13:50:43.676894 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:50:54 crc kubenswrapper[4790]: I0406 13:50:54.676793 4790 scope.go:117] "RemoveContainer" 
containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:50:54 crc kubenswrapper[4790]: E0406 13:50:54.677937 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:51:09 crc kubenswrapper[4790]: I0406 13:51:09.675736 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:51:09 crc kubenswrapper[4790]: E0406 13:51:09.677344 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:51:24 crc kubenswrapper[4790]: I0406 13:51:24.675665 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:51:24 crc kubenswrapper[4790]: E0406 13:51:24.676621 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.669377 4790 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n66tz"] Apr 06 13:51:28 crc kubenswrapper[4790]: E0406 13:51:28.670656 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd99e98-f90d-4c58-bb22-35f75eaaf6bc" containerName="gather" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.670674 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd99e98-f90d-4c58-bb22-35f75eaaf6bc" containerName="gather" Apr 06 13:51:28 crc kubenswrapper[4790]: E0406 13:51:28.670719 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd99e98-f90d-4c58-bb22-35f75eaaf6bc" containerName="copy" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.670728 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd99e98-f90d-4c58-bb22-35f75eaaf6bc" containerName="copy" Apr 06 13:51:28 crc kubenswrapper[4790]: E0406 13:51:28.670754 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6e884f-fb97-47bf-b420-3d4e7f3882a6" containerName="oc" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.670764 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6e884f-fb97-47bf-b420-3d4e7f3882a6" containerName="oc" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.671072 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd99e98-f90d-4c58-bb22-35f75eaaf6bc" containerName="copy" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.671107 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6e884f-fb97-47bf-b420-3d4e7f3882a6" containerName="oc" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.671136 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd99e98-f90d-4c58-bb22-35f75eaaf6bc" containerName="gather" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.673199 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.703889 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n66tz"] Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.878350 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-utilities\") pod \"redhat-operators-n66tz\" (UID: \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\") " pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.878456 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-catalog-content\") pod \"redhat-operators-n66tz\" (UID: \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\") " pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.878593 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brkc4\" (UniqueName: \"kubernetes.io/projected/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-kube-api-access-brkc4\") pod \"redhat-operators-n66tz\" (UID: \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\") " pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.980430 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-utilities\") pod \"redhat-operators-n66tz\" (UID: \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\") " pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.980528 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-catalog-content\") pod \"redhat-operators-n66tz\" (UID: \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\") " pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.980635 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brkc4\" (UniqueName: \"kubernetes.io/projected/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-kube-api-access-brkc4\") pod \"redhat-operators-n66tz\" (UID: \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\") " pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.980989 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-utilities\") pod \"redhat-operators-n66tz\" (UID: \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\") " pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:28 crc kubenswrapper[4790]: I0406 13:51:28.981235 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-catalog-content\") pod \"redhat-operators-n66tz\" (UID: \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\") " pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:29 crc kubenswrapper[4790]: I0406 13:51:29.002631 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brkc4\" (UniqueName: \"kubernetes.io/projected/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-kube-api-access-brkc4\") pod \"redhat-operators-n66tz\" (UID: \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\") " pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:29 crc kubenswrapper[4790]: I0406 13:51:29.297714 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:29 crc kubenswrapper[4790]: I0406 13:51:29.771049 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n66tz"] Apr 06 13:51:29 crc kubenswrapper[4790]: W0406 13:51:29.775820 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65ec0316_c1e5_4e9f_bb5a_b51cad5da818.slice/crio-b7f9ef2312ae553d34214e94f63af5f201d6671114a1b4803d8809b9ca14e46a WatchSource:0}: Error finding container b7f9ef2312ae553d34214e94f63af5f201d6671114a1b4803d8809b9ca14e46a: Status 404 returned error can't find the container with id b7f9ef2312ae553d34214e94f63af5f201d6671114a1b4803d8809b9ca14e46a Apr 06 13:51:29 crc kubenswrapper[4790]: I0406 13:51:29.977290 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n66tz" event={"ID":"65ec0316-c1e5-4e9f-bb5a-b51cad5da818","Type":"ContainerStarted","Data":"b7f9ef2312ae553d34214e94f63af5f201d6671114a1b4803d8809b9ca14e46a"} Apr 06 13:51:31 crc kubenswrapper[4790]: I0406 13:51:31.002323 4790 generic.go:334] "Generic (PLEG): container finished" podID="65ec0316-c1e5-4e9f-bb5a-b51cad5da818" containerID="940e8905a9d3560d1c879535e6dacec47fcc8e2515227c90100f886b51664598" exitCode=0 Apr 06 13:51:31 crc kubenswrapper[4790]: I0406 13:51:31.002411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n66tz" event={"ID":"65ec0316-c1e5-4e9f-bb5a-b51cad5da818","Type":"ContainerDied","Data":"940e8905a9d3560d1c879535e6dacec47fcc8e2515227c90100f886b51664598"} Apr 06 13:51:31 crc kubenswrapper[4790]: I0406 13:51:31.006021 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 06 13:51:33 crc kubenswrapper[4790]: I0406 13:51:33.033016 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-n66tz" event={"ID":"65ec0316-c1e5-4e9f-bb5a-b51cad5da818","Type":"ContainerStarted","Data":"3717196e0be8ec30bdc109794655bb83cff53aac6385e452c9104325274117fc"} Apr 06 13:51:34 crc kubenswrapper[4790]: I0406 13:51:34.049898 4790 generic.go:334] "Generic (PLEG): container finished" podID="65ec0316-c1e5-4e9f-bb5a-b51cad5da818" containerID="3717196e0be8ec30bdc109794655bb83cff53aac6385e452c9104325274117fc" exitCode=0 Apr 06 13:51:34 crc kubenswrapper[4790]: I0406 13:51:34.049957 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n66tz" event={"ID":"65ec0316-c1e5-4e9f-bb5a-b51cad5da818","Type":"ContainerDied","Data":"3717196e0be8ec30bdc109794655bb83cff53aac6385e452c9104325274117fc"} Apr 06 13:51:35 crc kubenswrapper[4790]: I0406 13:51:35.065284 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n66tz" event={"ID":"65ec0316-c1e5-4e9f-bb5a-b51cad5da818","Type":"ContainerStarted","Data":"b9949dcd535a1ba403f7691dc9cee45202405d402fcb99f6483d0860089441f4"} Apr 06 13:51:35 crc kubenswrapper[4790]: I0406 13:51:35.093280 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n66tz" podStartSLOduration=3.637159033 podStartE2EDuration="7.09326188s" podCreationTimestamp="2026-04-06 13:51:28 +0000 UTC" firstStartedPulling="2026-04-06 13:51:31.005555951 +0000 UTC m=+6869.993298827" lastFinishedPulling="2026-04-06 13:51:34.461658808 +0000 UTC m=+6873.449401674" observedRunningTime="2026-04-06 13:51:35.083784667 +0000 UTC m=+6874.071527553" watchObservedRunningTime="2026-04-06 13:51:35.09326188 +0000 UTC m=+6874.081004736" Apr 06 13:51:38 crc kubenswrapper[4790]: I0406 13:51:38.676084 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:51:38 crc kubenswrapper[4790]: E0406 13:51:38.676934 4790 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:51:39 crc kubenswrapper[4790]: I0406 13:51:39.298305 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:39 crc kubenswrapper[4790]: I0406 13:51:39.298379 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:40 crc kubenswrapper[4790]: I0406 13:51:40.345663 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n66tz" podUID="65ec0316-c1e5-4e9f-bb5a-b51cad5da818" containerName="registry-server" probeResult="failure" output=< Apr 06 13:51:40 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Apr 06 13:51:40 crc kubenswrapper[4790]: > Apr 06 13:51:49 crc kubenswrapper[4790]: I0406 13:51:49.354958 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:49 crc kubenswrapper[4790]: I0406 13:51:49.404013 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:49 crc kubenswrapper[4790]: I0406 13:51:49.591599 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n66tz"] Apr 06 13:51:51 crc kubenswrapper[4790]: I0406 13:51:51.248323 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n66tz" podUID="65ec0316-c1e5-4e9f-bb5a-b51cad5da818" 
containerName="registry-server" containerID="cri-o://b9949dcd535a1ba403f7691dc9cee45202405d402fcb99f6483d0860089441f4" gracePeriod=2 Apr 06 13:51:51 crc kubenswrapper[4790]: I0406 13:51:51.760248 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:51 crc kubenswrapper[4790]: I0406 13:51:51.951505 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brkc4\" (UniqueName: \"kubernetes.io/projected/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-kube-api-access-brkc4\") pod \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\" (UID: \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\") " Apr 06 13:51:51 crc kubenswrapper[4790]: I0406 13:51:51.951688 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-utilities\") pod \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\" (UID: \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\") " Apr 06 13:51:51 crc kubenswrapper[4790]: I0406 13:51:51.951786 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-catalog-content\") pod \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\" (UID: \"65ec0316-c1e5-4e9f-bb5a-b51cad5da818\") " Apr 06 13:51:51 crc kubenswrapper[4790]: I0406 13:51:51.952654 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-utilities" (OuterVolumeSpecName: "utilities") pod "65ec0316-c1e5-4e9f-bb5a-b51cad5da818" (UID: "65ec0316-c1e5-4e9f-bb5a-b51cad5da818"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:51:51 crc kubenswrapper[4790]: I0406 13:51:51.959147 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-kube-api-access-brkc4" (OuterVolumeSpecName: "kube-api-access-brkc4") pod "65ec0316-c1e5-4e9f-bb5a-b51cad5da818" (UID: "65ec0316-c1e5-4e9f-bb5a-b51cad5da818"). InnerVolumeSpecName "kube-api-access-brkc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.054261 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brkc4\" (UniqueName: \"kubernetes.io/projected/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-kube-api-access-brkc4\") on node \"crc\" DevicePath \"\"" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.054302 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-utilities\") on node \"crc\" DevicePath \"\"" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.105029 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65ec0316-c1e5-4e9f-bb5a-b51cad5da818" (UID: "65ec0316-c1e5-4e9f-bb5a-b51cad5da818"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.156657 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ec0316-c1e5-4e9f-bb5a-b51cad5da818-catalog-content\") on node \"crc\" DevicePath \"\"" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.261314 4790 generic.go:334] "Generic (PLEG): container finished" podID="65ec0316-c1e5-4e9f-bb5a-b51cad5da818" containerID="b9949dcd535a1ba403f7691dc9cee45202405d402fcb99f6483d0860089441f4" exitCode=0 Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.261353 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n66tz" event={"ID":"65ec0316-c1e5-4e9f-bb5a-b51cad5da818","Type":"ContainerDied","Data":"b9949dcd535a1ba403f7691dc9cee45202405d402fcb99f6483d0860089441f4"} Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.261376 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n66tz" event={"ID":"65ec0316-c1e5-4e9f-bb5a-b51cad5da818","Type":"ContainerDied","Data":"b7f9ef2312ae553d34214e94f63af5f201d6671114a1b4803d8809b9ca14e46a"} Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.261393 4790 scope.go:117] "RemoveContainer" containerID="b9949dcd535a1ba403f7691dc9cee45202405d402fcb99f6483d0860089441f4" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.261399 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n66tz" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.305275 4790 scope.go:117] "RemoveContainer" containerID="3717196e0be8ec30bdc109794655bb83cff53aac6385e452c9104325274117fc" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.305679 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n66tz"] Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.328568 4790 scope.go:117] "RemoveContainer" containerID="940e8905a9d3560d1c879535e6dacec47fcc8e2515227c90100f886b51664598" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.331788 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n66tz"] Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.370590 4790 scope.go:117] "RemoveContainer" containerID="b9949dcd535a1ba403f7691dc9cee45202405d402fcb99f6483d0860089441f4" Apr 06 13:51:52 crc kubenswrapper[4790]: E0406 13:51:52.371084 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9949dcd535a1ba403f7691dc9cee45202405d402fcb99f6483d0860089441f4\": container with ID starting with b9949dcd535a1ba403f7691dc9cee45202405d402fcb99f6483d0860089441f4 not found: ID does not exist" containerID="b9949dcd535a1ba403f7691dc9cee45202405d402fcb99f6483d0860089441f4" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.371114 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9949dcd535a1ba403f7691dc9cee45202405d402fcb99f6483d0860089441f4"} err="failed to get container status \"b9949dcd535a1ba403f7691dc9cee45202405d402fcb99f6483d0860089441f4\": rpc error: code = NotFound desc = could not find container \"b9949dcd535a1ba403f7691dc9cee45202405d402fcb99f6483d0860089441f4\": container with ID starting with b9949dcd535a1ba403f7691dc9cee45202405d402fcb99f6483d0860089441f4 not found: ID does 
not exist" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.371136 4790 scope.go:117] "RemoveContainer" containerID="3717196e0be8ec30bdc109794655bb83cff53aac6385e452c9104325274117fc" Apr 06 13:51:52 crc kubenswrapper[4790]: E0406 13:51:52.371362 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3717196e0be8ec30bdc109794655bb83cff53aac6385e452c9104325274117fc\": container with ID starting with 3717196e0be8ec30bdc109794655bb83cff53aac6385e452c9104325274117fc not found: ID does not exist" containerID="3717196e0be8ec30bdc109794655bb83cff53aac6385e452c9104325274117fc" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.371386 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3717196e0be8ec30bdc109794655bb83cff53aac6385e452c9104325274117fc"} err="failed to get container status \"3717196e0be8ec30bdc109794655bb83cff53aac6385e452c9104325274117fc\": rpc error: code = NotFound desc = could not find container \"3717196e0be8ec30bdc109794655bb83cff53aac6385e452c9104325274117fc\": container with ID starting with 3717196e0be8ec30bdc109794655bb83cff53aac6385e452c9104325274117fc not found: ID does not exist" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.371399 4790 scope.go:117] "RemoveContainer" containerID="940e8905a9d3560d1c879535e6dacec47fcc8e2515227c90100f886b51664598" Apr 06 13:51:52 crc kubenswrapper[4790]: E0406 13:51:52.371671 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"940e8905a9d3560d1c879535e6dacec47fcc8e2515227c90100f886b51664598\": container with ID starting with 940e8905a9d3560d1c879535e6dacec47fcc8e2515227c90100f886b51664598 not found: ID does not exist" containerID="940e8905a9d3560d1c879535e6dacec47fcc8e2515227c90100f886b51664598" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.371690 4790 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"940e8905a9d3560d1c879535e6dacec47fcc8e2515227c90100f886b51664598"} err="failed to get container status \"940e8905a9d3560d1c879535e6dacec47fcc8e2515227c90100f886b51664598\": rpc error: code = NotFound desc = could not find container \"940e8905a9d3560d1c879535e6dacec47fcc8e2515227c90100f886b51664598\": container with ID starting with 940e8905a9d3560d1c879535e6dacec47fcc8e2515227c90100f886b51664598 not found: ID does not exist" Apr 06 13:51:52 crc kubenswrapper[4790]: I0406 13:51:52.676141 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:51:52 crc kubenswrapper[4790]: E0406 13:51:52.677044 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:51:53 crc kubenswrapper[4790]: I0406 13:51:53.686486 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65ec0316-c1e5-4e9f-bb5a-b51cad5da818" path="/var/lib/kubelet/pods/65ec0316-c1e5-4e9f-bb5a-b51cad5da818/volumes" Apr 06 13:52:00 crc kubenswrapper[4790]: I0406 13:52:00.166399 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591392-f9sxr"] Apr 06 13:52:00 crc kubenswrapper[4790]: E0406 13:52:00.168100 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ec0316-c1e5-4e9f-bb5a-b51cad5da818" containerName="extract-content" Apr 06 13:52:00 crc kubenswrapper[4790]: I0406 13:52:00.168134 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ec0316-c1e5-4e9f-bb5a-b51cad5da818" containerName="extract-content" Apr 06 
13:52:00 crc kubenswrapper[4790]: E0406 13:52:00.168168 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ec0316-c1e5-4e9f-bb5a-b51cad5da818" containerName="registry-server" Apr 06 13:52:00 crc kubenswrapper[4790]: I0406 13:52:00.168184 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ec0316-c1e5-4e9f-bb5a-b51cad5da818" containerName="registry-server" Apr 06 13:52:00 crc kubenswrapper[4790]: E0406 13:52:00.168233 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ec0316-c1e5-4e9f-bb5a-b51cad5da818" containerName="extract-utilities" Apr 06 13:52:00 crc kubenswrapper[4790]: I0406 13:52:00.168251 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ec0316-c1e5-4e9f-bb5a-b51cad5da818" containerName="extract-utilities" Apr 06 13:52:00 crc kubenswrapper[4790]: I0406 13:52:00.168788 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ec0316-c1e5-4e9f-bb5a-b51cad5da818" containerName="registry-server" Apr 06 13:52:00 crc kubenswrapper[4790]: I0406 13:52:00.170309 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591392-f9sxr" Apr 06 13:52:00 crc kubenswrapper[4790]: I0406 13:52:00.172503 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:52:00 crc kubenswrapper[4790]: I0406 13:52:00.172569 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:52:00 crc kubenswrapper[4790]: I0406 13:52:00.173095 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:52:00 crc kubenswrapper[4790]: I0406 13:52:00.176282 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591392-f9sxr"] Apr 06 13:52:00 crc kubenswrapper[4790]: I0406 13:52:00.235170 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwhz4\" (UniqueName: \"kubernetes.io/projected/923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8-kube-api-access-nwhz4\") pod \"auto-csr-approver-29591392-f9sxr\" (UID: \"923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8\") " pod="openshift-infra/auto-csr-approver-29591392-f9sxr" Apr 06 13:52:00 crc kubenswrapper[4790]: I0406 13:52:00.336545 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwhz4\" (UniqueName: \"kubernetes.io/projected/923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8-kube-api-access-nwhz4\") pod \"auto-csr-approver-29591392-f9sxr\" (UID: \"923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8\") " pod="openshift-infra/auto-csr-approver-29591392-f9sxr" Apr 06 13:52:00 crc kubenswrapper[4790]: I0406 13:52:00.355100 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwhz4\" (UniqueName: \"kubernetes.io/projected/923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8-kube-api-access-nwhz4\") pod \"auto-csr-approver-29591392-f9sxr\" (UID: \"923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8\") " 
pod="openshift-infra/auto-csr-approver-29591392-f9sxr" Apr 06 13:52:00 crc kubenswrapper[4790]: I0406 13:52:00.494744 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591392-f9sxr" Apr 06 13:52:01 crc kubenswrapper[4790]: I0406 13:52:01.019325 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591392-f9sxr"] Apr 06 13:52:01 crc kubenswrapper[4790]: I0406 13:52:01.361196 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591392-f9sxr" event={"ID":"923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8","Type":"ContainerStarted","Data":"48144f85f524ccfc6acb285ec86eb153c5459aa67ad26ea5fc654ac06aaba267"} Apr 06 13:52:02 crc kubenswrapper[4790]: I0406 13:52:02.373418 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591392-f9sxr" event={"ID":"923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8","Type":"ContainerStarted","Data":"f839a117f237836604b37c5aa52dc1996a12479719362080b2331a77b00138c3"} Apr 06 13:52:02 crc kubenswrapper[4790]: I0406 13:52:02.393232 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29591392-f9sxr" podStartSLOduration=1.42527389 podStartE2EDuration="2.393208941s" podCreationTimestamp="2026-04-06 13:52:00 +0000 UTC" firstStartedPulling="2026-04-06 13:52:01.008753807 +0000 UTC m=+6899.996496673" lastFinishedPulling="2026-04-06 13:52:01.976688838 +0000 UTC m=+6900.964431724" observedRunningTime="2026-04-06 13:52:02.38903486 +0000 UTC m=+6901.376777736" watchObservedRunningTime="2026-04-06 13:52:02.393208941 +0000 UTC m=+6901.380951817" Apr 06 13:52:03 crc kubenswrapper[4790]: I0406 13:52:03.386064 4790 generic.go:334] "Generic (PLEG): container finished" podID="923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8" containerID="f839a117f237836604b37c5aa52dc1996a12479719362080b2331a77b00138c3" exitCode=0 Apr 06 13:52:03 crc 
kubenswrapper[4790]: I0406 13:52:03.386218 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591392-f9sxr" event={"ID":"923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8","Type":"ContainerDied","Data":"f839a117f237836604b37c5aa52dc1996a12479719362080b2331a77b00138c3"} Apr 06 13:52:03 crc kubenswrapper[4790]: I0406 13:52:03.676384 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:52:03 crc kubenswrapper[4790]: E0406 13:52:03.676659 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:52:04 crc kubenswrapper[4790]: I0406 13:52:04.733100 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591392-f9sxr" Apr 06 13:52:04 crc kubenswrapper[4790]: I0406 13:52:04.832142 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwhz4\" (UniqueName: \"kubernetes.io/projected/923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8-kube-api-access-nwhz4\") pod \"923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8\" (UID: \"923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8\") " Apr 06 13:52:04 crc kubenswrapper[4790]: I0406 13:52:04.837562 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8-kube-api-access-nwhz4" (OuterVolumeSpecName: "kube-api-access-nwhz4") pod "923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8" (UID: "923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8"). InnerVolumeSpecName "kube-api-access-nwhz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:52:04 crc kubenswrapper[4790]: I0406 13:52:04.934458 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwhz4\" (UniqueName: \"kubernetes.io/projected/923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8-kube-api-access-nwhz4\") on node \"crc\" DevicePath \"\"" Apr 06 13:52:05 crc kubenswrapper[4790]: I0406 13:52:05.408670 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591392-f9sxr" event={"ID":"923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8","Type":"ContainerDied","Data":"48144f85f524ccfc6acb285ec86eb153c5459aa67ad26ea5fc654ac06aaba267"} Apr 06 13:52:05 crc kubenswrapper[4790]: I0406 13:52:05.408711 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48144f85f524ccfc6acb285ec86eb153c5459aa67ad26ea5fc654ac06aaba267" Apr 06 13:52:05 crc kubenswrapper[4790]: I0406 13:52:05.408729 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591392-f9sxr" Apr 06 13:52:05 crc kubenswrapper[4790]: I0406 13:52:05.805878 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591386-hnf6h"] Apr 06 13:52:05 crc kubenswrapper[4790]: I0406 13:52:05.818221 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591386-hnf6h"] Apr 06 13:52:07 crc kubenswrapper[4790]: I0406 13:52:07.690443 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f52809e0-c4d5-4f5a-969c-57f8ca1e5d98" path="/var/lib/kubelet/pods/f52809e0-c4d5-4f5a-969c-57f8ca1e5d98/volumes" Apr 06 13:52:16 crc kubenswrapper[4790]: I0406 13:52:16.675265 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:52:16 crc kubenswrapper[4790]: E0406 13:52:16.676138 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:52:21 crc kubenswrapper[4790]: I0406 13:52:21.240249 4790 scope.go:117] "RemoveContainer" containerID="7aa14fed21e429a86acd22cfecb5bd1146eb1b8035329790e0befa9bc5cca839" Apr 06 13:52:31 crc kubenswrapper[4790]: I0406 13:52:31.697042 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:52:31 crc kubenswrapper[4790]: E0406 13:52:31.698005 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:52:45 crc kubenswrapper[4790]: I0406 13:52:45.675707 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:52:45 crc kubenswrapper[4790]: E0406 13:52:45.676757 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:53:00 crc kubenswrapper[4790]: I0406 13:53:00.675770 4790 scope.go:117] "RemoveContainer" 
containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:53:00 crc kubenswrapper[4790]: E0406 13:53:00.676542 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:53:13 crc kubenswrapper[4790]: I0406 13:53:13.676596 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:53:13 crc kubenswrapper[4790]: E0406 13:53:13.677383 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:53:25 crc kubenswrapper[4790]: I0406 13:53:25.675122 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:53:25 crc kubenswrapper[4790]: E0406 13:53:25.676926 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:53:40 crc kubenswrapper[4790]: I0406 13:53:40.675251 4790 scope.go:117] 
"RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:53:40 crc kubenswrapper[4790]: E0406 13:53:40.676003 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:53:55 crc kubenswrapper[4790]: I0406 13:53:55.676206 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe" Apr 06 13:53:55 crc kubenswrapper[4790]: E0406 13:53:55.677273 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f" Apr 06 13:54:00 crc kubenswrapper[4790]: I0406 13:54:00.162739 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29591394-fdb5w"] Apr 06 13:54:00 crc kubenswrapper[4790]: E0406 13:54:00.164592 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8" containerName="oc" Apr 06 13:54:00 crc kubenswrapper[4790]: I0406 13:54:00.164675 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8" containerName="oc" Apr 06 13:54:00 crc kubenswrapper[4790]: I0406 13:54:00.164942 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="923b9ed9-22e3-470f-bcb6-6ccbf7e41eb8" containerName="oc" Apr 
06 13:54:00 crc kubenswrapper[4790]: I0406 13:54:00.165696 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591394-fdb5w" Apr 06 13:54:00 crc kubenswrapper[4790]: I0406 13:54:00.167716 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jbdt6" Apr 06 13:54:00 crc kubenswrapper[4790]: I0406 13:54:00.169485 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Apr 06 13:54:00 crc kubenswrapper[4790]: I0406 13:54:00.172134 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Apr 06 13:54:00 crc kubenswrapper[4790]: I0406 13:54:00.191990 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591394-fdb5w"] Apr 06 13:54:00 crc kubenswrapper[4790]: I0406 13:54:00.299595 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dq8x\" (UniqueName: \"kubernetes.io/projected/e37b5311-c1c9-47a7-b96c-186e2ce184c4-kube-api-access-6dq8x\") pod \"auto-csr-approver-29591394-fdb5w\" (UID: \"e37b5311-c1c9-47a7-b96c-186e2ce184c4\") " pod="openshift-infra/auto-csr-approver-29591394-fdb5w" Apr 06 13:54:00 crc kubenswrapper[4790]: I0406 13:54:00.402231 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dq8x\" (UniqueName: \"kubernetes.io/projected/e37b5311-c1c9-47a7-b96c-186e2ce184c4-kube-api-access-6dq8x\") pod \"auto-csr-approver-29591394-fdb5w\" (UID: \"e37b5311-c1c9-47a7-b96c-186e2ce184c4\") " pod="openshift-infra/auto-csr-approver-29591394-fdb5w" Apr 06 13:54:00 crc kubenswrapper[4790]: I0406 13:54:00.421854 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dq8x\" (UniqueName: 
\"kubernetes.io/projected/e37b5311-c1c9-47a7-b96c-186e2ce184c4-kube-api-access-6dq8x\") pod \"auto-csr-approver-29591394-fdb5w\" (UID: \"e37b5311-c1c9-47a7-b96c-186e2ce184c4\") " pod="openshift-infra/auto-csr-approver-29591394-fdb5w" Apr 06 13:54:00 crc kubenswrapper[4790]: I0406 13:54:00.485083 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29591394-fdb5w" Apr 06 13:54:01 crc kubenswrapper[4790]: I0406 13:54:01.001810 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29591394-fdb5w"] Apr 06 13:54:01 crc kubenswrapper[4790]: I0406 13:54:01.690815 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591394-fdb5w" event={"ID":"e37b5311-c1c9-47a7-b96c-186e2ce184c4","Type":"ContainerStarted","Data":"37503b3b8a7f89d738b9dc324c35407b8fdb0f227e2417171c1afd100a56fef5"} Apr 06 13:54:02 crc kubenswrapper[4790]: I0406 13:54:02.702728 4790 generic.go:334] "Generic (PLEG): container finished" podID="e37b5311-c1c9-47a7-b96c-186e2ce184c4" containerID="708445a07dc73d4da81bcbdb184643cc7015c56618fec30872fad60d283ef2dd" exitCode=0 Apr 06 13:54:02 crc kubenswrapper[4790]: I0406 13:54:02.702842 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591394-fdb5w" event={"ID":"e37b5311-c1c9-47a7-b96c-186e2ce184c4","Type":"ContainerDied","Data":"708445a07dc73d4da81bcbdb184643cc7015c56618fec30872fad60d283ef2dd"} Apr 06 13:54:04 crc kubenswrapper[4790]: I0406 13:54:04.065643 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591394-fdb5w" Apr 06 13:54:04 crc kubenswrapper[4790]: I0406 13:54:04.187676 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dq8x\" (UniqueName: \"kubernetes.io/projected/e37b5311-c1c9-47a7-b96c-186e2ce184c4-kube-api-access-6dq8x\") pod \"e37b5311-c1c9-47a7-b96c-186e2ce184c4\" (UID: \"e37b5311-c1c9-47a7-b96c-186e2ce184c4\") " Apr 06 13:54:04 crc kubenswrapper[4790]: I0406 13:54:04.197090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37b5311-c1c9-47a7-b96c-186e2ce184c4-kube-api-access-6dq8x" (OuterVolumeSpecName: "kube-api-access-6dq8x") pod "e37b5311-c1c9-47a7-b96c-186e2ce184c4" (UID: "e37b5311-c1c9-47a7-b96c-186e2ce184c4"). InnerVolumeSpecName "kube-api-access-6dq8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Apr 06 13:54:04 crc kubenswrapper[4790]: I0406 13:54:04.290839 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dq8x\" (UniqueName: \"kubernetes.io/projected/e37b5311-c1c9-47a7-b96c-186e2ce184c4-kube-api-access-6dq8x\") on node \"crc\" DevicePath \"\"" Apr 06 13:54:04 crc kubenswrapper[4790]: I0406 13:54:04.728200 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29591394-fdb5w" event={"ID":"e37b5311-c1c9-47a7-b96c-186e2ce184c4","Type":"ContainerDied","Data":"37503b3b8a7f89d738b9dc324c35407b8fdb0f227e2417171c1afd100a56fef5"} Apr 06 13:54:04 crc kubenswrapper[4790]: I0406 13:54:04.728245 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37503b3b8a7f89d738b9dc324c35407b8fdb0f227e2417171c1afd100a56fef5" Apr 06 13:54:04 crc kubenswrapper[4790]: I0406 13:54:04.728248 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29591394-fdb5w"
Apr 06 13:54:05 crc kubenswrapper[4790]: I0406 13:54:05.163820 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29591388-jxwld"]
Apr 06 13:54:05 crc kubenswrapper[4790]: I0406 13:54:05.180908 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29591388-jxwld"]
Apr 06 13:54:05 crc kubenswrapper[4790]: I0406 13:54:05.693068 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="126a281e-40ec-4865-b1ad-c16824233159" path="/var/lib/kubelet/pods/126a281e-40ec-4865-b1ad-c16824233159/volumes"
Apr 06 13:54:06 crc kubenswrapper[4790]: I0406 13:54:06.677322 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe"
Apr 06 13:54:06 crc kubenswrapper[4790]: E0406 13:54:06.677974 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:54:19 crc kubenswrapper[4790]: I0406 13:54:19.675643 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe"
Apr 06 13:54:19 crc kubenswrapper[4790]: E0406 13:54:19.676500 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:54:21 crc kubenswrapper[4790]: I0406 13:54:21.370882 4790 scope.go:117] "RemoveContainer" containerID="0b0059157d34aafc699c266640ef03c679786cf7569cec5994a5782fb31ea501"
Apr 06 13:54:30 crc kubenswrapper[4790]: I0406 13:54:30.675995 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe"
Apr 06 13:54:30 crc kubenswrapper[4790]: E0406 13:54:30.676711 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9p96t_openshift-machine-config-operator(9f5e33f8-0490-4219-8c40-526903de8e6f)\"" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" podUID="9f5e33f8-0490-4219-8c40-526903de8e6f"
Apr 06 13:54:45 crc kubenswrapper[4790]: I0406 13:54:45.676257 4790 scope.go:117] "RemoveContainer" containerID="b33762c0948fd258eeb685a878adec43f69465d24952e9f935573f3de051f5fe"
Apr 06 13:54:46 crc kubenswrapper[4790]: I0406 13:54:46.223071 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9p96t" event={"ID":"9f5e33f8-0490-4219-8c40-526903de8e6f","Type":"ContainerStarted","Data":"9fbee3c4c2aea2ca1991ad92cbf22b10a6a42c145dad2e29be44cb0e3fbf3be0"}
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.452316 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-45hms"]
Apr 06 13:55:09 crc kubenswrapper[4790]: E0406 13:55:09.453180 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37b5311-c1c9-47a7-b96c-186e2ce184c4" containerName="oc"
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.453196 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37b5311-c1c9-47a7-b96c-186e2ce184c4" containerName="oc"
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.453491 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37b5311-c1c9-47a7-b96c-186e2ce184c4" containerName="oc"
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.455219 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.468960 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-45hms"]
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.570900 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2qlr\" (UniqueName: \"kubernetes.io/projected/db8cf360-ea90-4248-a58b-fb5ad0857d67-kube-api-access-f2qlr\") pod \"certified-operators-45hms\" (UID: \"db8cf360-ea90-4248-a58b-fb5ad0857d67\") " pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.570999 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8cf360-ea90-4248-a58b-fb5ad0857d67-utilities\") pod \"certified-operators-45hms\" (UID: \"db8cf360-ea90-4248-a58b-fb5ad0857d67\") " pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.571124 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8cf360-ea90-4248-a58b-fb5ad0857d67-catalog-content\") pod \"certified-operators-45hms\" (UID: \"db8cf360-ea90-4248-a58b-fb5ad0857d67\") " pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.673208 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8cf360-ea90-4248-a58b-fb5ad0857d67-utilities\") pod \"certified-operators-45hms\" (UID: \"db8cf360-ea90-4248-a58b-fb5ad0857d67\") " pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.673287 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8cf360-ea90-4248-a58b-fb5ad0857d67-catalog-content\") pod \"certified-operators-45hms\" (UID: \"db8cf360-ea90-4248-a58b-fb5ad0857d67\") " pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.673372 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2qlr\" (UniqueName: \"kubernetes.io/projected/db8cf360-ea90-4248-a58b-fb5ad0857d67-kube-api-access-f2qlr\") pod \"certified-operators-45hms\" (UID: \"db8cf360-ea90-4248-a58b-fb5ad0857d67\") " pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.674250 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8cf360-ea90-4248-a58b-fb5ad0857d67-utilities\") pod \"certified-operators-45hms\" (UID: \"db8cf360-ea90-4248-a58b-fb5ad0857d67\") " pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.674460 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8cf360-ea90-4248-a58b-fb5ad0857d67-catalog-content\") pod \"certified-operators-45hms\" (UID: \"db8cf360-ea90-4248-a58b-fb5ad0857d67\") " pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.705259 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2qlr\" (UniqueName: \"kubernetes.io/projected/db8cf360-ea90-4248-a58b-fb5ad0857d67-kube-api-access-f2qlr\") pod \"certified-operators-45hms\" (UID: \"db8cf360-ea90-4248-a58b-fb5ad0857d67\") " pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:09 crc kubenswrapper[4790]: I0406 13:55:09.774541 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:10 crc kubenswrapper[4790]: I0406 13:55:10.290054 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-45hms"]
Apr 06 13:55:10 crc kubenswrapper[4790]: I0406 13:55:10.493533 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45hms" event={"ID":"db8cf360-ea90-4248-a58b-fb5ad0857d67","Type":"ContainerStarted","Data":"87513a5fedd05924842ec2e48418bcf9daa7849648b25407a96ac6d4b65dec9e"}
Apr 06 13:55:10 crc kubenswrapper[4790]: E0406 13:55:10.726321 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb8cf360_ea90_4248_a58b_fb5ad0857d67.slice/crio-44ea7d08b8f1ecf466ca27999d4272131f194271e2c8dfed2e8a00e885d97d91.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb8cf360_ea90_4248_a58b_fb5ad0857d67.slice/crio-conmon-44ea7d08b8f1ecf466ca27999d4272131f194271e2c8dfed2e8a00e885d97d91.scope\": RecentStats: unable to find data in memory cache]"
Apr 06 13:55:11 crc kubenswrapper[4790]: I0406 13:55:11.513120 4790 generic.go:334] "Generic (PLEG): container finished" podID="db8cf360-ea90-4248-a58b-fb5ad0857d67" containerID="44ea7d08b8f1ecf466ca27999d4272131f194271e2c8dfed2e8a00e885d97d91" exitCode=0
Apr 06 13:55:11 crc kubenswrapper[4790]: I0406 13:55:11.513386 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45hms" event={"ID":"db8cf360-ea90-4248-a58b-fb5ad0857d67","Type":"ContainerDied","Data":"44ea7d08b8f1ecf466ca27999d4272131f194271e2c8dfed2e8a00e885d97d91"}
Apr 06 13:55:12 crc kubenswrapper[4790]: I0406 13:55:12.525739 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45hms" event={"ID":"db8cf360-ea90-4248-a58b-fb5ad0857d67","Type":"ContainerStarted","Data":"365f6857c75d6a7a5169dca4d5df0607ce8bad78d0f7b33adfab64e55c068fb8"}
Apr 06 13:55:14 crc kubenswrapper[4790]: I0406 13:55:14.546486 4790 generic.go:334] "Generic (PLEG): container finished" podID="db8cf360-ea90-4248-a58b-fb5ad0857d67" containerID="365f6857c75d6a7a5169dca4d5df0607ce8bad78d0f7b33adfab64e55c068fb8" exitCode=0
Apr 06 13:55:14 crc kubenswrapper[4790]: I0406 13:55:14.546671 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45hms" event={"ID":"db8cf360-ea90-4248-a58b-fb5ad0857d67","Type":"ContainerDied","Data":"365f6857c75d6a7a5169dca4d5df0607ce8bad78d0f7b33adfab64e55c068fb8"}
Apr 06 13:55:15 crc kubenswrapper[4790]: I0406 13:55:15.558398 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45hms" event={"ID":"db8cf360-ea90-4248-a58b-fb5ad0857d67","Type":"ContainerStarted","Data":"c9d4b995f781e9efbd19485d0431fa65df0418ddd5e8dffcfc52a15ce54ea7c7"}
Apr 06 13:55:15 crc kubenswrapper[4790]: I0406 13:55:15.580424 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-45hms" podStartSLOduration=3.226458226 podStartE2EDuration="6.580409242s" podCreationTimestamp="2026-04-06 13:55:09 +0000 UTC" firstStartedPulling="2026-04-06 13:55:11.517801386 +0000 UTC m=+7090.505544252" lastFinishedPulling="2026-04-06 13:55:14.871752402 +0000 UTC m=+7093.859495268" observedRunningTime="2026-04-06 13:55:15.577597447 +0000 UTC m=+7094.565340313" watchObservedRunningTime="2026-04-06 13:55:15.580409242 +0000 UTC m=+7094.568152108"
Apr 06 13:55:19 crc kubenswrapper[4790]: I0406 13:55:19.774974 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:19 crc kubenswrapper[4790]: I0406 13:55:19.775465 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:19 crc kubenswrapper[4790]: I0406 13:55:19.858823 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:20 crc kubenswrapper[4790]: I0406 13:55:20.672584 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:20 crc kubenswrapper[4790]: I0406 13:55:20.730380 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-45hms"]
Apr 06 13:55:22 crc kubenswrapper[4790]: I0406 13:55:22.648247 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-45hms" podUID="db8cf360-ea90-4248-a58b-fb5ad0857d67" containerName="registry-server" containerID="cri-o://c9d4b995f781e9efbd19485d0431fa65df0418ddd5e8dffcfc52a15ce54ea7c7" gracePeriod=2
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.151033 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.281698 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8cf360-ea90-4248-a58b-fb5ad0857d67-catalog-content\") pod \"db8cf360-ea90-4248-a58b-fb5ad0857d67\" (UID: \"db8cf360-ea90-4248-a58b-fb5ad0857d67\") "
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.281854 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8cf360-ea90-4248-a58b-fb5ad0857d67-utilities\") pod \"db8cf360-ea90-4248-a58b-fb5ad0857d67\" (UID: \"db8cf360-ea90-4248-a58b-fb5ad0857d67\") "
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.281894 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2qlr\" (UniqueName: \"kubernetes.io/projected/db8cf360-ea90-4248-a58b-fb5ad0857d67-kube-api-access-f2qlr\") pod \"db8cf360-ea90-4248-a58b-fb5ad0857d67\" (UID: \"db8cf360-ea90-4248-a58b-fb5ad0857d67\") "
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.284041 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db8cf360-ea90-4248-a58b-fb5ad0857d67-utilities" (OuterVolumeSpecName: "utilities") pod "db8cf360-ea90-4248-a58b-fb5ad0857d67" (UID: "db8cf360-ea90-4248-a58b-fb5ad0857d67"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.287718 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8cf360-ea90-4248-a58b-fb5ad0857d67-kube-api-access-f2qlr" (OuterVolumeSpecName: "kube-api-access-f2qlr") pod "db8cf360-ea90-4248-a58b-fb5ad0857d67" (UID: "db8cf360-ea90-4248-a58b-fb5ad0857d67"). InnerVolumeSpecName "kube-api-access-f2qlr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.362682 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db8cf360-ea90-4248-a58b-fb5ad0857d67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db8cf360-ea90-4248-a58b-fb5ad0857d67" (UID: "db8cf360-ea90-4248-a58b-fb5ad0857d67"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.384108 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db8cf360-ea90-4248-a58b-fb5ad0857d67-utilities\") on node \"crc\" DevicePath \"\""
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.384141 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2qlr\" (UniqueName: \"kubernetes.io/projected/db8cf360-ea90-4248-a58b-fb5ad0857d67-kube-api-access-f2qlr\") on node \"crc\" DevicePath \"\""
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.384153 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db8cf360-ea90-4248-a58b-fb5ad0857d67-catalog-content\") on node \"crc\" DevicePath \"\""
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.665723 4790 generic.go:334] "Generic (PLEG): container finished" podID="db8cf360-ea90-4248-a58b-fb5ad0857d67" containerID="c9d4b995f781e9efbd19485d0431fa65df0418ddd5e8dffcfc52a15ce54ea7c7" exitCode=0
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.665864 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-45hms"
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.665804 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45hms" event={"ID":"db8cf360-ea90-4248-a58b-fb5ad0857d67","Type":"ContainerDied","Data":"c9d4b995f781e9efbd19485d0431fa65df0418ddd5e8dffcfc52a15ce54ea7c7"}
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.665989 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45hms" event={"ID":"db8cf360-ea90-4248-a58b-fb5ad0857d67","Type":"ContainerDied","Data":"87513a5fedd05924842ec2e48418bcf9daa7849648b25407a96ac6d4b65dec9e"}
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.666034 4790 scope.go:117] "RemoveContainer" containerID="c9d4b995f781e9efbd19485d0431fa65df0418ddd5e8dffcfc52a15ce54ea7c7"
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.698382 4790 scope.go:117] "RemoveContainer" containerID="365f6857c75d6a7a5169dca4d5df0607ce8bad78d0f7b33adfab64e55c068fb8"
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.726282 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-45hms"]
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.732902 4790 scope.go:117] "RemoveContainer" containerID="44ea7d08b8f1ecf466ca27999d4272131f194271e2c8dfed2e8a00e885d97d91"
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.757284 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-45hms"]
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.792940 4790 scope.go:117] "RemoveContainer" containerID="c9d4b995f781e9efbd19485d0431fa65df0418ddd5e8dffcfc52a15ce54ea7c7"
Apr 06 13:55:23 crc kubenswrapper[4790]: E0406 13:55:23.793438 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9d4b995f781e9efbd19485d0431fa65df0418ddd5e8dffcfc52a15ce54ea7c7\": container with ID starting with c9d4b995f781e9efbd19485d0431fa65df0418ddd5e8dffcfc52a15ce54ea7c7 not found: ID does not exist" containerID="c9d4b995f781e9efbd19485d0431fa65df0418ddd5e8dffcfc52a15ce54ea7c7"
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.793479 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d4b995f781e9efbd19485d0431fa65df0418ddd5e8dffcfc52a15ce54ea7c7"} err="failed to get container status \"c9d4b995f781e9efbd19485d0431fa65df0418ddd5e8dffcfc52a15ce54ea7c7\": rpc error: code = NotFound desc = could not find container \"c9d4b995f781e9efbd19485d0431fa65df0418ddd5e8dffcfc52a15ce54ea7c7\": container with ID starting with c9d4b995f781e9efbd19485d0431fa65df0418ddd5e8dffcfc52a15ce54ea7c7 not found: ID does not exist"
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.793502 4790 scope.go:117] "RemoveContainer" containerID="365f6857c75d6a7a5169dca4d5df0607ce8bad78d0f7b33adfab64e55c068fb8"
Apr 06 13:55:23 crc kubenswrapper[4790]: E0406 13:55:23.794212 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"365f6857c75d6a7a5169dca4d5df0607ce8bad78d0f7b33adfab64e55c068fb8\": container with ID starting with 365f6857c75d6a7a5169dca4d5df0607ce8bad78d0f7b33adfab64e55c068fb8 not found: ID does not exist" containerID="365f6857c75d6a7a5169dca4d5df0607ce8bad78d0f7b33adfab64e55c068fb8"
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.794242 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"365f6857c75d6a7a5169dca4d5df0607ce8bad78d0f7b33adfab64e55c068fb8"} err="failed to get container status \"365f6857c75d6a7a5169dca4d5df0607ce8bad78d0f7b33adfab64e55c068fb8\": rpc error: code = NotFound desc = could not find container \"365f6857c75d6a7a5169dca4d5df0607ce8bad78d0f7b33adfab64e55c068fb8\": container with ID starting with 365f6857c75d6a7a5169dca4d5df0607ce8bad78d0f7b33adfab64e55c068fb8 not found: ID does not exist"
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.794263 4790 scope.go:117] "RemoveContainer" containerID="44ea7d08b8f1ecf466ca27999d4272131f194271e2c8dfed2e8a00e885d97d91"
Apr 06 13:55:23 crc kubenswrapper[4790]: E0406 13:55:23.794526 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44ea7d08b8f1ecf466ca27999d4272131f194271e2c8dfed2e8a00e885d97d91\": container with ID starting with 44ea7d08b8f1ecf466ca27999d4272131f194271e2c8dfed2e8a00e885d97d91 not found: ID does not exist" containerID="44ea7d08b8f1ecf466ca27999d4272131f194271e2c8dfed2e8a00e885d97d91"
Apr 06 13:55:23 crc kubenswrapper[4790]: I0406 13:55:23.794556 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ea7d08b8f1ecf466ca27999d4272131f194271e2c8dfed2e8a00e885d97d91"} err="failed to get container status \"44ea7d08b8f1ecf466ca27999d4272131f194271e2c8dfed2e8a00e885d97d91\": rpc error: code = NotFound desc = could not find container \"44ea7d08b8f1ecf466ca27999d4272131f194271e2c8dfed2e8a00e885d97d91\": container with ID starting with 44ea7d08b8f1ecf466ca27999d4272131f194271e2c8dfed2e8a00e885d97d91 not found: ID does not exist"